Lords Chamber
To ask His Majesty’s Government what steps they are taking to protect sensitive research at universities from national security threats.
The Government are implementing a range of legislative and non-legislative measures, including the Research Collaboration Advice Team, which provides advice to academia on national security risks in international collaboration. The integrated review refresh committed to review the effectiveness of existing protections. The Department for Science, Innovation and Technology is leading this review, and the Deputy Prime Minister announced last week that the Government will consult on the response in the summer.
I am grateful to my noble friend, but are our universities not compromising their independence by becoming overreliant on China? Some 25% of the students at UCL, around 10,000, are Chinese, which risks the infiltration of academic research and, in the words of the Deputy Prime Minister, coercion, exploitation and vulnerability. While I welcome the recent Statement, what steps will the Government take to replace lost Chinese funding for our universities, so that the UK remains at the forefront of technological research?
I thank my noble friend for the question. The first thing to say is that the independence of universities is absolutely critical to the quality of their research. While the integrated review refresh has of course indicated a great many concerns about working closely with China, and has necessitated a reduction in academic collaboration with China, I hope that our recent reassociation with the Horizon programme, together with the fact that a number of other third countries are also considering, or are very close to, associating with Horizon, will go some way towards providing a new pool of collaboration partners in academic research.
My Lords, I am sure that all of us agree with the noble Lord, Lord Young, that we need to protect scientific development from malign actors. But is there not a real problem here—that new technology and advances in scientific knowledge not only require international collaboration on a scale hitherto unknown, but have, ever since the bow and arrow, mostly been dual-purpose? In other words, they can be used for benevolent or malign reasons. How do the departments charged with this responsibility distinguish between the two, so that in protecting us from the misuse of scientific advances, they are not smothering scientific research as a whole?
The noble Lord is absolutely right in his analysis of the problem, which I agree with wholeheartedly. The most powerful tool we have at our disposal here is RCAT—the Research Collaboration Advice Team—which provides hundreds of individual items of advice in these areas, where it can be quite a subtle question whether something is dual-use or single-use, or has a military or defence application. It is not something that can be easily defined up front; providing that advice requires a certain wisdom and delicacy.
My Lords, last week the Statement did not seem to say very much about which actors might be under consideration. The noble Lord, Lord Young, has already mentioned China, but do His Majesty’s Government also think that Iran and other countries might be a problem—not by giving funding, but by researchers and students coming? If that is the case, can His Majesty’s Government really expect universities to vet individuals? Is that not the role for government? I declare my interest as a professor at Cambridge.
The noble Baroness raises a very important point; it is not about naming one or more countries and targeting them. The non-legislative and legislative elements of the entire approach to this are about being actor agnostic, and simply looking at the cases as they arise.
My Lords, further to the points made by my noble friend, the Government said they are taking a range of measures, but if you take an area like biosecurity, which I am sure the Minister will agree is a very significant potential future threat, with people perhaps developing pathogens, aided possibly by AI technology that lets them do so more easily and quickly, is there not a case for mandatory surveillance over, for example, access to materials, which would indicate where somebody might be trying to do something that has that dual purpose—in other words, something bad rather than something good? Does the Minister agree that a voluntary scheme, such as I understand exists at the moment, may not be enough?
Indeed, and we must recognise that there are limits to a voluntary scheme, particularly where actors are genuinely malign. I reassure the noble Viscount that any research contracted for purposes of defence, or indeed for purposes that might be used for defence, would be subject to vetting in the usual way: the more sensitive the nature of the research, the greater the vetting.
My Lords, I declare an interest as an honorary fellow of the University of Strathclyde. This challenge to our universities is both fast-moving and intensifying in complexity. Now, the Russell Group comprises some universities across the United Kingdom, but not all. Universities UK represents many universities across the United Kingdom, but not all. Is there, or are there plans for, a United Kingdom Government security portal, accessible to all universities across the United Kingdom, for immediate advice and information, if they have concerns?
I thank my noble friend for that. Yes, the university sector absolutely does go far beyond just the Russell Group, and we must take account of all its needs. The review of protections for higher education and academia is now entering its second phase, and there will be consultation on that over the summer. An area it will look at is precisely the mechanics my noble friend puts forward: how this kind of transparency can best be delivered with the minimum possible administrative overhead.
My Lords, does the noble Viscount recall that, as long ago as September 2023, his noble friend Lord Johnson of Marylebone, in conjunction with King’s College, produced a report warning about the dangers which the noble Lord, Lord Young of Cookham, mentioned to the House? It called for diversification of the population base of our universities, which had become too reliant on money flowing in from China. Will he also comment on the case that was raised in the media last month of Professor Michelle Shipworth, who was banned from teaching what was called a “provocative” course at a prestigious university, UCL, simply because it might compromise commercial interests—that is, the flow of money from China?
I certainly recognise the concern that overseas undergraduates tend to come very largely from a small number of countries, and the value of diversifying from that. I am afraid I am not familiar with the case the noble Lord mentions. I am very happy to write to him about it. It sounds extremely concerning.
My Lords, upholding national security is the first duty of any Government. To that end, we welcome the Government’s recent briefing for vice-chancellors and the intention to consult on how better to protect UK research from academic espionage. Given the importance of and the likely increase in these threats, does the Minister think it would be reasonable for the Deputy Prime Minister and the Secretary of State to offer similar briefings to their shadow counterparts?
I would be very happy to raise that with them and ask them to do so. I take the noble Baroness’s point. There is nothing more important for us to do than look after our security, and research security is a very serious component of that.
Would the Minister recognise that it is extremely important that his department works closely with the Home Office on this? I noticed last week the warning, from my successor but three at MI5, to vice-chancellors of the threat from Chinese espionage in universities, much of which will be by students under coercion. If I may answer the noble Baroness’s question about who you can go to, there is an organisation but such is my senility that I cannot remember its name. I will look it up. It is connected very closely to MI5, but it is the public-facing organisation to which you go with concerns. It starts “National Protective Security”, I think, but a quick look on my telephone has not revealed the answer, so I will talk to her later. The Minister probably knows the answer, but I am afraid I do not.
I am consulting the lengthy list of acronyms that I wrote down in preparing for this, but I am not sure I have the right one. I take the noble Baroness’s point very seriously. We work extremely closely on this with the Home Office. A number of the legislative provisions keeping our research secure belong to the Home Office and we continue to work closely with it. As to the exact agency she mentioned, I will find out from my officials and write to her.
Grand Committee
My Lords, having listened carefully to representations from across the House at Second Reading, I am introducing this amendment to address concerns about the data preservation powers established in the Bill. The amendment provides for coroners, and procurators fiscal in Scotland, to initiate the data preservation process when they decide it is necessary and appropriate to support their investigations into a child’s death, irrespective of the suspected cause of death.
This amendment demonstrates our commitment to ensuring that coroners and procurators fiscal can access the online data they may need to support their investigation into a child’s death. It is important to emphasise that coroners and procurators fiscal, as independent judges, have discretion about whether to trigger the data preservation process. We are grateful to the families, Peers and coroners whom we spoke to in developing these measures. In particular, I thank the noble Baroness, Lady Kidron, who is in her place. I beg to move.
My Lords, it is an unusual pleasure to support the Minister and to say that this is a very welcome amendment to address a terrible error of judgment made when the Government first added the measure to the Bill in the other place and excluded data access for coroners in respect of children who died by means other than suicide. I shall not replay here the reasons why it was wrong, but I am extremely glad that the Government have put it right. I wish to take this opportunity to pay tribute to those past and present at 5Rights and the NSPCC for their support and to those journalists who understood why data access for coroners is a central plank of online safety.
I too recognise the role of the Bereaved Families for Online Safety. They bear the pain of losing a child and, as their testimony has repeatedly attested, not knowing the circumstances surrounding that death is a particularly cruel revictimisation for families, who never lose their grief but simply learn to live with it. We owe them a debt of gratitude for putting their grief to work for the benefit of other families and other children.
My Lords, I thank the Minister for setting out the amendment and all noble Lords who spoke. I am sure the Minister will be pleased to hear that we support his Amendment 236 and his Amendment 237, to which the noble Baroness, Lady Kidron, has added her name.
Amendment 236 is a technical amendment. It seeks the straightforward deletion of words from a clause, accounting for the fact that investigations by a coroner, or procurator fiscal in Scotland, must start upon their being notified of the death of a child. The words
“or are due to conduct an investigation”
are indeed superfluous.
We also support Amendment 237. The deletion of this part of the clause would bring into effect a material change. It would empower Ofcom to issue a notice to an internet service provider to retain information in all cases of a child’s death, not just cases of suspected suicide. Sadly, as many of us have discovered in the course of our work on this Bill, there is an increasing number of ways in which communication online can be directly or indirectly linked to a child’s death. These include exposure to material that is appropriate only for adults; the inability to filter harmful information, which may adversely affect mental health and decision-making; and, of course, the deliberate targeting of children by adults and, in some cases, by other children.
There are adults who use the internet with the intention of doing harm to children through coercion, grooming or abuse. What initially starts online can lead to contact in person. Often, this will lead to a criminal investigation, but, even if it does not, the changes proposed by this amendment could help prevent additional tragic deaths of children, not just those caused by suspected child suicides. If the investigating authorities have access to online communications that may have been a contributing factor in a child’s death, additional areas of concern can be identified by organisations and individuals with responsibility for children’s welfare and action taken to save many other young lives.
Before I sit down, I want to take this opportunity to say a big thank you to the noble Baroness, Lady Kidron, the noble Lord, Lord Kennedy, and all those who have campaigned on this issue relentlessly and brought it to our attention.
Let me begin by reiterating my thanks to the noble Baroness, Peers, families and coroners for their help in developing these measures. My momentary pleasure in being supported on these amendments is, of course, tempered by the desperate sadness of the situations that they are designed to address.
I acknowledge the powerful advocacy that has taken place on this issue. I am glad that we have been able to address the concerns with the amendment to the Online Safety Act, which takes a zero-tolerance approach to protecting children by making sure that the buck stops with social media platforms for the content they host. I sincerely hope that this demonstrates our commitment to ensuring that coroners can fully access the online data needed to provide answers for grieving families.
On the point raised by the noble Baroness, Lady Kidron, guidance from the Chief Coroner is likely to be necessary to ensure both that this provision works effectively and that coroners feel supported in their decisions on whether to trigger the data preservation process. Decisions on how and when to issue guidance are a matter for the Chief Coroner, of course, but we understand that he is very likely to issue guidance to coroners on this matter. His office is working with my department and Ofcom to ensure that our processes are aligned. The Government will also work with the regulators and interested parties to see whether any guidance is required to support parents in understanding the data preservation process. Needless to say, I would be more than happy to arrange a meeting with the noble Baroness to discuss the development of the guidance; other Members may wish to join that as well.
Once again, I thank noble Lords for their support on this matter.
My Lords, I now turn to the National Underground Asset Register, which I will refer to as NUAR. It is a new digital map of buried pipes and cables that is revolutionising the way that we install, maintain, operate and repair our buried infrastructure. The provisions contained in the Bill will ensure that workers have access to the complete and up-to-date data that they need, when they need it, through the new register. NUAR is estimated to deliver more than £400 million per year of economic growth through increased efficiency, reduced accidental damage and fewer disruptions for citizens and businesses. I am therefore introducing several government amendments, which are minor in nature and aim to improve the clarity of the Bill. I hope that the Committee will be content if I address these together.
Amendment 244 clarifies responsibilities in relation to the licensing of NUAR data. As NUAR includes data from across public and private sector organisations, it involves both Crown and third-party intellectual property rights, including database rights. This amendment clarifies that the role of the Keeper of the National Archives in determining the licence terms for Crown IP remains unchanged. This will require the Secretary of State to work through the National Archives to determine licence terms for Crown data, as was always intended. Amendments 243 and 245 are consequential to this change.
Similarly, Amendment 241 moves the provision relating to the initial upload of data to the register under new Part 3A to make the Bill clearer, with Amendments 248 and 249 consequential to this change.
Amendment 242 is a minor and technical amendment that clarifies that regulations made under new Section 106B(1) can be made “for or in connection with”—rather than solely “in connection with”—the making of information kept in NUAR available, with or without a licence.
Amendment 247 is another minor and technical amendment to ensure that consistent language is used throughout Schedule 13 and so further improve the clarity of these provisions. These amendments provide clarity to the Bill; they do not change the underlying policy.
Although Amendment 298 is not solely focused on NUAR, this might be a convenient point for me briefly to explain it to your Lordships. Amendment 298 makes a minor and technical amendment to Clause 154, the clause which sets out the extent of the Bill. Subsection (4) of that clause currently provides that an amendment, repeal or revocation made by the Bill
“has the same extent as the enactment amended, repealed or revoked”.
Subsection (4) also makes clear that this approach is subject to subsection (3), which provides for certain provisions to extend only to England and Wales and Northern Ireland. Upon further reviewing the Bill, we have identified that subsection (4) should, of course, also be subject to subsection (2), which provides for certain provisions to extend only to England and Wales. Amendment 298 therefore makes provision to ensure that the various subsections of Clause 154 operate effectively together as a coherent package.
I now turn to a series of amendments raised by the noble Lord, Lord Clement-Jones. Amendments 240A and 240B relate to new Section 106A, which places a duty on the Secretary of State to keep a register of information relating to apparatus in streets in England and Wales. Section 106A allows the Secretary of State to make regulations that establish the form and manner in which the register is kept. The Bill as currently drafted provides for these regulations to be subject to the negative procedure. Amendment 240A calls for this to be changed to the affirmative procedure, while Amendment 240B would require the publication of draft regulations, a call for evidence and the subsequent laying before Parliament of a statement by the Secretary of State before such regulations can be made.
I start by thanking the noble Lords, Lord Clement-Jones and Lord Bassam, for their respective replies. As I have said, the Geospatial Commission has been engaging extensively with stakeholders, including the security services, on NUAR since 2018. This has included a call for evidence, a pilot project, a public consultation, focus groups, various workshops and other interactions. All major gas and water companies have signed up, as well as several large telecoms firms.
While the Minister is speaking, maybe the Box could tell him whether the figure of only 33% of asset owners having signed up is correct? The noble Lord, Lord Bassam, and I both mentioned it; it would be very useful to know.
It did complete a pilot phase this year. As it becomes fully operational, more and more will sign up. I do not know the actual number that have signed up today, but I will find out.
NUAR does not duplicate existing commercial services. It is a standardised, interactive digital map of buried infrastructure, which no existing service is able to provide. It will significantly enhance data sharing and access efficiency. Current services—
I am concerned. We get the principle behind NUAR, but is there an interface between NUAR and this other service—which, on the face of it, looks quite extensive—currently in place? Is there a dialogue between the two? That seems to be quite important, given that there is some doubt over NUAR’s current scope.
I am not sure that there is doubt over the current scope of NUAR; it is meant to address all buried infrastructure in the United Kingdom. LSBUD does make extensive representations, as indeed it has to parliamentarians of both Houses, and has spoken several times to the Geospatial Commission. I am very happy to commit that those conversations will continue.
My Lords, the noble Lord, Lord Bassam, is absolutely right to be asking that question. Unlike the noble Lord, I have not been underground very recently, so we can go only on the briefings we get. LSBUD is described as a
“sustainably-funded UK success story”—
okay, give or take a bit of puff—that
“responds to most requests in 5 minutes or less”.
It has
“150+ asset-owners covering nearly 2 million km and 98% of high-risk assets—like gas, electric, and fuel pipelines”.
That sounds as though we are in the same kind of territory. How can the Minister just baldly state that NUAR is entirely different? Can he perhaps give us a paragraph on how they differ? I do not think that “completely different” can possibly characterise this relationship.
As I understand it, LSBUD services are provided as a PDF, on request. It is not interactive; it is not vector-based graphics presented on a map, so it cannot be interrogated in the same way. Furthermore, as I understand it—and I am happy to be corrected if I am misstating—LSBUD has a great many private sector asset owners, but no public sector data is provided. All of it is provided on a much more manual basis. The two services simply do not brook comparison. I would be delighted to speak to LSBUD.
My Lords, we are beginning to tease out something quite useful here. Basically, NUAR will be pretty much an automatic service, because it will be available online, I assume, which has implications for data protection, for who owns the copyright and so on. I am sure there are all kinds of issues there. There is the way the service is delivered, and then you have the public sector, which has not taken part in LSBUD. Are those the two key distinctions?
Indeed, there are two key distinctions. One is the way that the information is provided online, in a live format; the other is the quantity and nature of the data provided, which under NUAR will eventually be all relevant data in the United Kingdom, versus only the data of those who choose to sign up to LSBUD and equivalent services. I am very happy to write on the various figures. Maybe it would help if I were to arrange a demonstration of the technology. Would that be useful? I will do that.
Unlike the noble Lord, Lord Bassam, I do not have that background in seeing what happens with the excavators, but I would very much welcome that. The Minister again is really making the case for greater co-operation. The public sector has access to the public sector information, and LSBUD has access to a lot of private sector information. Does that not speak to co-operation between the two systems? We seem to have warring camps, where the Government are determined to prove that they are forging ahead with their new service and are trampling on quite a lot of rights, interests and concerns in doing so—by the sound of it. The Minister looks rather sceptical.
I am not sure whose rights are being trampled on by having a shared database of these things. However, I will arrange a demonstration, and I confidently state that nobody who sees it will retain any cynicism about the quality of the service provided.
All I can say is that, in that case, the Minister has been worked on extremely well.
In addition to the situation that the noble Lord, Lord Bassam, described, I was braced for a really horrible story, because these things very often lead to danger and death, and there is a very serious safety argument for providing this information reliably and rapidly, as NUAR will.
My Lords, it took them half a day to discover where the hole had gone and what the damage was. The water flooded several main roads and there were traffic delays and the rest. So these things are very serious. I was trying to make a serious point while being slightly frivolous about it.
No, indeed, it is a deeply serious point. I do not know the number off the top of my head but there are a number of deaths every year as a result of these things.
As I was saying, a thorough impact assessment was undertaken for the NUAR measures, which received a green rating from the Regulatory Policy Committee. Impacts on organisations that help facilitate the exchange of data related to assets in the street were included in the modelling. Although NUAR could impact existing utility—
I cannot resist drawing the Minister’s attention to the story in today’s Financial Times, which reports that two major water companies do not know where their sewers are. So I think the impact is going to be a little bit greater than he is saying.
I saw that story. Obviously, regardless of how they report the data, if they do not know, they do not know. But my thought was that, if there are maps available for everything that is known, that tends to encourage people who do not know to take better control of the assets that they manage.
A discovery project is under way that could allow these organisations—these alternative providers—to access NUAR data; LSBUD has been referenced, among others. It attended the last three workshops we conducted on this, which I hope will enable it to adapt its services and business models to mitigate any negative impacts. Such opportunities will be taken forward in future years should they be technically feasible, of value, in the public interest and in light of the views of stakeholders, including asset owners.
A national underground asset register depends on bringing data together from asset owners on to a single standardised database. This will allow data to be shared more efficiently than was possible before. Asset owners have existing processes that have been developed to allow them to manage risks associated with excavations. These processes will be developed in compliance with existing guidance in the form of HSG47. To achieve this, those working on NUAR are already working closely with relevant stakeholders as part of a dedicated adoption group. This will allow for a safe and planned rollout of NUAR to those who will benefit from it.
Before the Minister’s peroration, I just want to check something. He talked about the discovery project and contact with the industry; by that, I assume he was talking about asset owners as part of the project. What contact is proposed with the existing company, LinesearchbeforeUdig, and some of its major supporters? Can the Government assure us that they will have greater contact or try to align? Can they give greater assurance than they have been able to give today? Clearly, there is suspicion here of the Government’s intentions and how things will work out. If we are to achieve this safety agenda—I absolutely support it; it is the fundamental issue here—more work needs to be done in building bridges, to use another construction metaphor.
As I said, the Geospatial Commission has met LSBUD many times, and I would be happy for it to do so again in order to help LSBUD adapt its business model for the NUAR future. LSBUD has also attended the last three discovery workshops on access to this data.
I close by thanking noble Lords for their contributions. I hope they look forward to the demonstration.
My Lords, I support this probing amendment, Amendment 251. I thank all noble Lords who have spoken. From this side of the Committee, I say how grateful we are to the noble Lord, Lord Arbuthnot, for all that he has done and continues to do in his campaign to find justice for those sub-postmasters who have been wronged by the system.
This amendment seeks to reinstate the substantive provisions of Section 69 of PACE, the Police and Criminal Evidence Act 1984, revoking the dangerous presumption that computer evidence is reliable. I would like to imagine that legislators in 1984 were perhaps alert to the warning in George Orwell’s novel Nineteen Eighty-Four, written some 35 years earlier, about relying on an apparently infallible but ultimately corruptible technological system to define the truth. The Horizon scandal is, of course, the most glaring example of the dangers of assuming that computers are always right. Sadly, as hundreds of sub-postmasters have known for years, and as the wider public have more recently become aware, computer systems can be horribly inaccurate.
However, the Horizon system is very primitive compared with some of the programs which now process billions of pieces of our sensitive data every day. The AI revolution, which has already begun, will exponentially increase the risk of errors being compounded and multiplied. To take just one example, some noble Lords may be aware of the concept of AI hallucinations. This is a term used to describe cases in which computer models make inaccurate predictions based on seeing patterns in data that are not really there, which may be caused by incomplete, biased or simply poor-quality inputs. In an earlier debate, the noble Viscount, Lord Younger of Leckie, said that account information notices will be decided upon. How will these decisions be made? Will they be made by individual human beings or by some AI-configured algorithm? Can the Minister share with us how such decisions will be taken?
Humans can look at clouds in the sky or outlines on the hillside and see patterns that look like faces, animals or symbols, but ultimately we know that we are looking at water vapour or rock formations. Computer systems do not necessarily have this innate common sense—this reality check. Increasingly, we will depend on computer systems talking to each other without any human intervention. This will deliver some great efficiencies, but it could lead to greater injustices on a scale which would terrify even the most dystopian science fiction writers. The noble Baroness, Lady Kidron, has already shared with us some of the cases where a computer has made errors and people have been wronged.
Amendment 251 would reintroduce the opportunity for some healthy human scepticism by enabling the investigation of whether there are reasonable grounds for questioning information in documents produced by a computer. The digital world of 2024 depends far more on computers than did the world of Nineteen Eighty-Four, whether in actual legislation or in Orwellian fiction. Amendment 251 enables ordinary people to question whether our modern “Big Brother” artificial intelligence is telling the truth when he, or it, is watching us. I look forward to the Minister’s responses to the various questions raised, and to his response on the current assumption in law that information provided by a computer is always accurate.
My Lords, I recognise the feeling of the Committee on this issue and, frankly, I recognise the feeling of the whole country with respect to Horizon. I thank all those who have spoken for a really enlightening debate. I thank the noble Baroness, Lady Kidron, for tabling the amendment and my noble friend Lord Arbuthnot for speaking to it and—if I may depart from the script—for his heroic behaviour with respect to the sub-postmasters.
There can be no doubt that hundreds of innocent sub-postmasters and sub-postmistresses have suffered an intolerable miscarriage of justice at the hands of the Post Office. I hope noble Lords will indulge me if I speak very briefly on that. On 13 March, the Government introduced the Post Office (Horizon System) Offences Bill into Parliament, which is due to go before a Committee of the whole House in the House of Commons on 29 April. The Bill will quash relevant convictions of individuals who worked, including on a voluntary basis, in Post Office branches and who have suffered as a result of the Post Office Horizon IT scandal. It will quash, on a blanket basis, convictions for various theft, fraud and related offences during the period of the Horizon scandal in England, Wales and Northern Ireland. This is to be followed by swift financial redress delivered by the Department for Business and Trade.
On the amendment laid by the noble Baroness, Lady Kidron—I thank her and the noble Lords who have supported it—I fully understand the intent behind this amendment, which aims to address issues with computer evidence such as those arising from the Post Office cases. The common law presumption, as has been said, is that the computer which has produced evidence in a case was operating effectively at the material time unless there is evidence to the contrary, in which case the party relying on the computer evidence will need to satisfy the court that the evidence is reliable and therefore admissible.
This amendment would require a party relying on computer evidence to provide proof up front that the computer was operating effectively at the time and that there is no evidence of improper use. I and my fellow Ministers, including those at the MoJ, understand the intent behind this amendment, and we are considering very carefully the issues raised by the Post Office cases in relation to computer evidence, including these wider concerns. So I would welcome the opportunity for further meetings with the noble Baroness, alongside MoJ colleagues. I was pleased to hear that she had met with my right honourable friend the Lord Chancellor on this matter.
We are considering, for example, the way reliability of evidence from the Horizon system was presented, how failures of investigation and disclosure prevented that evidence from being effectively challenged, and the lack of corroborating evidence in many cases. These issues need to be considered carefully, with the full facts in front of us. Sir Wyn Williams is examining in detail the failings that led to the Post Office scandal. These issues are not straightforward. The prosecution of those cases relied on assertions that the Horizon system was accurate and reliable, which the Post Office knew to be wrong. This was supported by expert evidence, which it knew to be misleading. The issue was that the Post Office chose to withhold the fact that the computer evidence itself was wrong.
This amendment would also have a significant impact on the criminal justice system. Almost all criminal cases rely on computer evidence to some extent, so any change to the burden of proof would or could impede the work of the Crown Prosecution Service and other prosecutors.
Although I am not able to accept this amendment for these reasons, I share the desire to find an appropriate way forward along with my colleagues at the Ministry of Justice, who will bear the brunt of this work, as the noble Lord, Lord Clement-Jones, alluded to. I look forward to meeting the noble Baroness to discuss this ahead of Report. Meanwhile, I hope she will withdraw her amendment.
Can the Minister pass on the following suggestion? Paul Marshall, who has been mentioned by all of us, is absolutely au fait with the exact procedure. He has experience of how it has worked in practice, and he has made some constructive suggestions. If there is not a full return to Section 69, there could be other, more nuanced, ways of doing this that meet the Minister’s objections. But can I suggest that the MoJ make contact with him and discuss what the best way forward would be? He has been writing about this for some years now, and it would be extremely useful for the MoJ to engage with him, if it has not already done so.
It may have already done so, but I will certainly pass that on.
I thank everyone who spoke and the Minister for the offer of a meeting alongside his colleagues from the MoJ. I believe he will have a very busy diary between Committee and Report, based on the number of meetings we have agreed to.
However, I want to be very clear here. We have all recognised that the story of the Post Office sub-postmasters makes this issue clear, but it is not about the sub-postmasters. I commend the Government for what they are doing. We await the inquiry with urgent interest, and I am sure I speak for everyone in wishing the sub-postmasters a fair settlement—that is not in question. What is in question is the fact that we do not have unlimited Lord Arbuthnots to be heroic about all the other things that are about to happen. I took it seriously when he said “not one moment longer”: it could be tomorrow.
My Lords, I am pleased that we were able to sign this amendment. Once again, the noble Baroness, Lady Kidron, has demonstrated her acute ability to dissect and to make a brilliant argument about why an amendment is so important.
As the noble Lord, Lord Clement-Jones, and others have said previously, what is the point of this Bill? Passing this amendment and putting these new offences on the statute book would give the Bill the purpose and clout that it has so far lacked. As the noble Baroness, Lady Kidron, has made clear, although it is currently an offence to possess or distribute child sex abuse material, it is not an offence to create these images artificially using AI techniques. So, quite innocent images of a child—or even an adult—can be manipulated to create child sex abuse imagery, pornography and degrading or violent scenarios. As the noble Baroness pointed out, this could be your child or a neighbour’s child being depicted for sexual gratification by the increasingly sophisticated AI creators of these digital models or files.
Yesterday’s report from the Internet Watch Foundation said that a manual found on the dark web encourages the use of “nudifying” tools to remove clothes from images of children, which can then be used to blackmail those children into sending more graphic content. The IWF reports that the scale of this abuse is increasing year on year, with 275,000 web pages containing child sex abuse found last year; I suspect that this is the tip of the iceberg, as much of this activity occurs on the dark web, which is very difficult to track. The noble Baroness, Lady Kidron, made a powerful point: there is a danger that access to such materials will also encourage offenders who then want to participate in real-world child sex abuse, so the scale of the horror could be multiplied. There are many reasons why these trends are shocking and abhorrent. It seems that, as ever, the offenders are one step ahead of the legislation needed for police enforcers to close down this trade.
As the noble Baroness, Lady Kidron, made clear, this amendment is “laser focused” on criminalising those who are developing and using AI to create these images. I am pleased to say that Labour is already working on a ban on creating so-called nudification tools. The prevalence of deepfakes and child abuse on the internet is increasing the public’s fears about the overall safety of AI, so we need to win back their trust if we are to harness the undoubted benefits that it can deliver to our public services and economy. Tackling this area is one step towards that.
Action to regulate AI by requiring transparency and safety reports from all those at the forefront of AI development should be a key part of that strategy, but we have a particular task to do here. In the meantime, this amendment is an opportunity for the Government to take a lead on these very specific proposals to help clean up the web and rid us of these vile crimes. I hope the Minister can confirm that this amendment, or a government amendment along the same lines, will be included in the Bill. I look forward to his response.
I thank the noble Baroness, Lady Kidron, for tabling Amendment 291, which would create several new criminal offences relating to the use of AI to collect, collate and distribute child abuse images or to possess such images after they have been created. Nobody can dispute the intention behind this amendment.
We recognise the importance of this area. We will continue to assess whether and what new offences are needed to further bolster the legislation relating to child sexual abuse and AI, as part of our wider ongoing review of how our laws need to adapt to AI risks and opportunities. We need to get the answers to these complex questions right, and we need to ensure that we are equipping law enforcement with the capabilities and the powers needed to combat child sexual abuse. Perhaps, when I meet the noble Baroness, Lady Kidron, on the previous group, we can also discuss this important matter.
However, for now, I reassure noble Lords that any child sex abuse material, whether AI generated or not, is already illegal in the UK, as has been said. The criminal law is comprehensive with regard to the production and distribution of this material. For example, it is already an offence to produce, store or share any material that contains or depicts child sexual abuse, regardless of whether the material depicts a real child or not. This prohibition includes AI-generated child sexual abuse material and other pseudo imagery that may have been AI or computer generated.
We are committed to bringing to justice offenders who deliberately misuse AI to generate child sexual abuse material. We demonstrated this as part of the road to the AI Safety Summit, where we secured agreement from NGO, industry and international partners to take action to tackle AI-enabled child sexual abuse. The strongest protections in the Online Safety Act are for children, and all companies in scope of the legislation will need to tackle child sexual abuse material as a priority. Applications that use artificial intelligence will not be exempt and must incorporate robust guard-rails and safety measures to ensure that AI models and technology cannot be manipulated for child sexual abuse purposes.
Furthermore, I reassure noble Lords that the offence of taking, making, distributing and possessing with a view to distribution any indecent photograph or pseudo-photograph of a child under the age of 18 carries a maximum sentence of 10 years’ imprisonment. Possession alone of indecent photographs or pseudo-photographs of children can carry a maximum sentence of up to five years’ imprisonment.
However, I am not able to accept the amendment, as the current drafting would capture legitimate AI models that have been deliberately misused by offenders without the knowledge or intent of their creators to produce child sexual abuse material. It would also inadvertently criminalise individual users who possess perfectly legal digital files with no criminal intent, due to the fact that they could, when combined, enable the creation of child sexual abuse material.
I therefore ask the noble Baroness to withdraw the amendment, while recognising the strength of feeling and the strong arguments made on this issue and reiterating my offer to meet with her to discuss this ahead of Report.
I do not know how to express in parliamentary terms the depth of my disappointment, so I will leave that. Whoever helped the noble Viscount draft his response should be ashamed. We do not have a comprehensive system and the police do not have the capability; they came to me after months of trying to get the Home Office to act, so that is an untruth: the police do not have the capability.
I remind the noble Viscount that in previous debates his response on the bigger picture of AI has been to wait and see, but this is a here and now problem. As the noble Baroness, Lady Jones, set out, this would give purpose and reason—and here it is in front of us; we can act.
I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones of Whitchurch, for tabling the amendments in this important group. I very much share the concerns about all the uses of deepfake images that are highlighted by these amendments. I will speak more briefly than I otherwise would with a view to trying to—
My Lords, I would be very happy to get a letter from the Minister.
I would be happy to write one. I will go for the abbreviated version of my speech.
I turn first to the part of the amendment that would seek to criminalise the creation, alteration or other generation of deepfake images depicting a person engaged in an intimate act. The Government recognise that there is significant public concern about the simple creation of sexually explicit deepfake images, and this is why they have announced their intention to table an amendment to the Criminal Justice Bill, currently in the other place, to criminalise the creation of purported sexual images of adults without consent.
The noble Lord’s Amendment 294 would create an offence explicitly targeting the creation or alteration of deepfake content when a person knows or suspects that the deepfake will be or is likely to be used to commit fraud. It is already an offence under Section 7 of the Fraud Act 2006 to generate software or deepfakes known to be designed for or intended to be used in the commission of fraud, and the Online Safety Act lists fraud as a priority offence and as a relevant offence for the duties on major services to remove paid-for fraudulent advertising.
Amendment 295 in the name of the noble Baroness, Lady Jones of Whitchurch, seeks to create an offence of creating or sharing political deepfakes. The Government recognise the threats to democracy that harmful actors pose. At the same time, the UK also wants to ensure that we safeguard the ability for robust debate and protect freedom of expression. It is crucial that we get that balance right.
Let me first reassure noble Lords that the UK already has criminal offences that protect our democratic processes, such as the National Security Act 2023 and the false communications offence introduced in the Online Safety Act 2023. It is also already an election offence to make false statements of fact about the personal character or conduct of a candidate or about the withdrawal of a candidate before or during an election. These offences have appropriate tests to ensure that we protect the integrity of democratic processes while also ensuring that we do not impede the ability for robust political debate.
I assure noble Lords that we continue to work across government to ensure that we are ready to respond to the risks to democracy from deepfakes. The Defending Democracy Taskforce, which seeks to protect the democratic integrity of the UK, is engaging across government and with Parliament, the UK’s intelligence community, the devolved Administrations, local authorities and others on the full range of threats facing our democratic institutions. We also continue to meet regularly with social media companies to ensure that they continue to take action to protect users from election interference.
Turning to Amendments 295A to 295F, I thank the noble Lord, Lord Clement-Jones, for them. Taken together, they would in effect establish a new regulatory regime in relation to the creation and dissemination of deepfakes. The Government recognise the concerns raised around harmful deepfakes and have already taken action against illegal content online. We absolutely recognise the intention behind these amendments but they pose significant risks, including to freedom of expression; I will write to noble Lords about those in order to make my arguments in more detail.
For the reasons I have set out, I am not able to accept these amendments. I hope that the noble Lord will therefore withdraw his amendment.
My Lords, I thank the Minister for that rather breathless response and his consideration. I look forward to his letter. We have arguments about regulation in the AI field; this is, if you like, a subset of that—but a rather important subset. My underlying theme is “must try harder”. I thank the noble Lord, Lord Leong, for his support and pay tribute to Control AI, which is vigorously campaigning on this subject in terms of the supply chain for the creation of these deepfakes.
Pending the Minister’s letter, which I look forward to, I beg leave to withdraw my amendment.
The Committee will be relieved to know that I will be brief. I do not have much to say because, in general terms, this seems an eminently sensible amendment.
We should congratulate the noble Lord, Lord Clement-Jones, on his drafting ingenuity. He has managed to compose an amendment that brings together the need for scrutiny of emerging national security and data privacy risks relating to advanced technology, aims to inform regulatory developments and guidance that might be required to mitigate risks, and would protect the privacy of people’s genomics data. It also picks up along the way the issue of the security services scrutinising malign entities and guiding researchers, businesses, consumers and public bodies. Bringing all those things together at the end of a long and rather messy Bill is quite a feat—congratulations to the noble Lord.
I am rather hoping that the Minister will tell the Committee either that the Government will accept this wisely crafted amendment or that everything it contains is already covered. If the latter is the case, can he point noble Lords to where those things are covered in the Bill? Can he also reassure the Committee that the safety and security issues raised by the noble Lord, Lord Clement-Jones, are covered? Having said all that, we support the general direction of travel that the amendment takes.
I would be extremely happy for the Minister to write.
Nothing makes me happier than the noble Lord’s happiness. I thank him for his amendment and the noble Lord, Lord Bassam, for his points; I will write to them on those, given the Committee’s desire for brevity and the desire to complete this stage tonight.
I wish to say some final words overall. I sincerely thank the Committee for its vigorous—I think that is the right word—scrutiny of this Bill. We have not necessarily agreed on a great deal, but I am in awe of the level of scrutiny and the commitment to making the Bill as good as possible. Let us be absolutely honest—this is not the most entertaining subject, but it is something that we all take extremely seriously, and I pay tribute to the Committee for its work. I also extend sincere thanks to the clerks and our Hansard colleagues for agreeing to stay a little later than planned, although that may not even be necessary. I very much look forward to engaging with noble Lords again before and during Report.
My Lords, I thank the Minister, the noble Baroness, Lady Jones, and all the team. I also thank the noble Lord, Lord Harlech, whose first name we now know; these things are always useful to know. This has been quite a marathon. I hope that we will have many conversations between now and Report. I also hope that Report is not too early as there is a lot to sort out. The noble Baroness, Lady Jones, and I will be putting together our priority list imminently but, in the meantime, I beg leave to withdraw my amendment.
Grand Committee
My Lords, I listened carefully to the explanation given by the noble Lord, Lord Clement-Jones, for his stand part notice on Clause 44. I will have to read Hansard, as I may have missed something, but I am not sure I am convinced by his arguments against Clause 44 standing part. He described his stand part notice as “innocuous”, but I am concerned that if the clause were removed it would have a slightly wider implication than that.
We feel that there are some advantages to how Clause 44 is currently worded. As it stands, it simply makes it clear that data subjects have to use the internal processes to make complaints to controllers first, and that the controller then has the obligation to respond without undue delay. Although this could place an extra burden on businesses to manage and reply to complaints in a timely manner, I would have thought that this was a positive step to be welcomed. It would require controllers to have clear processes in place for handling complaints; I hope that that in itself would be an incentive against their conducting the kind of unlawful processing that prompts complaints in the first place. This seems to be best practice, which would apply anyway in most organisations and complaint and arbitration systems, including, perhaps, ombudsmen, which I know the noble Lord knows more about than I do these days. There should be a requirement to use the internal processes first.
The clause makes it clear that the data subject has a right to complain directly to the controller and it makes clear that the controller has an obligation to respond. Clause 45 then goes on to make a different point, which is that the commissioner has a right to refuse to act on certain complaints. We touched on this in an earlier debate. Clearly, to be in line with Clause 44, the controller would have to have finished handling the case within the allotted time. We agree with that process. However, an alternative reason for the commissioner to refuse is when the complaint is “vexatious or excessive”. We have rehearsed our arguments about the interpretation of those words in previous debates on the application of subject access requests. I do not intend to repeat them here, but our concern about that wording rightly remains. What is important here is that the ICO should not be able to reject complaints simply because the complainant is distressed or angry. It is helpful that the clause states that in these circumstances,
“the Commissioner must inform the complainant”
of the reasons it is considered vexatious or excessive. It is also helpful that the clause states that this
“does not prevent the complainant from making it a complaint again”,
presumably in a way more compliant with the rules. Unlike the noble Lord, Lord Clement-Jones—as I said, I will look at what he said in more detail—on balance, we are content with the wording as it stands.
On a slightly different tack, we have added our name to Amendment 154, in the name of the noble Lord, Lord Clement-Jones, and we support Amendment 287 on a similar subject. This touches on a similar principle to our previous debate on the right of data communities to raise data-breach complaints on behalf of individuals. In these amendments, we are proposing that there should be a collective right for organisations to raise data-breach complaints for individuals or groups of individuals who do not necessarily feel sufficiently empowered or confident to raise the complaints on their own behalf. There are many reasons why this reticence might occur, not least that the individuals may feel that making a complaint would put their employment on the line or that they would suffer discrimination at work in the future. We therefore believe that these amendments are important to widen people’s access to work with others to raise these complaints.
Since these amendments were tabled, we have received the letter from the Minister that addresses our earlier debate on data communities. I am pleased to see the general support for data intermediaries that he set out in his letter. We argue that a data community is a separate, distinct collective body, different from the wider concept of data intermediaries. This seems to be an area in which the ICO could take a lead in clarifying rights and setting standards. Our Amendment 154 would therefore set a deadline for the ICO to do that work and for those rights to be enacted.
The noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, made a good case for broadening these rights in the Bill and, on that basis, I hope the Minister will agree to follow up his letter so that we can make further progress on this issue.
The noble Lord, Lord Clement-Jones, has tabled a number of amendments that modify the functions of the courts and tribunals. I was hoping that, when I stood here and listened to him, I would understand a bit more about the issues. I hope he will forgive me for not responding in detail to these arguments. I do not feel that I know enough about the legal background to the concerns, but he seems to have made a clear case for clarifying whether the courts or tribunals should have jurisdiction in data protection issues.
On that basis, I hope that the Minister will also provide some clarification on these issues and I look forward to his response.
My Lords, I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones, for tabling these amendments to Clauses 44 and 45, which would reform the framework for data protection complaints to the Information Commissioner.
The noble Lord, Lord Clement-Jones, has given notice of his intention to oppose Clause 44 standing part of the Bill. That would remove new provisions from the Bill that have been carefully designed to provide a more direct route to resolution for data subjects’ complaints. I should stress that these measures do not limit rights for data subjects to bring complaints forward, but instead provide a more direct route to resolution with the relevant data controller. The measures formalise current best practice, requiring the complainant to approach the relevant data controller, where appropriate, to attempt to resolve the issue prior to regulatory involvement.
The Bill creates a requirement for data controllers to facilitate the making of complaints and look into what may have gone wrong. This should, in most cases, result in a much quicker resolution of data protection-related complaints. The provisions will also have the impact of enabling the Information Commissioner to redeploy resources away from handling premature complaints where such complaints may be dealt with more effectively, in the first instance, by controllers and towards value-added regulatory activity, supporting businesses to use data lawfully and in innovative ways.
The noble Lord’s Amendment 153 seeks, in effect, to expand the scope of the Information Commissioner’s duty to investigate complaints under Section 165 of the Data Protection Act. However, that Section of the Act already provides robust redress routes, requiring the commissioner to take appropriate steps to respond to complaints and offer an outcome or conclude an investigation within a specified period.
The noble Lord raised the enforcement of the UK’s data protection framework. I can provide more context on the ICO’s approach, although noble Lords will be aware that the framework is enforced independently of government by the ICO; it would of course be inappropriate for me to comment on how the ICO exercises its enforcement powers. The ICO aims to be fair, proportionate and effective, focusing on the areas of highest risk and greatest harm, but this does not mean that it will take enforcement action in every case that comes before it.
The Government have introduced a new requirement on the ICO—Clause 43—to publish an annual report on how it has exercised its enforcement powers, the number and nature of investigations, the enforcement powers used, how long investigations took and the outcome of the investigations that ended in that period. This will provide greater transparency and accountability in the ICO’s exercise of its enforcement powers. For these reasons, I am not able to accept these amendments.
I also thank the noble Baroness and the noble Lord for their Amendments 154 and 287 concerning Section 190 of the Data Protection Act. These amendments would require the Secretary of State to legislate to give effect to Article 80(2) of the UK GDPR to enable relevant non-profit organisations to make claims against data controllers for alleged data breaches on behalf of data subjects, without those data subjects having requested or agreed to the claim being brought. Currently, such non-profit organisations can already pursue such actions on behalf of individuals who have granted them specific authorisation, as outlined in Article 80(1).
In 2021, following consultation, the Government concluded that there was insufficient evidence to justify implementing Article 80(2) to allow non-profit organisations to bring data protection claims without the authorisation of the people affected. The Government’s response to the consultation noted that the regulator can and does investigate complaints raised by civil society groups, even when they are not made on behalf of named individuals. The ICO’s investigations into the use of live facial recognition technology at King’s Cross station and in some supermarkets in southern England are examples of this.
I also thank the noble Baroness, Lady Kidron, for raising her concerns about the protection of children throughout the debate—indeed, throughout all the days in Committee. The existing regime already allows civil society groups to make complaints to the ICO about data-processing activities that affect children and vulnerable people. The ICO has a range of powers to investigate systemic data breaches under the current framework and is already capable of forcing data controllers to take decisive action to address non-compliance. We are strengthening its powers in this Bill. I note that only a few member states of the EU have allowed non-governmental organisations to launch actions without a mandate, in line with the possibility provided by the GDPR.
I turn now to Amendments 154A, 154B—
Before the noble Lord gets there and we move too far from Amendment 154, where does the Government’s thinking leave us regarding a group of class actions? Trade unions take up causes on behalf of their membership at large. I suspect that, in the case of the Post Office and Mr Bates, not every sub-postmaster or sub-postmistress would have signed up to that class action, even though they may have ended up being beneficiaries of its effects. So where does it leave people with regard to data protection, and the way that the data protection scheme operates, where there might be a class action?
If the action is raised on behalf of named individuals, those named individuals have to have given consent for that. If the action is for a general class of people, those people would not have to give their explicit consent, because they are not named in the action. Article 80(2) of the GDPR said that going that further step was optional for all member states. I do not know which member states have taken it up, but a great many have not, just because of the complexities to which it gives rise.
My Lords, just so that the Minister might get a little note, I will ask a question. He has explained what is possible—what can be done—but not why the Government still resist putting Article 80(2) into effect. What is the reason for not adopting that article?
The reason was that an extensive consultation was undertaken in 2021 by the Government, and the Government concluded at that time that there was insufficient evidence to take what would necessarily be a complex step. That was largely on the grounds that class actions of this type can go forward either as long as they have the consent of any named individuals in the class action or on behalf of a group of individuals who are unnamed and not specifically raised by name within the investigation itself.
Perhaps the Minister could in due course say what evidence would help to persuade the Government to adopt the article.
I want to help the Minister. Perhaps he could give us some more detail on the nature of that consultation and the number of responses and what people said in it. It strikes me as rather important.
Fair enough. Maybe for the time being, it will satisfy the Committee if I share a copy of that consultation and what evidence was considered, if that would work.
I will turn now to Amendments 154A to 155 and Amendment 175, which propose sweeping modifications to the jurisdiction of the court and tribunal for proceedings under the Data Protection Act 2018. These amendments would have the effect of making the First-tier Tribunal and Upper Tribunal responsible for all data protection cases, transferring both ongoing and future cases out of the court system and to the relevant tribunals.
The Government of course want to ensure that proceedings for enforcement of data protection rules, including the redress routes available to data subjects, are appropriate to the nature of the complaint. As the Committee will be well aware, at present there is a mixture of jurisdiction between tribunals and courts under data protection legislation, depending on the precise nature of the proceedings in question. Tribunals are indeed the appropriate venue for some data protection proceedings, and the legislation already recognises that—for example, for applications by data subjects for an order requiring the ICO to progress their complaint. Courts, however, are generally the more appropriate venue for cases involving claims for compensation: they apply stricter rules of procedure and evidence than tribunals, and successful parties can usually recover their costs. For example, claims by individuals against organisations for breaches of legal requirements can result in awards of compensatory damages for the individuals and financial and reputational damage for the organisations. It is appropriate that such cases are handled by a court in accordance with its strict procedural and evidential rules, where the data subject may recover their costs if successful.
As such, the Government are confident that the current system is balanced and proportionate and provides clear and effective administrative and judicial redress routes for data subjects seeking to exercise their rights.
My Lords, is the Minister saying that there is absolutely no confusion between the jurisdiction of the tribunals and the courts? That is, no court has come to a different conclusion about jurisdiction—for example, as to whether procedural matters are for tribunals and merits are for courts or vice versa. Is he saying that everything is hunky-dory and clear and that we do not need to concern ourselves with this crossover of jurisdiction?
No, as I was about to say, we need to take these issues seriously. The noble Lord raised a number of specific cases. I was unfamiliar with them at the start of the debate—
I will go away and look at those; I look forward to learning more about them. There are obvious implications in what the noble Lord said as to the most effective ways of distributing cases between courts and other channels.
For these reasons, I hope that the noble Lord will withdraw his amendment.
I am intrigued by the balance between what goes to a tribunal and what goes to the courts. I took the spirit behind the stand-part notice in the name of the noble Lord, Lord Clement-Jones, as being about finding the right place for the right case and ensuring that the wheels of justice are much more accessible. I am not entirely persuaded by what the Minister has said. It would probably help the Committee if we had a better understanding of where the cases go, how they are distributed and on what basis.
I thank the noble Lord; that is an important point. The question is: how does the Sorting Hat operate to distribute cases between the various tribunals and the court system? We believe that the courts have an important role to play in this but it is about how, in the early stages of a complaint, the case is allocated to a tribunal or a court. I can see that more detail is needed there; I would be happy to write to noble Lords.
Before we come to the end of this debate, I just want to raise something. I am grateful to the Minister for offering to bring forward the 2021 consultation on Article 80(2)—that will be interesting—but I wonder whether, as we look at the consultation and seek to understand the objections, the Government would be willing to listen to our experiences over the past two or three years. I know I said this on our previous day in Committee but there is, I hope, some point in ironing out some of the problems of the data regime that we are experiencing in action. I could bring forward a number of colleagues on that issue and on why it is a blind spot for both the ICO and the specialist organisations that are trying to bring systemic issues to its attention. It is very resource-heavy. I want a bit of goose and gander here: if we are trying to sort out some of the resourcing and administrative nightmares in dealing with the data regime, from a user perspective, perhaps a bit of kindness could be shown to that problem as well as to the problem of business.
I would be very happy to participate in that discussion, absolutely.
My Lords, I thank the Minister for his response. I have surprised myself: I have taken something positive away from the Bill.
The noble Baroness, Lady Jones, was quite right to be more positive about Clause 44 than I was. The Minister unpacked its relationship with Clause 45 well and satisfactorily. Obviously, we will read Hansard before we jump to too positive a conclusion.
On Article 80(2), I am grateful to the Minister for agreeing both to go back to the consultation and to look at the kinds of evidence that were brought forward, because this is a really important aspect for many civil society organisations. He underestimates the difficulties faced when bringing complaints of this nature. I would very much like this conversation to go forward because this issue has been quite a bone of contention; the noble Baroness, Lady Kidron, remembers that only too well. We may even have had ping-pong on the matter back in 2017. There is an appetite to keep on the case so, the more we can discuss this matter—between Committee and Report in particular—the better, because there is quite a head of steam behind it.
As far as the jurisdiction point is concerned, I think this may be the first time I have heard a Minister talk about the Sorting Hat. I was impressed: I have often compared this place to Hogwarts, but the concept of using the Sorting Hat to decide whether a case goes to a tribunal or a court is a wonderful one. You would probably need artificial intelligence to do that kind of thing nowadays, which is itself a bit of an issue. These may be elaborate amendments but, as the noble Lord, Lord Bassam, said, the case being made here is about the possibility of confusion and a lack of clarity over where jurisdiction lies. It is really important that we determine whether the courts and tribunals themselves understand this and, perhaps more pertinently, whether they have differing views about it.
We need to get to grips with this; the more the Minister can dig into it, and into Delo, Killock and so on, the better. We are all in the foothills here but I am certainly not going to try to unpack those two judgments and the differences between Mrs Justice Farbey and Mr Justice Mostyn, which are well beyond my competency. I thank the Minister.
My Lords, the UK has rightly moved away from the EU concept of supremacy, under which retained EU law would always take precedence over domestic law when the two were in conflict. That is clearly unacceptable now that we have left the EU. However, we understand that the effective functioning of our data protection legislation is of critical importance, and it is appropriate for us to specify the correct relationship between UK and EU-derived pieces of legislation following implementation of the Retained EU Law (Revocation and Reform) Act, or the REUL Act. That is why I am introducing a number of specific government amendments to ensure that the hierarchy of legislation works in the data protection context. These are Amendments 156 to 164 and 297.
Noble Lords may be aware that Clause 49 originally sought to clarify the relationship between the UK’s data protection legislation, specifically the UK GDPR and EU-derived aspects of the Data Protection Act 2018, and future data processing provisions in other legislation, such as powers to share or duties to disclose personal data, as a result of some legal uncertainty created by the European Union (Withdrawal) Act 2018. To resolve this uncertainty, Clause 49 makes it clear that all new data processing provisions in legislation should be read consistently with the key requirements of the UK data protection legislation unless it is expressly indicated otherwise. Since its introduction, the interpretation of pre-EU exit legislation has been altered and there is a risk that this would produce the wrong effect in respect of the interpretation of existing data processing provisions that are silent about their relationship with the data protection legislation.
Amendment 159 will make it clear that the full removal of the principle of EU law supremacy and the creation of a reverse hierarchy in relation to assimilated direct legislation, as provided for in the REUL Act, do not change the relationship between the UK data protection legislation and existing legislation that is in force prior to commencement of Clause 49(2). Amendment 163 makes a technical amendment to the EU withdrawal Act, as amended, to support this amendment.
Amendment 162 is similar to the previous amendment but it concerns the relationship between provisions relating to certain obligations and rights under data protection legislation and on restrictions and prohibitions on the disclosure of information under other existing legislation. Existing Section 186 of the Data Protection Act 2018 governs this relationship. Amendment 162 makes it clear that the relationship between these two types of provision is not affected by the changes to the interpretation of legislation that I have already referred to made by the REUL Act. Additionally, it clarifies that, in relation to pre-commencement legislation, Section 186(1) may be disapplied expressly or impliedly.
Amendment 164 relates to the changes brought about by the REUL Act and sets out that the provisions detailed in earlier Amendments 159, 162 and 163 are to be treated as having come into force on 1 January 2024—in other words, at the same time as commencement of the relevant provisions of the REUL Act.
Amendment 297 provides a limited power to remove provisions that achieve the same effect as new Section 183A from legislation made or passed after this Bill receives Royal Assent, as their presence could cause confusion.
Finally, Amendments 156 and 157 are consequential. Amendments 158, 160 and 161 are minor drafting changes made for consistency, updating and consequential purposes.
Turning to the amendments introduced by the noble Lord, Lord Clement-Jones, I hope that he can see from the government amendments to Clause 49 that we have given a good deal of thought to the impact of the REUL Act 2023 on the UK’s data protection framework and have been prepared to take action on this where necessary. We have also considered whether some of the changes made by the REUL Act could cause confusion about how the UK GDPR and the Data Protection Act 2018 interrelate. Following careful analysis, we have concluded that they would largely continue to be read alongside each other in the intended way, with the rules of the REUL Act unlikely to interfere with this. Any new general rule such as that suggested by the noble Lord could create confusion and uncertainty.
Amendments 168 to 170, 174, 174A and 174B seek to reverse changes introduced by the REUL Act at the end of 2023, specifically the removal of EU general principles from the statute book. EU general principles and certain EU-derived rights had originally been retained by the European Union (Withdrawal) Act to ensure legal continuity at the end of the transition period, but this was constitutionally novel and inappropriate for the long term.
The Government’s position is that EU law concepts should not be used to interpret domestic legislation in perpetuity. The REUL Act provided a solution to this by repealing EU general principles from UK law and clarifying the approach to be taken domestically. The amendments tabled by the noble Lord, Lord Clement-Jones, would undo this important work by reintroducing to the statute book references to rights and principles which have not been clearly defined and are inappropriate now that we have left the EU.
The protection of personal data already forms part of the protection offered by the European Convention on Human Rights, under the Article 8 right to respect for private and family life, and is further protected by our data protection legislation. The UK GDPR and the Data Protection Act 2018 provide a comprehensive set of rules for organisations to follow and rights for people in relation to the use of their data. Seeking to apply an additional EU right to data protection in UK law would not significantly affect the way the data protection framework functions or enhance the protections it affords to individuals. Indeed, doing so may well add unnecessary uncertainty and complexity.
Amendments 171 to 173 pertain to exemptions to specified data subject rights and obligations on data controllers set out in Schedules 2 to 4 to the DPA 2018. The 36 exemptions apply only in specified circumstances and are subject to various safeguards. Before addressing the amendments the noble Lord has tabled, it is perhaps helpful to set out how these exemptions are used. Personal data must be processed according to the requirements set out in the UK GDPR and the DPA 2018. This includes the key principles of lawfulness, fairness and transparency, data minimisation and purpose limitation, among others. The decision to restrict data subjects’ rights, such as the right to be notified that their personal data is being processed, or limit obligations on the data controller, comes into effect only if and when the decision to apply an exemption is taken. In all cases, the use of the exemption must be both necessary and proportionate.
One of these exemptions, the immigration exemption, was recently amended in line with a court ruling that found it was incompatible with the requirements set out in Article 23. This exemption is used by the Home Office. The purpose of Amendments 171 to 173 is to extend the protections applied to the immigration exemption across the other exemptions subject to Article 23, apart from in Schedule 4, where the requirement to consider whether its application prejudices the relevant purposes is not considered relevant.
The other exemptions are each used in very different circumstances, by different data controllers—from government departments to SMEs—and work by applying different tests that function in a wholly different manner from the immigration exemption. This is important to bear in mind when considering these broad-brush amendments. A one-size-fits-all approach would not work across the exemption regime.
It is the Government’s position that any changes to these important exemptions should be made only after due consideration of the circumstances of that particular exemption. In many cases, these amendments seek to make changes that run counter to how the exemption functions. Making changes across the exemptions via this Bill, as the noble Lord’s amendments propose, has the potential to have significant negative impacts on the functioning of the exemptions regime. Any potential amendments to the other exemptions would require careful consideration. The Government note that there is a power to make changes to the exemptions in the DPA 2018, if deemed necessary.
For the reasons I have given, I look forward to hearing more from the noble Lord on his amendments, but I hope that he will not press them. I beg to move.
My Lords, I thank the Minister for that very careful exposition. I feel that we are heavily into wet towel, if not painkiller, territory here, because this is a tricky area. As the Minister might imagine, I will not respond to his exposition in detail at this point; I need to run away and get some external advice on the impact of what he said. He is really suggesting that the Government prefer a pick ‘n’ mix approach to what he regards as a one-size-fits-all approach—I can boil it down to that. He is saying that you cannot just apply the rules across the board, in the sense that we are trying to reverse some of the impacts of the previous legislation. I will set out my stall; no doubt the Minister and I, the Box and others will read Hansard and draw our own conclusions at the end, because this is a complicated area.
Until the end of 2023, the Data Protection Act 2018 had to be read compatibly with the UK GDPR. In a conflict between the two instruments, the provisions of the UK GDPR would prevail. The reversing of the relationship between the 2018 Act and the UK GDPR, through the operation of the Retained EU Law (Revocation and Reform) Act—REUL, as the Minister described it—has had the effect of lowering data protection rights in the UK. The case of the Open Rights Group and the3million v the Secretary of State for the Home Office and the Secretary of State for Digital, Culture, Media and Sport was decided after the UK had left the EU, but before the end of 2023. The Court of Appeal held that exemptions from data subject rights in an immigration context, as set out in the Data Protection Act, were overly broad, contained insufficient safeguards and were incompatible with the UK GDPR. The court disapplied the exemptions and ordered the Home Office to redraft them to include the required safeguards. We debated the regulations the other day, and many noble Lords welcomed them on the basis that they had been revised for the second time.
This sort of challenge is now not possible, because the relationship between the DPA and the UK GDPR has been turned on its head. If the case were brought now, the overly broad exemptions in the DPA would take precedence over the requirement for safeguards set out in the UK GDPR. These points were raised by me in the debate of 12 December, when the Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations 2023 were under consideration. In that debate, the noble Baroness, Lady Swinburne, stated that
“we acknowledge the importance of making sure that data processing provisions in wider legislation continue to be read consistently with the data protection principles in the UK GDPR … Replication of the effect of UK GDPR supremacy is a significant decision, and we consider that the use of primary legislation is the more appropriate way to achieve these effects, such as under Clause 49 where the Government consider it appropriate”.—[Official Report, 12/12/23; col. GC 203.]
This debate on Clause 49 therefore offers an opportunity to reinstate the previous relationship between the UK GDPR and the Data Protection Act. The amendment restores the hierarchy, so that it guarantees the same rights to individuals as existed before the end of 2023, and avoids unforeseen consequences by resetting the relationship between the UK GDPR and the DPA 2018 to what the parliamentary draftsmen intended when the Act was written. The provisions in Clause 49, as currently drafted, address the relationship between domestic law and data protection legislation as a whole, but the relationship between the UK GDPR and the DPA is left in its “reversed” state. This is confirmed in the Explanatory Notes to the Bill at paragraph 503.
The purpose of these amendments is to restore data protection rights in the UK to what they were before the end of 2023, prior to the coming into force of REUL. The amendments would restore the fundamental right to the protection of personal data in UK law; ensure that the UK GDPR and the DPA continue to be interpreted in accordance with the fundamental right to the protection of personal data; ensure that there is certainty that assimilated case law that references the fundamental right to the protection of personal data still applies; and apply the protections required in Article 23 of the UK GDPR to all the relevant exemptions in Schedule 2 to the Data Protection Act. This is crucial in avoiding diminishing trust in our data protection frameworks. If people do not trust that their data is protected, they will refuse to share it. Without this data, new technologies cannot be developed, because these technologies rely on personal data. By creating uncertainty and diminishing standards, the Government are undermining the very growth in new technologies that they want.
My Lords, I have looked at the government amendments in this group and have listened very carefully to what the Minister has said—that it is largely about interpretation. There are no amendments on which I wish to comment individually, save to say that they seem to be about consistency of language and about bringing, in part, EU positions into UK law. They seem also to be about consistency of meaning, and for the most part the intention seems to be to ensure that nothing in retained EU law undoes the pre-existing legal framework.
However, I would appreciate the Minister giving us a bit more detail on the operation of Amendment 164. Amendment 297 seems to deal with a duplication issue, so perhaps he can confirm for the Committee that this is the case. We have had swathes of government amendments of a minor and technical nature, largely about chasing out gremlins from the drafting process. Can he confirm that this is the case and assure the Committee that we will not be left with any nasty surprises in the drafting that need correction at a later date?
The amendments tabled in the name of the noble Lord, Lord Clement-Jones, are of course of a different order altogether. The first two—Amendments 165 and 166—would restore the relationship between the UK GDPR and the 2018 Act and the relevant provisions of the Retained EU Law (Revocation and Reform) Act 2023. Amendment 168 would ensure that assimilated case law referring to the European Charter of Fundamental Rights would still be relevant in interpreting the UK GDPR. It would give greater certainty in how the UK’s data protection framework is interpreted. Amendment 169 would ensure that the interpretation is carried over from the UK GDPR and 2018 legislation in accordance with the general principle of the protection of personal data.
The noble Lord’s Amendments 170 to 174B would bring back into law protections that existed previously, when UK law was more closely aligned with EU law and regulation. There is also an extension of the EU right to the protection of personal data to the standard that existed by virtue of Section 4 of the European Union (Withdrawal) Act 2018. I can well understand the noble Lord’s desire to take the UK back to a position where we are broadly in the same place, in terms of protections, as our former EU partners. First, having—broadly speaking—protections that are common across multiple jurisdictions makes it easier and simpler for companies operating in those markets. Secondly, from the perspective of data subjects, it is much easier to comprehend common standards of data protection and to seek redress when required. The Government, for their part, will no doubt argue that there is some sort of big Brexit benefit in this, although I think that advisers and experts are divided on the degree of that benefit, and indeed on who benefits.
Later, we will get to discuss data adequacy standards. Concern exists in some quarters as to whether we have this right and what this legislative opportunity might be missing to ensure that the UK meets those international standards that the EU requires. That is a debate for later, but we are broadly sympathetic to the desire of the noble Lord, Lord Clement-Jones, to find the highest level of protection for UK citizens. That is the primary motivation for many of the amendments and debates that we have had today. We do not want to weaken what were previously carefully crafted and aligned protections. I do not entirely buy the argument that the Minister made earlier about this group of amendments causing legal uncertainty. I believe it is the reverse of that: the noble Lord, Lord Clement-Jones, is trying to provide greater certainty and a degree of jurisdictional uniformity.
I hope that I have understood what the noble Lord is trying to achieve here. For those reasons, we will listen to the Minister’s concluding comments—and read Hansard—very carefully.
I thank the noble Lords, Lord Clement-Jones and Lord Bassam, for their comments. As the noble Lord, Lord Clement-Jones, points out, it is a pretty complex and demanding area, but that in no way diminishes the importance of getting it right. I hope that in my remarks I can continue that work, but of course I am happy to discuss this: it is a very technical area and, as all speakers have pointed out, it is crucial for our purposes that it be executed correctly.
While the UK remains committed to strong protections for personal data through the UK GDPR and Data Protection Act, it is important that it is able to diverge from the EU legislation where this is appropriate for the UK. We have carefully assessed the effects of EU withdrawal legislation and the REUL Act and are making adjustments to ensure that the right effect is achieved. The government amendments are designed to ensure legal certainty and protect the coherence of the data protection framework following commencement of the REUL Act—for example, by maintaining the pre-REUL Act relationship in certain ways between key elements of the UK data protection legislation and other existing legislation.
The purpose of the REUL Act is to ensure that the UK has control over its laws. Resurrecting the principle of EU law supremacy in its entirety, or continuing to apply case law principles, is not consistent with the UK’s departure from the EU and taking back control of our own laws. These amendments make it clear that changes made to the application of the principle of EU law supremacy, and the new rules relating to the interpretation of assimilated direct legislation under the REUL Act, do not have any impact on existing provisions that involve the processing of personal data.
The noble Lord, Lord Bassam, asked for more detail about Amendment 164. It relates to changes brought about by the REUL Act and sets out that the provisions detailed in Amendments 159, 162 and 163 are to be treated as having come into force on 1 January 2024—in other words, at the same time as commencement of the relevant provisions of the REUL Act. The retrospective effect of this provision addresses the gap between the commencement of the REUL Act 2023 and the Data Protection and Digital Information Bill.
On the immigration exemption case, I note that it was confined to the immigration exemption and did not rule on the other exemptions. The Government will continue to keep the exemptions under review and, should it be required, the Government have the power to amend the other exemptions using an existing power in the DPA 2018. Before doing so, of course the Government would want to ensure that due consideration is given to how the particular exemptions are used. Meanwhile, I thank noble Lords for what has been a fascinating, if demanding, debate.
I thank the noble Lord, Lord Clement-Jones, the noble Baroness, Lady Jones, and my noble friend Lord Kamall for their amendments. To address the elephant in the room first, I can reassure noble Lords that the use of digital identity will not be mandatory, and privacy will remain one of the guiding principles of the Government’s approach to digital identity. There are no plans to introduce a centralised, compulsory digital ID system for public services, and the Government’s position on physical ID cards remains unchanged. The Government are committed to realising the benefits of digital identity technologies without creating ID cards.
I shall speak now to Amendment 177, which would require the rules of the DVS trust framework to be set out in regulations subject to the affirmative resolution procedure. I recognise that this amendment, and others in this group, reflect recommendations from the DPRRC. Obviously, we take that committee very seriously, and we will respond to its report in due course, ahead of Report.
Part 2 of the Bill will underpin the DVS trust framework, a document of auditable rules, which include technical standards. The trust framework refers to data protection legislation and ICO guidance. It has undergone four years of development, consultation and testing within the digital identity market. Organisations can choose to have their services certified against the trust framework to prove that they provide secure and trustworthy digital verification services. Certification is provided by independent conformity assessment bodies that have been accredited by the UK Accreditation Service. Annual reviews of the trust framework are subject to consultation with the ICO and other appropriate persons.
Requiring the trust framework to be set out in regulations would make it hard to introduce reactive changes. For example, if a new cybersecurity threat emerged which required the rapid deployment of a fix across the industry, the trust framework would need to be updated very quickly. Developments in this fast-growing industry require an agile approach to standards and rule-making. We cannot risk the document becoming outdated and losing credibility with industry. For these reasons, the Government feel that it is more appropriate for the Secretary of State to have the power to set the rules of the trust framework with appropriate consultation, rather than for the power to be exercised by regulations.
I turn to Amendments 178 to 195, which would require the fees that may be charged under this part of the Bill to be set out in regulations subject to the negative resolution procedure. The Government have committed to growing a market of secure and inclusive digital identities as an alternative to physical proofs of identity, for those who choose to use them. Fees will be introduced only once we are confident that doing so will not restrict the growth of this market, but the fee structure, when introduced, is likely to be complex and will need to flex to support growth in an evolving market.
There are built-in safeguards to this fee-charging power. First, there is a strong incentive for the Secretary of State to set fees that are competitive, fair and reasonable, because failing to do so would prevent the Government realising their commitment to grow this market. Secondly, these fee-raising powers have a well-defined purpose and limited scope. Thirdly, the Secretary of State will explain in advance what fees she intends to charge and when she intends to charge them, which will ensure the appropriate level of transparency.
The noble Baroness, Lady Jones, asked about the arrangements for the office for digital identities and attributes. It will not initially be independent, as it will be located within the Department for Science, Innovation and Technology. As we announced in the government response to our 2021 consultation, we intend this to be an interim arrangement until a suitable long-term home for the governing body can be identified. Delegating the role of Ofdia—as I suppose we will call it—to a third party in the future is subject to parliamentary scrutiny, as provided for by the clauses in the Bill. Initially placing Ofdia inside government will ensure that its oversight role can mature in the most effective way and that it supports the digital identity market in meeting the needs of individual users, relying parties and industry.
Digital verification services are independently certified against the trust framework rules by conformity assessment bodies. Conformity assessment bodies are themselves independently accredited by the UK Accreditation Service to ensure that they have the competence and impartiality to perform certification. The trust framework certification scheme will be accredited by the UK Accreditation Service to give confidence that the scheme can be efficiently and competently used to certify products, processes and services. All schemes will need to meet internationally agreed standards set out by the UK Accreditation Service. Ofdia, as the owner of the main code, will work with UKAS to ensure that schemes are robust, capable of certification and operated in line with the trust framework.
Amendment 184A proposes to exclude certified public bodies from registering to provide digital verification services. The term “public bodies” could include a wide range of public sector entities, including institutions such as universities, that receive any public funding. The Government take the view that this exclusion would be unnecessarily restrictive in the UK’s nascent digital identity market.
Amendment 195ZA seeks to mandate organisations to implement a non-digital form of verification in every instance where a digital method is required. The Bill enables the use of secure and inclusive digital identities across the economy. It does not force businesses or individuals to use them, nor does it insist that businesses which currently accept non-digital methods of verification must transition to digital methods. As Clause 52 makes clear, digital verification services are services that are provided at the request of the individual. The purpose of the Bill is to ensure that, when people want to use a digital verification service, they know which of the available products and services they can trust.
Some organisations operate only in the digital sphere, such as online-only banks and energy companies. To oblige such organisations to offer manual document checking would place obligations on them that would go beyond the Government’s commitment to do only what is necessary to enable the digital identity market to grow. In so far as this amendment would apply to public authorities, the Equality Act requires those organisations to consider how their services will affect people with protected characteristics, including those who, for various reasons, might not be able or might choose not to use a digital identity product.
Is the Minister saying that, as a result of the Equality Act, there is an absolute right to that analogue—if you like—form of identification if, for instance, someone does not have access to digital services?
On this point, the argument that the Government are making is that, where consumers want to use a digital verification service, all the Bill does is to provide a mechanism for those DVSs to be certified and assured to be safe. It does not seek to require anything beyond that, other than creating a list of safe DVSs.
The Equality Act applies to the public sector space, where it needs to be followed to ensure that there is an absolute right to inclusive access to digital technologies.
My Lords, in essence, the Minister is admitting that there is a gap when somebody who does not have access to digital services needs an identity to deal with the private sector. Is that right?
In the example I gave, I was not willing to use a digital system to provide a guarantee for my son’s accommodation in the private sector. I understand that that would not be protected and that, therefore, someone might not be able to rent a flat, for example, because they cannot provide physical ID.
The Bill does not change the requirements in this sense. If any organisation chooses to provide its services on a digital basis only, that is up to that organisation, and it is up to consumers whether they choose to use it. It makes no changes to the requirements in that space.
I will now speak to the amendment that seeks to remove Clause 80. Clause 80 enables the Secretary of State to ask accredited conformity assessment bodies and registered DVS providers to provide information which is reasonably required to carry out her functions under Part 2 of the Bill. The Bill sets out a clear process that the Secretary of State must follow when requesting this information, as well as explicit safeguards for her use of the power. These safeguards will ensure that DVS providers and conformity assessment bodies have to provide only information necessary for the functioning of this part of the Bill.
My Lords, the clause stand part amendment was clearly probing. Does the Minister have anything to say about the relationship with OneLogin? Is he saying that it is only information about systems, not individuals, which does not feed into the OneLogin identity system that the Government are setting up?
It is very important that the OneLogin system is entirely separate and not considered a DVS. We considered whether it should be, but the view was that that comes close to mandating a digital identity system, which we absolutely want to avoid. Hence the two are treated entirely differently.
That is a good reassurance, but if the Minister wants to unpack that further by correspondence, I would be very happy to have that.
I am very happy to do so.
I turn finally to Amendments 289 and 300, which aim to introduce a criminal offence of digital identity theft. The Government are committed to tackling fraud and are confident that criminal offences already exist to cover the behaviour targeted by these amendments. Under the Fraud Act 2006, it is a criminal offence to make a gain from the use of another person’s identity, or to cause or risk a loss by such use. Where accounts or databases are hacked into, the Computer Misuse Act 1990 criminalises unauthorised access to a computer program or to data held on a computer.
Furthermore, the trust framework contains rules, standards and good practice requirements for fraud monitoring and responding to fraud. These rules will further defend systems and reduce opportunities for digital identity theft.
My Lords, I am sorry, but this is a broad-ranging set of amendments, so I need to intervene on this one as well. When the Minister writes the letter he has promised in response to today’s proceedings, could he tell us what guidance there is to the police on this? When the individual, Mr Arron, approached the police, they said, “Oh, sorry, there’s nothing we can do; identity theft is not a criminal offence”. The Minister seems to be saying, “No, it is fine; it is all encompassed within these provisions”. While he may be saying that, and I am sure he will be shouting it from the rooftops in future, the question is whether the police have guidance; does the College of Policing have guidance and does the Home Office have guidance? The ordinary individual needs to know that it is exactly as the Minister says and that identity theft is covered by these other criminal offences. There is no point in having those offences if nobody knows about them.
That is absolutely fair enough: I will of course write. Sadly, we are not joined today by ministerial colleagues from the Home Office, who have some other Bill going on.
I have no doubt that its contribution to the letter will be equally enjoyable. However, for all the reasons I set out above, I am not able to accept these amendments and respectfully encourage the noble Baroness and noble Lords not to press them.
My Lords, I suppose I am meant to say that I thank the Minister for his response, but I cannot say that it was particularly optimistic or satisfying. On my amendments, the Minister said he would be responding to the DPRRC in due course, and obviously I am interested to see that response, but as the noble Lord, Lord Clement-Jones, said, the committee could not have been clearer and I thought made a very compelling case for why there should be some parliamentary oversight of this main code and, indeed, the fees arrangements.
I understand that it is a fast-moving sector, but the sort of thing that the Delegated Powers Committee was talking about was that the main code should contain some fundamental principles, some user rights and so on. We are not trying to spell out every sort of service that is going to be provided—as the Minister said, it is a fast-moving sector—but people need to have some trust in it, and they need to know what this verification service is going to be about. Just saying that there is going to be a code on such an important area, and that the Secretary of State will write it, is simply not acceptable in terms of basic parliamentary democracy. If it cannot be done through an affirmative procedure, the Government need to come up with another way to make sure that there is appropriate parliamentary input into what is being proposed here.
On the subject of the fees, the Delegated Powers Committee and our amendment were asking only for a negative SI. I thought that was perfectly reasonable on its part, and I am sorry that the Minister is not even prepared to accept that perfectly reasonable suggestion. All in all, I thought that the response on that element was very disappointing.
The response was equally disappointing on the whole issue that the noble Lords, Lord Kamall and Lord Vaux, raised about the right not to have to use the digital verification schemes but to do things on a non-digital basis. The arguments are well made about the numbers of people who are digitally excluded. I was in the debate that the noble Lord referred to, and I cannot remember the statistics now, but something like 17% of the population do not have proper digital access, so we are excluding a large number of people from a whole range of services. It could be applying for jobs, accessing bank accounts or applying to pay the rent for your son’s flat or whatever. We are creating a two-tier system here, for those who are involved and those who are on the margins who cannot use a lot of the services. I would have hoped that the Government would have been much more engaged in trying to find ways through that and providing some guarantees to people.
We know that we are taking a big leap, with so many different services going online. There is a lot of suspicion about how these services are going to work and people do not trust that computers are always as accurate as we would like them to be, so they would like to feel that there is another way of doing it if it all goes wrong. It worries me that the Minister is not able to give that commitment.
I have to say that I am rather concerned by what the Minister said about the private sector—in effect, that it can already require digital-only verification. Surely, in this brave new world we are moving towards, we do not want a digital-only service; this goes back to the point about a whole range of people being excluded. What is wrong with saying, even to those who collect people’s bank account details to pay their son’s rent, “There is an alternative way of doing this as well as you providing all the information digitally”? I am very worried about where all this is going, including who will be part of it and who will not. If the noble Lords, Lord Kamall and Lord Vaux, wish to pursue this at a later point, I would be sympathetic to their arguments.
On identity theft, the noble Lord, Lord Clement-Jones, made a compelling case. The briefing that he read out from the Metropolitan Police said that your data is one of your most valuable assets, which is absolutely right. He also rightly made the point that this is linked to organised crime. It does not happen by accident; some major players are farming our details and using them for all sorts of nefarious activities. There is a need to tighten up the regulation and laws on this. The Minister read out where he thinks this is already dealt with under existing legislation, but we will all want to scrutinise that and see whether that really is the case. There are lots of examples of where the police have not been able to help people and do not know what their rights are, so we need to know exactly what advice has been given to the police.
I feel that the Minister could have done more on this whole group to assure us that we are not moving towards a two-tier world. I will withdraw my amendment, obviously, but I have a feeling that we will come back to this issue; it may be something that we can talk to the Minister about before we get to Report.
My Lords, I thank the noble Baronesses, Lady Bennett, Lady Young of Old Scone and Lady Jones, for their proposed amendments on extending the definition of business data in smart data schemes, the disclosure of climate and nature information to improve public service delivery and the publication of an EU adequacy risk assessment.
On Amendment 195A, we consider that information about the carbon and energy intensity of goods, services or digital content already falls within the scope of “business data” as information about goods, services and digital content supplied or provided by a trader. Development of smart data schemes will, where relevant, be informed by—among other things—the Government’s Environmental Principles Policy Statement, under the Environment Act 2021.
With regard to Amendment 218, I thank the noble Baroness, Lady Young of Old Scone, for her sympathies; they are gratefully received. I will do my best in what she correctly pointed out is quite a new area for me. The powers to share information under Part 5 of the Digital Economy Act 2017—the DEA—are supplemented by statutory codes of practice. These require impact assessments to be carried out, particularly for significant changes or proposals that could have wide-ranging effects on various sectors or stakeholders. These impact assessments are crucial for understanding the implications of the Digital Economy Act and ensuring that it achieves its intended objectives, while minimising any negative consequences for individuals, businesses and society as a whole. As these assessments already cover economic, social and environmental impact, significant changes in approach are already likely to be accounted for. This is in addition to the duty placed on Ministers by the Environment Act 2021 to have due regard to the Environmental Principles Policy Statement.
Lastly, turning to Amendment 296, the Government are committed to maintaining their data adequacy decisions from the EU, which we absolutely recognise play a pivotal role in enabling trade and fighting crime. As noble Lords alluded to, we maintain regular engagement with the European Commission on the Bill to ensure that our reforms are understood.
The EU adequacy assessment of the UK is, of course, a unilateral, autonomous process for the EU to undertake. However, we remain confident that our reforms deliver against UK interests and are compatible with maintaining EU adequacy. As the European Commission itself has made clear, a third country—the noble Lord, Lord Clement-Jones, alluded to this point—is not required to have the same rules as the EU to be considered adequate. Indeed, 15 countries have EU adequacy, including Japan, Israel and the Republic of Korea. All these nations pursue independent and, often, more divergent approaches to data protection.
The Government will provide both written and oral evidence to the House of Lords European Affairs Committee inquiry on UK-EU data adequacy and respond to its final report, which is expected to be published in the summer. Many expert witnesses already provided evidence to the committee and have stated that they believe that the Bill is compatible with maintaining adequacy.
As noble Lords have noted, the Government have published a full impact assessment alongside the Bill, which sets out in more detail what both the costs and financial benefits of the Bill would be—including in the unlikely scenario of the EU revoking the UK’s adequacy decision. I also note that UK adequacy is good for the EU too: every EU company, from multinationals to start-ups, with customers, suppliers or operations in the UK relies on EU-UK data transfers. Leading European businesses and organisations have consistently emphasised the importance of maintaining these free flows of data to the UK.
For these reasons, I hope that the noble Baronesses will agree to withdraw or not move these amendments.
The Minister made the point at the end there that it is in the EU’s interest to agree to our data adequacy. That is an important point but is that what the Government are relying on—the fact that it is in the EU’s interest as much as ours to continue to agree to our data adequacy provisions? If so, what the Minister has said does not make me feel more reassured. If the Government are relying on just that, it is not a particularly strong argument.
My Lords, can I point out, on the interests of the EU, that it does not go just one way? There is a question around investment as well. For example, any large bank that is currently running a data-processing facility in this country that covers the whole of Europe may decide, if we lose data adequacy, to move it to Europe. Anyone considering setting up such a thing would probably go for Europe rather than here. There is therefore an investment draw for the EU here.
I do not know what I could possibly have said to create the impression that the Government are flying blind on this matter. We continue to engage extensively with the EU at junior official, senior official and ministerial level in order to ensure that our proposed reforms are fully understood and that there are no surprises. We engage with multiple expert stakeholders from both the EU side and the UK side. Indeed, as I mentioned earlier, a number of experts have submitted evidence to the House’s inquiry on EU-UK data adequacy and have made clear their views that the DPDI reforms set out in this Bill are compatible with EU adequacy. We continue to engage with the EU throughout. I do not want to be glib or blithe about the risks; we recognise the risks but it is vital—
Could we have a list of the people the noble Lord is talking about?
Yes. I would be happy to provide a list of the people we have spoken to about adequacy; it may be a long one. That concludes the remarks I wanted to make, I think.
Perhaps the Minister could just tweak that a bit by listing not just the people who have made positive noises but those who have their doubts.
I thank my noble friend Lord Holmes, the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, as well as other co-signatories, for their detailed examination of the Bill through these amendments.
I begin by addressing Amendments 197A, 197B and 197C tabled by my noble friend Lord Holmes, which seek to establish a biometrics office responsible for overseeing biometric data use, and place new obligations on organisations processing such data. The Information Commissioner already has responsibility for monitoring and enforcing compliance in relation to the processing of biometric data, and these functions will continue to sit with the new information commission, once established. For example, in March 2023 it investigated the use of live facial recognition in a retail security setting by Facewatch. In February 2024, it took action against Serco Leisure in relation to its use of biometric data to monitor attendance of leisure centre employees.
Schedule 15 to this Bill will also enable the information commission to establish committees of external experts—specialists from outside the organisation with key skills in any number of areas, including biometrics—and to require them to provide the commission with specialist advice. Given the Information Commissioner's existing responsibilities in this area, the Government are of the firm view that the information commission is best placed to continue to oversee the processing of biometric data.
The processing of biometric data for the purpose of uniquely identifying an individual is also subject to heightened safeguards, and organisations can process such data only if they meet one of the conditions of Article 9 of UK GDPR—for example, where processing is necessary to comply with employment law provisions, or for reasons of substantial public interest. Without a lawful basis and compliance with relevant conditions, such processing of biometric data is prohibited.
Amendments 197B and 197C in the name of my noble friend Lord Holmes would also impose new, prescriptive requirements on organisations processing, and intending to process, biometric data, and would set unlimited fines for non-compliance. We consider that such amendments would have significant unintended consequences. There are many everyday uses of biometric data, such as using your thumbprint to access your phone. If every organisation that launched a new product had to comply with the proposed requirements, it would introduce significant and unnecessary new burdens and would discourage innovation, undermining the aims of this Bill. For these reasons, I respectfully ask my noble friend not to move these amendments.
The Government deem Amendment 238 unnecessary, as using biometric data—
I am sorry, but I am wondering whether the Minister is going to say any more on the amendment in the name of the noble Lord, Lord Holmes. Can I be clear? The Minister said that the ICO is best placed to oversee these issues, but the noble Lord's amendment recognises that; it just says that there should be a dedicated biometrics unit with specialists, et cetera, underneath it. I am looking towards the noble Lord—yes, he is nodding in agreement. I do not know that the Minister dismissed that idea, but I think that this would be a good compromise in terms of assuaging our concerns on this issue.
I apologise if I have misunderstood. It sounds like it would be a unit within the ICO responsible for that matter. Let me take that away if I have misunderstood—I understood it to be a separate organisation altogether.
The Government deem Amendment 238 unnecessary, as using biometric data to categorise or make inferences about people, whether using algorithms or otherwise, is already subject to the general data protection principles and the high data protection standards of the UK's data protection framework as personal data. In line with ICO guidance, where the processing of biometric data is intended to make an inference linked to one of the special categories of data—for example, race or ethnic origin—or the biometric data is processed for the intention of treating someone differently on the basis of inferred information linked to one of the special categories of data, organisations should treat this as special category data. These protections ensure that such data is sufficiently protected even where it is not used for identification purposes.
Similarly, Amendment 286 intends to widen the scope of the Forensic Information Databases Service—FINDS—strategy board beyond oversight of biometrics databases for the purpose of identification to include “classification” purposes as well. The FINDS strategy board currently provides oversight of the national DNA database and the national fingerprint database. The Bill puts oversight of the fingerprint database on the same statutory footing as that of the DNA database and provides the flexibility to add oversight of new biometric databases, where appropriate, to provide more consistent oversight in future. The delegated power could be used in the medium term to expand the scope of the board to include a national custody image database, but no decisions have yet been taken. Of course, this will be kept under review, and other biometric databases could be added to the board’s remit in future should these be created and should this be appropriate. For the reasons I have set out, I hope that the noble Baroness, Lady Jones of Whitchurch, will therefore agree not to move Amendments 238 and 286.
Responses to the data reform public consultation in 2021 supported the simplification of the complex oversight framework for police use of biometrics and surveillance cameras. Clauses 147 and 148 of the Bill reflect that by abolishing the Biometrics and Surveillance Camera Commissioner’s roles while transferring the commissioner’s casework functions to the Investigatory Powers Commissioner’s Office.
Noble Lords referred to the CRISP report, which was commissioned by Fraser Sampson—the previous commissioner—and directly contradicts the outcome of the public consultation on data reform in 2021, including on the simplification of the oversight of biometrics and surveillance cameras. The Government took account of all the responses, including from the former commissioner, in developing the policies set out in the DPDI Bill.
There will not be a gap in the oversight of surveillance as it will remain within the statutory regulatory remit of other organisations, such as the Information Commissioner’s Office, the Equality and Human Rights Commission, the Forensic Science Regulator and the Forensic Information Databases Service strategy board.
One of the crucial aspects has been the reporting of the Biometrics and Surveillance Camera Commissioner. Where will there be, and who will produce, a comprehensive report relating to the use of surveillance cameras and the biometric data contained within them? Why have the Government decided to separate out the oversight of biometrics from, in essence, the surveillance aspects? Are not the two irretrievably brought together by things such as live facial recognition?
Yes. There are indeed a number of different elements of surveillance camera oversight; those are reflected in the range of different bodies doing that work. As to the mechanics of the production of the report, I am afraid that I do not know the answer.
Does the Minister accept that the police are one of the key agencies that will be using surveillance cameras? He now seems to be saying, “No, it’s fine. We don’t have one single oversight body; we had four at the last count”. He probably has more to say on this subject but is that not highly confusing for the police when they have so many different bodies that they need to look at in terms of oversight? Is it any wonder that people think the Bill is watering down the oversight of surveillance camera use?
No. I was saying that there was extensive consultation, including with the police, and that that has resulted in these new arrangements. As to the actual mechanics of the production of an overall report, I am afraid that I do not know but I will find out and advise noble Lords.
His Majesty’s Inspectorate of Constabulary and Fire & Rescue Services also inspects, monitors and reports on the efficiency and effectiveness of the police, including their use of surveillance cameras. All of these bodies have statutory powers to take the necessary action when required. The ICO will continue to regulate all organisations’ use of these technologies, including being able to take action against those not complying with data protection law, and a wide range of other bodies will continue to operate in this space.
On the first point made by the noble Lord, Lord Vaux, where any of the privacy issues he raises concern information that relates to an identified or identifiable living individual, I can assure him that this information is covered by the UK's data protection regime. This also includes another issue raised by the noble Lord—where ANPR captures a number plate that can be linked to an identifiable living individual—as this would be the processing of personal data and thus governed by the UK's data protection regime and regulated by the ICO.
For the reasons I have set out, I maintain that these clauses should stand part of the Bill. I therefore hope that the noble Lord, Lord Clement-Jones, will withdraw his stand part notices on Clauses 147 and 148.
Clause 149 does not affect the office of the Biometrics and Surveillance Camera Commissioner, which the noble Lord seeks to maintain through his amendment. The clause’s purpose is to update the name of the national DNA database board and update its scope to include the national fingerprint database within its remit. It will allow the board to produce codes of practice and introduce a new delegated power to add or remove biometric databases from its remit in future via the affirmative procedure. I therefore maintain that this clause should stand part of the Bill and hope that the noble Lord will withdraw his stand part notice.
Clauses 147 and 148 will improve consistency in the guidance and oversight of biometrics and surveillance cameras by simplifying the framework. This follows public consultation, makes the most of the available expertise, improves organisational resilience, and ends confusing and inefficient duplication. The Government feel that a review, as proposed, so quickly after the Bill is enacted is unnecessary. It is for these reasons that I cannot accept Amendment 292 in the name of the noble Lord, Lord Clement-Jones.
I turn now to the amendments tabled by the noble Lord, Lord Clement-Jones, which seek to remove Clauses 130 to 132. These clauses make changes to the Counter-Terrorism Act 2008, which provides the retention regime for biometric data held on national security grounds. The changes have been made only following a formal request from Counter Terrorism Policing to the Home Office. The exploitation of biometric material, including from international partners, is a valuable tool in maintaining the UK’s national security, particularly for ensuring that there is effective tripwire coverage at the UK border. For example, where a foreign national applies for a visa to enter the UK, or enters the UK via a small boat, their biometrics can be checked against Counter Terrorism Policing’s holdings and appropriate action to mitigate risk can be taken, if needed.
My Lords, to go back to some of the surveillance points, one of the issues is the speed at which technology is changing, with artificial intelligence and all the other things we are seeing. One of the roles of the commissioner has been to keep an eye on how technology is changing and to make recommendations as to what we do about the impacts of that. I cannot hear, in anything the noble Viscount is saying, how that role is replicated in what is being proposed. Can he enlighten me?
Yes, indeed. In many ways, this is advantageous. The Information Commissioner obviously has a focus on data privacy, whereas the various other organisations, particularly the BSCC, the EHRC and the FINDS strategy board, have subject-specific areas of expertise and will be better placed to horizon-scan and identify new risks emerging from the technologies most relevant to their areas.
Is the noble Viscount saying that splitting it all up into multiple different places is more effective than having a single dedicated office to consider these things? I must say, I find that very hard to understand.
I do not think we are moving from a simple position. We are moving from a very complex position to a less complex position.
Can the Minister reassure the Committee that, under the Government’s proposals, there will be sufficient reporting to Parliament, every year, from all the various bodies to which he has already referred, so that Parliament can have ample opportunity to review the operation of this legislation as the Bill stands at the moment?
Yes, indeed. The information commission will be accountable to Parliament. It is required to produce transparency and other reports annually. For the other groups, I am afraid that many of them are quite new to me, as this is normally a Home Office area, but I will establish what their accountability is specifically to Parliament, for the BSCC and the—
Will the Minister write to the Committee, having taken advice from his Home Office colleagues?
My Lords, I thank all noble Lords who participated in the excellent debate on this set of amendments. I also thank my noble friend the Minister for part of his response; he furiously agreed with at least a substantial part of my amendments, even though he may not have appreciated it at the time. I look forward to some fruitful and positive discussions on some of those elements between Committee and Report.
When a Bill passes into statute, a Minister and the Government may wish for a number of things in terms of how it is seen and described. One thing that I do not imagine is on the list is for it to be said that the statute generates significant gaps—words put perfectly by the noble Viscount, Lord Stansgate—yet that is certainly the current position. I hope that we have conversations between Committee and Report to address at least some of those gaps and restate some of the positions that exist, before the Bill passes. That would be positive for individuals, citizens and the whole of the country. For the moment, I beg leave to withdraw my amendment and look forward to those subsequent conversations.
(8 months ago)
Grand Committee
I welcome the Committee back after what I hope was a good Easter break for everybody. I thank all those noble Lords who, as ever, have spoken so powerfully in this debate.
I turn to Amendments 111 to 116 and 130. I thank noble Lords for their proposed amendments relating both to Schedule 5, which reforms the UK’s general processing regime for transferring personal data internationally and consolidates the relevant provisions in Chapter 5 of the UK GDPR, and to Schedule 7, which introduces consequential and transitional provisions associated with the reforms.
Amendment 111 seeks to revert to the current list of factors under the UK GDPR that the Secretary of State must consider when making data bridges. With respect, this more detailed list is not necessary as the Secretary of State must be satisfied that the standard of protection in the other country, viewed as a whole, is not materially lower than the standard of protection in the UK. Our new list of key factors is non-exhaustive. The UK courts will continue to be entitled to have regard to CJEU judgments; ultimately, it will be for them to decide how much regard to have to any CJEU judgment on a similar matter.
I completely understand the strength of noble Lords’ concerns about ensuring that our EU adequacy decisions are maintained. This is also a priority for the UK Government, as I and my fellow Ministers have repeatedly made clear in public and on the Floor of the House. The UK is firmly committed to maintaining high data protection standards, now and in future. Protecting the privacy of individuals will continue to be a national priority. We will continue to operate a high-quality regime that promotes growth and innovation and underpins the trustworthy use of data.
Our reforms are underpinned by this commitment. We believe they are compatible with maintaining our data adequacy decisions from the EU. We have maintained a positive, ongoing dialogue with the EU to make sure that our reforms are understood. We will continue to engage with the European Commission at official and ministerial levels with a view to ensuring that our respective arrangements for the free flow of personal data can remain in place, which is in the best interests of both the UK and the EU.
We understand that Amendments 112 to 114 relate to representations made by the National AIDS Trust concerning the level of protection for special category data such as health data. We agree that the protection of people's HIV status is vital. It is right that this is subject to extra protection, as is the case for all health data and special category data. As I have said to this Committee previously, we have met the National AIDS Trust to discuss the best solutions to the problems it has raised. As such, I hope that the noble Lord, Lord Clement-Jones, will agree not to press these amendments.
Can the Minister just recap? He said that he met the trust then swiftly moved on without saying what solution he is proposing. Would he like to repeat that, or at least lift the veil slightly?
The point I was making was only that we have met with it and will continue to do so in order to identify the best possible way to keep that critical data safe.
The Minister is not suggesting a solution at the moment. Is it in the “too difficult” box?
I doubt that it will be too difficult, but identifying and implementing the correct solution is the goal that we are pursuing, alongside our colleagues at the National AIDS Trust.
I am sorry to keep interrogating the Minister, but that is quite an admission. The Minister says that there is a real problem, which is under discussion with the National AIDS Trust. At the moment the Government are proposing a significant amendment to both the GDPR and the DPA, and in this Committee they are not able to say that they have any kind of solution to the problem that has been identified. That is quite something.
I am not sure I accept that it is “quite something”, in the noble Lord’s words. As and when the appropriate solution emerges, we will bring it forward—no doubt between Committee and Report.
On Amendment 115, we share the noble Lords’ feelings on the importance of redress for data subjects. That is why the Secretary of State must already consider the arrangements for redress for data subjects when making a data bridge. There is already an obligation for the Secretary of State to consult the ICO on these regulations. Similarly, when considering whether the data protection test is met before making a transfer subject to appropriate safeguards using Article 46, the Government expect that data exporters will also give consideration to relevant enforceable data subject rights and effective legal remedies for data subjects.
Our rules mean that companies that transfer UK personal data must uphold the high data protection standards we expect in this country. Otherwise, they face action from the ICO, which has powers to conduct investigations, issue fines and compel companies to take corrective action if they fail to comply. We will continue to monitor and mitigate a wide range of data security risks, regardless of provenance. If there is evidence of threats to our data, we will not hesitate to take the necessary action to protect our national security.
My Lords, we heard from the two noble Lords some concrete examples of where those data breaches are already occurring, and it does not appear to me that appropriate action has been taken. There seems to be a mismatch between what the Minister is saying about the processes and the day-to-day reality of what is happening now. That is our concern, and it is not clear how the Government are going to address it.
The Minister mentioned prosecutions and legal redress in the UK from international data transfer breaches. Can he share some examples of that, maybe by letter? I am not aware of that being something with a long precedent.
A number of important points were raised there. Yes, of course I will share—
I am sorry to interrupt my noble friend, but the point I made—this now follows on from other remarks—was that these requirements have been in place for a long time, and we are seeing abuses. Therefore, I was hoping that my noble friend would be able to offer changes in the Bill that would put more emphasis on dealing with these breaches. Otherwise, as has been said, we look as though we are going backwards, not forwards.
As I said, a number of important points were raised there. First, I would not categorise the changes to Article 45 as watering down—they are intended to better focus the work of the ICO. Secondly, the important points raised with respect to Amendment 115 are points primarily relating to enforcement, and I will write to noble Lords setting out examples of where that enforcement has happened. I stress that the ICO is, as noble Lords have mentioned, an independent regulator that conducts enforcement itself. What was described—I cannot judge for sure—certainly sounded like completely illegal infringements of the data privacy of those subjects. I am happy to look further into that and to write to noble Lords.
Amendment 116 seeks to remove a power allowing the Secretary of State to make regulations recognising additional transfer mechanisms. This power is necessary for the Government to react quickly to global trends and to ensure that UK businesses trading internationally are not held back. Furthermore, before using this power, the Secretary of State must be satisfied that the transfer mechanism is capable of meeting the new Article 46 data protection test. They are also required to consult the Information Commissioner and such other persons as they consider appropriate. The affirmative resolution procedure will also ensure appropriate parliamentary scrutiny.
I reiterate that the UK Government’s assessment of the reforms in the Bill is that they are compatible with maintaining adequacy. We have been proactively engaging with the European Commission since the start of the Bill’s consultation process to ensure that it understands our reforms and that we have a positive, constructive relationship. Noble Lords will appreciate that it is important that officials have the ability to conduct candid discussions during the policy-making process. However, I would like to reassure noble Lords once again that the UK Government take the matter of retaining our adequacy decisions very seriously.
Finally, Amendment 130 pertains to EU exit transitional provisions in Schedule 21 to the Data Protection Act 2018, which provide that certain countries are currently deemed as adequate. These countries include the EU and EEA member states and those countries that the EU had found adequate at the time of the UK’s exit from the EU. Such countries are, and will continue to be, subject to ongoing monitoring. As is the case now, if the Secretary of State becomes aware of developments such as changes to legislation or specific practices that negatively impact data protection standards, the UK Government will engage with the relevant authorities and, where necessary, amend or revoke data bridge arrangements.
For these reasons, I hope noble Lords will not press their amendments.
My Lords, I thank the Minister for his response, but I am still absolutely baffled as to why the Government are doing what they are doing on Article 45. The Minister has not given any particular rationale. He has given a bit of a rationale for resisting the amendments, many of which try to make sure that Article 45 is fully effective, that these international transfers are properly scrutinised and that we remain data adequate.
By the way, I thought the noble Lord, Lord Kirkhope, made a splendid entry into our debate, so I hope that he stays on for a number of further amendments—what a début.
The only point on which I disagreed with the noble Lord, Lord Bethell—as the noble Baroness, Lady Jones, said—was when he said that this is a terrific Bill. It is a terrifying Bill, not a terrific one, as we have debated. There are so many worrying aspects—for example, that there is no solution yet for sensitive special category data and the whole issue of these contractual clauses. The Government seem almost to be saying that it is up to the companies to assess all this and whether a country in which they are doing business is data adequate. That cannot be right. They seem to be abrogating their responsibility for no good reason. What is the motive? Is it because they are so enthusiastic about transfer of data to other countries for business purposes that they are ignoring the rights of data subjects?
The Minister resisted describing this as watering down. But why get rid of the list of considerations that the Secretary of State must take into account, so that they are just in the mix as something that may or may not be taken into consideration? In the existing article they are specified. It is quite a long list and the Government have chopped it back. What is the motive for that? It looks as though data subjects' rights are being curtailed. We were baffled by previous elements that the Government have introduced into the Bill, but this is probably the most baffling of all because of its real importance—its national security implications and the existing examples, such as Yandex, that we heard about from the noble Lord, Lord Kirkhope.
Of course we understand that there are nuances and that there is a difference between adequacy and equivalence. We have to be pragmatic sometimes, but the question of whether these countries having data transferred to them are adequate must be based on principle. This seems to me a prime candidate for Report. I am sure we will come back to it, but in the meantime I beg leave to withdraw.
My Lords, I am grateful to the noble Lord, Lord Bethell, and his co-signatories for bringing this comprehensive amendment before us this afternoon. As we have heard, this is an issue that was debated at length during the passage of the Online Safety Act. It is, in effect, unfinished business. I pay tribute to the noble Lords who shepherded that Bill through the House so effectively. It is important that we tie up the ends of all the issues. The noble Lord made significant progress, but those issues that remain unresolved come, quite rightly, before us now, and this Bill is an appropriate vehicle for resolving those outstanding issues.
As has been said, the heart of the problem is that tech companies are hugely protective of the data they hold. They are reluctant to share it or to give any insight on how their data is farmed and stored. They get to decide what access is given, even when there are potentially illegal consequences, and they get to judge the risk levels of their actions without any independent oversight.
During the course of the Online Safety Bill, the issue was raised not only by noble Lords but by a range of respected academics and organisations representing civil society. They supported the cross-party initiative from Peers calling for more independent research into online safety issues, along with greater democratic oversight and accountability. In particular, as we have heard, colleagues identified a real need for approved researchers to check the risks of non-compliance with UK regulatory law by large tech companies—particularly those with large numbers of children accessing their services. This arose because of the increasing anecdotal evidence that children's rights were being ignored or exploited. The noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, have given an excellent exposition of the potential and real harms that continue to arise from the lack of regulatory action on these issues.
Like other noble Lords, I welcome this amendment. It is well-crafted, takes a holistic approach to the problem, makes the responsibilities of the large tech companies clear and establishes a systematic research base of vetted researchers to check compliance. It also creates important criteria for the authorisation of those vetted researchers: the research must be in the public interest, must be transparent, must be carried out by respected researchers, and must be free from commercial interests so that companies cannot mark their own homework. As has been said, it mirrors the provisions in the EU Digital Services Act and ensures comparable research opportunities. That is an opportunity for the UK to maintain its status as one of the top places in the world for expertise on the impact of online harms.
Since the Online Safety Act was passed, the Information Commissioner has been carrying out further work on the children’s code of practice. The latest update report says:
“There has been significant progress and many organisations have started to assess and mitigate the potential privacy risks to children on their platforms”.
That is all well and good but the ICO and other regulators are still reliant on the information provided by the tech companies on how their data is used and stored and how they mitigate risk. Their responsibilities would be made much easier if they had access to properly approved and vetted independent research information that could inform their decisions.
I am grateful to noble Lords for tabling this amendment. I hope that the Minister hears its urgency and necessity and that he can assure us that the Government intend to table a similar amendment on Report—as the noble Baroness, Lady Kidron, said, no more “wait and see”. The time has come to stop talking about this issue and take action. Like the noble Lord, Lord Clement-Jones, I was in awe of the questions that the noble Baroness came up with and do not envy the Minister in trying to answer them all. She asked whether, if necessary, it could be done via a letter but I think that the time has come on this and some other issues to roll up our sleeves, get round the table and thrash it out. We have waited too long for a solution and I am not sure that exchanges of letters will progress this in the way we would hope. I hope that the Minister will agree to convene some meetings of interested parties—maybe then we will make some real progress.
My Lords, as ever, many thanks to all noble Lords who spoke in the debate.
Amendment 135, tabled by my noble friend Lord Bethell, would enable researchers to access data from data controllers and processors in relation to systemic risks to the UK and non-compliance with regulatory law. The regime would be overseen by the ICO. Let me take this opportunity to thank both my noble friend for the ongoing discussions we have had and the honourable Members in the other place who are also interested in this measure.
Following debates during the passage of the Online Safety Act, the Government have been undertaking further work in relation to access to data for online safety researchers. This work is ongoing and, as my noble friend Lord Bethell will be aware, the Government are having ongoing conversations on this issue. As he knows, the online safety regime is very broad and covers issues that have an impact on national security and fraud. I intend to write to the Committee with an update on this matter, setting out our progress ahead of Report, which should move us forward.
While we recognise the benefits of improving researchers’ access to data—for example, using data to better understand the impact of social media on users—this is a highly complex issue with several risks that are not currently well understood. Further analysis has reiterated the complexities of the issue. My noble friend will agree that it is vital that we get this right and that any policy interventions are grounded in the evidence base. For example, there are risks in relation to personal data protection, user consent and the disclosure of commercially sensitive information. Introducing a framework to give researchers access to data without better understanding these risks could have significant consequences for data security and commercially sensitive information, and could potentially destabilise any data access regime as it is implemented.
In the meantime, the Online Safety Act will improve the information available to researchers by empowering Ofcom to require major providers to publish a broad range of online safety information through annual transparency reports. Ofcom will also be able to appoint a skilled person to undertake a report to assess compliance or to develop its understanding of the risk of non-compliance and how to mitigate it. This may include the appointment of independent researchers as skilled persons. Further, Ofcom is required to conduct research into online harms and has the power to require companies to provide information to support this research activity.
Moving on to the amendment specifically, it is significantly broader than online safety and the EU's parallel Digital Services Act regime. Any data controllers and processors would be in scope if they have more than 1 million UK users or customers, if there is a large concentration of child users or if the service is high-risk. This would include not just social media platforms but any organisation, including those in financial services, broadcasting and telecoms, as well as other large businesses. Although we are carefully considering international approaches to this issue, it is worth noting that much of the detail about how the data access provisions in the Digital Services Act will work in practice is yet to be determined. Any policy interventions in this space should be predicated on a robust evidence base, which we are in the process of developing.
The amendment would also enable researchers to access data to research systemic risks to compliance with any UK regulatory law that is upheld by the ICO, Ofcom, the Competition and Markets Authority, and the Financial Conduct Authority. The benefits and risks of such a broad regime are not understood and are likely to vary across sectors. It is also likely to be inappropriate for the ICO to be the sole regulator tasked with vetting researchers across the remits of the other regulators. The ICO may not have the necessary expertise to make this determination about areas of law that it does not regulate.
Ofcom already has the power to gather information that it requires for the purpose of exercising its online safety functions. This power applies to companies in scope of the duties and, where necessary, to other organisations or persons who may have relevant information. Ofcom can also issue information request notices to overseas companies as well as to UK-based companies. The amendment is also not clear about the different types of information that a researcher may want to access. It refers to data controllers and processors—concepts that relate to the processing of personal data under data protection law—yet researchers may also be interested in other kinds of data, such as information about a service's systems and processes.
Although the Government continue to consider this issue—I look forward to setting out our progress between now and Report—for the reasons I have set out, I am not able to accept this amendment. I will certainly write to the Committee on this matter and to the noble Baroness, Lady Kidron, with a more detailed response to her questions—there were more than four of them, I think—in particular those about Ofcom.
Perhaps I could encourage the Minister to say at least whether he is concerned that a lack of evidence might be impacting on the codes and powers that we have given to Ofcom in order to create the regime. I share his slight regret that Ofcom does not have this provision that is in front of us. It may be that more than one regulator needs access to research data but it is the independents that we are talking about. We are not talking about Ofcom doing things and the ICO doing things. We are talking about independent researchers doing things so that the evidence exists. I would like to hear just a little concern that the regime is suffering from a lack of evidence.
I am thinking very carefully about how best to answer. Yes, I do share that concern. I will set this out in more detail when I write to the noble Baroness and will place that letter in the House of Lords Library. In the meantime, I hope that my noble friend will withdraw his amendment.
I am enormously grateful to the Minister for his response. However, it falls short of my hopes. Obviously, I have not seen the letter that he is going to send us, but I hope that the department will have taken on board the commitments made by previous Ministers during discussions on the Online Safety Bill and the very clear evidence that the situation is getting worse, not better.
Any hope that the tech companies would somehow have heard the debate in the House of Lords and that it would have occurred to them that they needed to step up to their responsibilities has, I am afraid, been dashed by their behaviours in the last 18 months. We have seen a serious withdrawal of existing data-sharing provisions. As we approach even more use of AI, the excitement of the metaverse, a massive escalation in the amount of data and the impact of their technologies on society, it is extremely sobering to think that there is almost no access to the black box of their data.
My Lords, I am grateful to the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for tabling these amendments and raising important points about the Information Commissioner’s independence and authority to carry out his role efficiently. The amendments from the noble Lord, Lord Clement-Jones, range widely, and I have to say that I have more sympathy with some of them than others.
I start by welcoming some of the things in the Bill—I am very pleased to be able to do this. It is important that we have an independent regulator that is properly accountable to Parliament, and this is vital for a properly functioning data protection regime. We welcome a number of the changes that have been made to the ICO’s role in the Bill. In particular, we think the move to have a board and a chief executive model, with His Majesty appointing the chair of the board, is the right way to go. We also welcome the strengthening of enforcement powers and the obligation to establish stakeholder panels to inform the content of codes of practice. The noble Baroness, Lady Kidron, also highlighted that.
However, we share the concern of the noble Lord, Lord Clement-Jones, about the requirement for the Secretary of State to publish, every three years, a statement of strategic priorities for the commissioner to consider, respond to and have regard to. We share his view, and that of many stakeholder groups, that this crosses the line into political involvement and exposes the ICO to unwarranted political direction and manipulation. We do not believe that this wording provides sufficient safeguards from that in its current form.
I have listened carefully to the explanation of the noble Lord, Lord Clement-Jones, of Amendment 138. I understand his concern, but we are going in a slightly different direction to him on this. We believe that the reality is that the ICO does not have the resources to investigate every complaint. The commissioner needs to apply a degree of strategic prioritisation in the public interest. I think that the original wording in the Bill, rather than the noble Lord's amendment, achieves that objective more clearly.
Amendment 140, in the name of the noble Lord, Lord Clement-Jones, raises a significant point about businesses being given assured advice to ensure that they follow the procedures correctly, and we welcome that proposal. There is a role for leadership of the ICO in this regard. His proposal also addresses the Government’s concern that data controllers struggle to understand how they should be applying the rules. This is one of the reasons for many of the changes that we have considered up until now. I hope that the Minister will look favourably on this proposal and agree that we need to give more support to businesses in how they follow the procedures.
Finally, I have added my name to the amendment of the noble Baroness, Lady Kidron, which rightly puts a deadline on the production of any new codes of practice, and a deadline on the application of any transitional arrangements which apply in the meantime. We have started using the analogy of the codes losing their champions, and in general terms she is right. Therefore, it is useful to have a deadline, and that is important to ensure delivery. This seems eminently sensible, and I hope the Minister agrees with this too.
Amendment 150 from the noble Baroness, Lady Kidron, also requires the ICO annual report to spell out specifically the steps being taken to roll out the age-appropriate design code and to specifically uphold children’s data rights. Going back to the codes losing their champions, I am sure that the Minister got the message from the noble Baronesses, Lady Kidron and Lady Harding, that in this particular case, this is not going to happen, and that this code and the drive to deliver it will be with us for some time to come.
The noble Baroness, Lady Kidron, raised concerns about the approach of the ICO, which need to be addressed. We do not want a short-term approach but a longer-term approach, and we want some guarantees that the ICO is going to address some of the bigger issues that are being raised by the age-appropriate design code and other codes. Given the huge interest in the application of children’s data rights in this and other Bills, I am sure that the Information Commissioner will want to focus his report on his achievements in this space. Nevertheless, for the avoidance of doubt, it is useful to have it in the Bill as a specific obligation, and I hope the Minister agrees with the proposal.
We have a patchwork of amendments here. I am strongly in support of some; on others, perhaps the noble Lord and I can debate further outside this Room. In the meantime, I am interested to hear what the Minister has to say.
I thank the noble Lord, Lord Clement-Jones, the noble Baroness, Lady Kidron, and other noble Lords who have tabled and signed amendments in this group. I also observe what a pleasure it is to be on a Committee with Batman and Robin—which I was not expecting to say, and which may be Hansard’s first mention of those two.
The reforms to the Information Commissioner's Office within the Bill introduce a strategic framework of objectives and duties to provide context and clarity on the commissioner's overarching objectives. The reforms also put best regulatory practice on to a statutory footing and bring the ICO's responsibilities into line with those of other regulators.
With regard to Amendment 138, the principal objective upholds data protection in an outcomes-focused manner, highlighting the discretion of the Information Commissioner in securing that objective while reinforcing the primacy of data protection. The requirement to promote trust and confidence in the use of data will encourage innovation across current and emerging technologies.
I turn now to the question of Clause 32 standing part. As part of our further reforms, the Secretary of State can prepare a statement of strategic priorities for data protection, which positions these aims within the Government's wider policy agenda, thereby giving the commissioner helpful context for their activities. While the commissioner must take the statement into account when carrying out functions, they are not required to act in accordance with it. This means that the statement will not be used in a way to direct what the commissioner may and may not do when carrying out their functions.
Turning to Amendment 140, we believe that the commissioner should have full discretion to enforce data protection in an independent, flexible, risk-based and proportionate manner. This amendment would tie the hands of the regulator and force them to give binding advice and proactive assurance without necessarily having full knowledge of the facts, undermining their regulatory enforcement role.
In response to the amendments concerning Clauses 33 to 35 standing part, I can say that we are introducing a series of measures to increase accountability, robustness and transparency in the codes of practice process, while safeguarding the Information Commissioner's role. The requirements for impact assessments and a panel of experts mean that the codes will consider the application to, and impact on, all potential use cases. Given that the codes will have the force of law, the Secretary of State must have the ability to give her or his comments. The Information Commissioner is required to consider but not to act on those comments, preserving the commissioner's independence. It remains for Parliament to give approval for any statutory code produced.
Amendments 142 and 143 impose a requirement on the ICO to prepare codes and for the Secretary of State to lay them in Parliament as quickly as practicable. They also limit the time that transitional provisions can be in place to a maximum of 12 months. This could mean that drafting processes are truncated or valid concerns are overlooked to hit a statutory deadline, rather than the codes being considered properly to reflect the relevant perspectives.
Given the importance of ensuring that any new codes are robust, comprehensive and considered, we do not consider imposing time limits on the production of codes to be a useful tool.
Finally, Amendment 150—
We had this debate during the passage of the Online Safety Act. In the end, we all agreed—the House, including the Government, came to the view—that two and a half years, which is 18 months plus a transition period, was an almost egregious amount of time considering the rate at which the digital world moves. So, to consider that more than two and a half years might be required seems a little bit strange.
I absolutely recognise the need for speed, and my noble friend Lady Harding made this point very powerfully as well, but what we are trying to do is juggle that need with the need to go through the process properly to design these things well. Let me take it away and think about it more, to make sure that we have the right balancing point. I very much see the need; it is a question of the machinery that produces the right outcome in the right timing.
Before the Minister sits down, I would very much welcome a meeting, as the noble Baroness, Lady Harding, suggested. I do not think it is useful for me to keep standing up and saying, “You are watering down the code”, and for the Minister to stand up and say, “Oh no, we’re not”. We are not in panto here, we are in Parliament, and it would be a fantastic use of all our time to sit down and work it out. I would like to believe that the Government are committed to data protection for children, because they have brought forward important legislation in this area. I would also like to believe that the Government are proud of a piece of legislation that has spread so far and wide—and been so impactful—and that they would not want to undermine it. On that basis, I ask the Minister to accede to the noble Baroness’s request.
I am very happy to try to find a way forward on this. Let me think about how best to take this forward.
My Lords, I thank the Minister for his response and, in particular, for that exchange. There is a bit of a contrast here—the mood of the Committee is probably to go with the grain of these clauses and to see whether they can be improved, rather than throw out the idea of an information commission and revert to the ICO, on the basis that perhaps the information commission is a more logical way of setting up a regulator. I am not sure that I personally agree, but I understand the reservations of the noble Baroness, Lady Jones, and I welcome her support on the aspect of the Secretary of State power.
We keep being reassured by the Minister, in all sorts of different ways. I am sure that the spirit is willing, but whether it is all in black and white is the big question. Where are the real safeguards? The proposals in this group from the noble Baroness, Lady Kidron, to which she has spoken so well, along with the noble Baroness, Lady Harding, are very modest, to use the phrase from the noble Baroness, Lady Kidron. I hope those discussions will take place because they fit entirely with the architecture of the Bill, which the Government have set out, and it would be a huge reassurance to those who believe that the Bill is watering down data subject rights and is not strengthening children's rights.
I am less reassured by other aspects of what the Minister had to say, particularly about the Secretary of State's powers in relation to the codes. As the noble Baroness, Lady Kidron, said, we had a lot of discussion about that in relation to the Ofcom codes, under the Online Safety Bill, and I do not think we got very far on that either. Nevertheless, there is disquiet about whether the Secretary of State should have those powers. The Minister said that the ICO is not required to act in accordance with the advice of the Secretary of State, so perhaps he has provided a chink of light. In the meantime, I beg leave to withdraw the amendment.
My Lords, I have added my name to Amendment 146 in the name of the noble Baroness, Lady Kidron, and I thank all noble Lords who have spoken.
These days, most children learn to swipe an iPad long before they learn to ride a bike. They are accessing the internet at ever younger ages on a multitude of devices. Children are choosing to spend more time online, browsing social media, playing games and using apps. However, we also force children to spend an increasing amount of time online for their education. A growing trend over the last decade or more, this escalated during the pandemic. Screen time at home became lesson time; it was a vital educational lifeline for many in lockdown.
Like other noble Lords, I am not against edtech, but the reality is that the necessary speed of the transition meant that insufficient regard was paid to children’s rights and the data practices of edtech. The noble Baroness, Lady Kidron, as ever, has given us a catalogue of abuses of children’s data which have already taken place in schools, so there is a degree of urgency about this, and Amendment 146 seeks to rectify the situation.
One in five UK internet users are children. Schools are assessing their work online; teachers are using online resources and recording enormous amounts of sensitive data about every pupil. Edtech companies have identified that such a large and captive population is potentially profitable. This amendment reinforces that children are also a vulnerable population and that we must safeguard their data and personal information on this basis. Their rights should not be traded in as the edtech companies chase profits.
The code of practice proposed in this amendment establishes standards for companies to follow, in line with the fundamental rights and freedoms set out in the UN Convention on the Rights of the Child. It asserts that children are entitled to a higher degree of protection than adults in the digital realm. It would oblige the commissioner to prepare a code of practice which ensures this. It underlines that consultation with individuals and organisations who have the best interests of children at heart is vital, so that the enormous edtech companies cannot bamboozle already overstretched teachers and school leaders.
In education, data about children has always been processed in schools. It is necessary for the school's functioning and to monitor the educational development of individual children. Edtech is now becoming a permanent fixture in children's schooling and education, but it is largely untested, unregulated and unaccountable. Currently, it is impossible to know what data is collected by edtech providers and how they are using it. This blurs the boundaries between the privacy-preserving and commercial parts of services profiting from children's data.
Why is this important? First, education data can reveal particularly sensitive and protected characteristics about children: their ethnicity, religion, disability or health status. Such data can also be used to create algorithms that profile children and predict or assess their academic ability and performance; it could reinforce prejudice, create siloed populations or entrench low expectations. Secondly, there is a risk that data-profiling children can lead to deterministic outcomes, defining too early what subjects a child is good at, how creative they are and what they are interested in. Safeguards must be put in place in relation to the processing of children's personal data in schools to protect those fundamental rights. Thirdly, of course, is money. Data is appreciating in value, resulting in market pressure for data to be collected, processed, shared and reused. Increasingly, the processing of such data from children in schools is facilitated by edtech, an already major and expanding sector with a projected value of £3.4 billion.
The growth of edtech’s use in schools is promoted by the Department for Education’s edtech strategy, which sets out a vision for edtech to be an
“inseparable thread woven throughout the processes of teaching and learning”.
Yet the strategy gives little weight to data protection beyond noting the importance of preventing data breaches. Tech giants have become the biggest companies in the world because they own data on us. Schoolchildren have little choice as to their involvement with these companies in the classroom, so we have a moral duty to ensure that they are protected, not commodified or exploited, when learning. It must be a priority for the Government to keep emerging technologies in education under regular review.
Equally important is that the ICO should invest in expertise specific to the domain of education. By regularly reviewing emerging technologies—those already in use and those proposed for use—in education, and their potential risks and impacts, such experts could provide clear and timely guidance for schools to protect individual children and entire cohorts. Amendment 146 would introduce a new code of practice on the processing and use of children’s data by edtech providers. It would also ensure that edtech met their legal obligations under the law, protected children’s data and empowered schools.
I was pleased to hear that the noble Baroness, Lady Kidron, has had constructive discussions with the Education Minister, the noble Baroness, Lady Barran. The way forward on this matter is some sort of joint work between the two departments. The noble Baroness, Lady Kidron, said that she hopes the Minister today will respond with equal positivity; he could start by supporting the principles of this amendment. Beyond that, I hope that he will agree to liaise with the Department for Education and embrace the noble Baroness’s request for more meetings to discuss this issue on a joint basis.
I am grateful, as ever, to the noble Baroness, Lady Kidron, for both Amendment 146 and her continued work in championing the protection of children.
Let me start by saying that the Government strongly agree with the noble Baroness that all providers of edtech services must comply with the law when collecting and making decisions about the use of children’s data throughout the duration of their processing activities. That said, I respectfully submit that this amendment is not necessary, for the reasons I shall set out.
The ICO already has existing codes and guidance for children and has set out guidance about how the children’s code, data protection and e-privacy legislation apply to edtech providers. Although the Government recognise the value that ICO codes can have in promoting good practice and improving compliance, they do not consider that it would be appropriate to add these provisions to the Bill without further detailed consultation with the ICO and the organisations likely to be affected by them.
The guidance covers broad topics, including choosing a lawful basis for the processing; rules around information society services; targeting children with marketing; profiling children or making automated decisions about them; data sharing; children’s data rights; and exemptions relating to children’s data. Separately, as we have discussed throughout this debate, the age-appropriate design code deals specifically with the provision of online services likely to be accessed by children in the UK; this includes online edtech services. I am pleased to say that the Department for Education has begun discussions with commercial specialists to look at strengthening the contractual clauses relating to the procurement of edtech resources to ensure that they comply with the standards set out in the UK GDPR and the age-appropriate design code.
On the subject of requiring the ICO to develop a report with the edtech sector, with a view to creating a certification scheme and assessing compliance and conformity with data protection, we believe that such an approach should be at the discretion of the independent regulator.
The issues that have been raised in this very good, short debate are deeply important. Edtech is an issue that the Government are considering carefully—especially the Department for Education, given the increasing time spent online for education. I note that the DPA 2018 already contains a power for the Secretary of State to request new codes of practice, which could include one on edtech if the evidence warranted it. I would be happy to return to this in future but consider the amendment unnecessary at this time. For the reasons I have set out, I am not able to accept the amendment and hope that the noble Baroness will withdraw it.
I thank everyone who spoke, particularly for making it absolutely clear that not one of us, including myself, is against edtech. We just want it to be fair and want the rules to be adequate.
I am particularly grateful to the noble Baroness, Lady Jones, for detailing what education data includes. It might feel as though it is just about someone’s exam results or something that might already be public, but it can include things such as how often they go to see the nurse, what their parents’ immigration status is or whether they are late. There is a lot of information quite apart from the personalised education provision to which the noble Baroness referred. In fact, we have a great deal of emerging evidence that such provision has no pedagogical basis. There is also the question of huge investment right across the sector in products about which we know very little. I thank the noble Baroness for that.
As to the Minister’s response, I hope that he will forgive me for being disappointed. I am grateful to him for reminding us that the Secretary of State has that power under the DPA 2018. I would love for her to use that power but, so far, it has not been forthcoming. The evidence we saw from the freedom of information request is that the scheme the department wanted to put in place has been totally retracted—and clearly for resource reasons rather than because it is not needed. I find it quite surprising that the Minister can suggest that it is all gung ho here in the UK but that Germany, Holland, France, et cetera are being hysterical in regard to this issue. Each one of them has found it to be egregious.
Finally, the AADC applies only to information society services; there is an exception for education. Where they are joint controllers, they are outsourcing the problems to the schools, which have no level of expertise in this and just take default settings. It is not good enough, I am afraid. I feel bound to say this: I understand the needs of parliamentary business, which puts just a handful of us in this Room to discuss things out of sight, but, if the Government are not willing to protect children’s data at school, when schools are in loco parentis to our children, I am really bewildered as to what this Bill is for. Education is widely understood to be a social good, but we are downgrading the data protections for children and rejecting every single positive move that anybody has made in Committee. I beg leave to withdraw my amendment but I will bring this back on Report.
(8 months, 3 weeks ago)
Grand Committee
My Lords, I thank all noble Lords who have contributed to this debate. We have had a major common theme, which is that any powers exercised by the Secretary of State in Clause 14 should be to enhance, rather than diminish, the protections for a data subject affected by automated decision-making. We have heard some stark and painful examples of the way in which this can go wrong if it is not properly regulated. As noble Lords have said, this seems to be regulation on automated decision-making by the back door, but with none of the protections and promises that have been made on this subject.
Our Amendment 59 goes back to our earlier debate about rights at work when automated decision-making is solely or partly in operation. It provides an essential underpinning of the Secretary of State’s powers. The Minister has argued that ADM is a new development and that it would be wrong to be too explicit about the rules that should apply as it becomes more commonplace, but our amendment cuts through those concerns by putting key principles in the Bill. They are timeless principles that should apply regardless of advances in the adoption of these new technologies. They address the many concerns raised by workers and their representatives, about how they might be disfranchised or exploited by machines, and put human contact at the heart of any new processes being developed. I hope that the Minister sees the sense of this amendment, which will provide considerable reassurance for the many people who fear the impact of ADM in their working lives.
I draw attention to my Amendments 58 and 73, which implement the recommendations of the Delegated Powers and Regulatory Reform Committee. In the Bill, the new Articles 22A to 22D enable the Secretary of State to make further provisions about safeguards when automated decision-making is in place. The current wording of new Article 22D makes it clear that regulations can be amended
“by adding or varying safeguards”.
The Delegated Powers Committee quotes the department saying that
“it does not include a power to remove safeguards provided in new Article 22C and therefore cannot be exercised to weaken the protections”
afforded to data subjects. The committee is not convinced that the department is right about this, and we agree with its analysis. Surely “vary” means that the safeguards can move in either direction—to improve or reduce protection.
The committee also flags up concerns that the Bill’s amendments to Sections 49 and 50 of the Data Protection Act make specific provision about the use of automated decision-making in the context of law enforcement processing. In this new clause there is equivalent wording: the regulations may add or vary safeguards. Again, we agree with its concerns about the application of these powers to the Secretary of State. It is not enough to say that these powers are subject to the affirmative procedure because, as we know and have discussed, the limits on effective scrutiny of secondary legislation are manifest.
We have therefore tabled Amendments 58 and 73, which make it much clearer that the safeguards cannot be reduced by the Secretary of State. The noble Lord, Lord Clement-Jones, has a number of amendments with a similar intent, which is to ensure that the Secretary of State can add new safeguards but not remove them. I hope the Minister is able to commit to taking on board the recommendations of the Delegated Powers Committee in this respect.
The noble Baroness, Lady Kidron, once again made the powerful point that the Secretary of State’s powers to amend the Data Protection Act should not be used to reduce the hard-won standards and protections for children’s data. As she says, safeguards do not constitute a right, and having regard to the issues is a poor substitute for putting those rights back into the Bill. So I hope the Minister is able to provide some reassurance that the Bill will be amended to restore these hard-won rights to where they belong.
I am sorry that the noble Lord, Lord Holmes, is not here. His amendment raises an important point about the need to build in the views of the Information Commissioner, which is a running theme throughout the Bill. He makes the point that we need to ensure, in addition, that a proper consultation of a range of stakeholders goes into the Secretary of State’s deliberations on safeguards. We agree that full consultation should be the hallmark of the powers that the Secretary of State is seeking, and I hope the Minister can commit to taking those amendments on board.
I echo the specific concerns of the noble Lord, Lord Clement-Jones, about the impact assessment and the supposed savings from changing the rules on subject access requests. This is not specifically an issue for today’s debate but, since it has been raised, I would like to know whether he is right that the savings are estimated to be 50% and not the 1% that the Minister suggested when we last debated this. I hope the Minister can clarify this discrepancy on the record, and I look forward to his response.
I thank the noble Lords, Lord Clement-Jones and Lord Knight, my noble friend Lord Holmes and the noble Baronesses, Lady Jones, Lady Kidron and Lady Bennett—
I apologise to my noble friend. I cannot be having a senior moment already—we have only just started. I look forward to reading that part in Hansard.
I can reassure noble Lords that data subjects still have the right to object to solely automated decision-making. It is not an absolute right in all circumstances, but I note that it never has been. The approach taken in the Bill complements the UK’s AI regulation framework, and the Government are committed to addressing the risks that AI poses to data protection and wider society. Following the publication of the AI regulation White Paper last year, the Government started taking steps to establish a central AI risk function that brings together policymakers and AI experts with the objective of identifying, assessing and preparing for AI risks. To track identified risks, we have established an initial AI risk register, which is owned by the central AI risk function. The AI risk register lists individual risks associated with AI that could impact the UK, spanning national security, defence, the economy and society, and outlines their likelihood and impact. We have also committed to engaging on and publishing the AI risk register in spring this year.
I am processing what the Minister has just said. He said it complements the AI regulation framework, and then he went on to talk about the central risk function, the AI risk register and what the ICO is up to in terms of guidance, but I did not hear that the loosening of safeguards or rights under Clause 14 and Article 22 of the GDPR was heralded in the White Paper or the consultation. Where does that fit with the Government’s AI regulation strategy? There is a disjunct somewhere.
I reject the characterisation of Clause 14, or any part of the Bill, as loosening the safeguards. It focuses on outcomes and, by being less prescriptive and more adaptive, aims to heighten the levels of safety of AI, whether through privacy or anything else. That is the purpose.
On Secretary of State powers in relation to ADM, the reforms will enable the Government to further describe what is and is not to be taken as a significant effect on a data subject and what is and is not to be taken as meaningful human—
I may be tired or just not very smart, but I am not really sure that I understand how being less prescriptive and more adaptive can heighten safeguards. Can my noble friend the Minister elaborate a little more and perhaps give us an example of how that can be the case?
Certainly. Being prescriptive and applying one-size-fits-all measures for all processes covered by the Bill encourages organisations to follow a process, but focusing on outcomes encourages organisations to take better ownership of the outcomes and pursue the optimal privacy and safety mechanisms for those organisations. That is guidance that came out very strongly in the Data: A New Direction consultation. Indeed, in the debate on a later group we will discuss the use of senior responsible individuals rather than data protection officers, which is a good example of removing prescriptiveness to enhance adherence to the overall framework and enhance safety.
This seems like a very good moment to ask whether, if the variation is based on outcome and necessity, the Minister agrees that the higher bar of safety for children should be specifically required as an outcome.
I absolutely agree about the outcome of higher safety for children. We will come to debate whether the mechanism for determining or specifying that outcome is writing that down specifically, as suggested.
I am sure the Minister knew I was going to stand up to say that, if it is not part of the regulatory instruction, it will not be part of the outcome. The point of regulation is to determine a floor— never a ceiling—below which people cannot go. Therefore, if we wish to safeguard children, we must have that floor as part of the regulatory instruction.
Indeed. That may well be the case, but how that regulatory instruction is expressed can be done in multiple ways. Let me continue; otherwise, I will run out of time.
I am having a senior moment as well. Where are the outcomes written? What are we measuring this against? I like the idea; it sounds great—management terminology—but I presume that they are written down somewhere and that we could easily add children’s rights to the outcomes, as the noble Baroness suggests. Where are they listed?
I am sorry, but I just do not accept that intervention. This is one of the most important clauses in the whole Bill and we have to spend quite a bit of time teasing it out. The Minister has just electrified us all in what he said about the nature of this clause, what the Government are trying to achieve and how it fits within their strategy, which is even more concerning than previously. I am very sorry, but I really do not believe that this is the right point for the Whip to intervene. I have been in this House for 25 years and have never seen an intervention of that kind.
Let me make the broad point that there is no single list of outcomes for the whole Bill but, as we go through clause by clause, I hope the philosophy behind it, of being less prescriptive about process and more prescriptive about the results of the process that we desire, should emerge—not just on Clause 14 but as the overall philosophy underlying the Bill. Regulation-making powers can also be used to vary the existing safeguards, add additional safeguards and remove additional safeguards added at a later date.
On the point about having regard, it is important that the law is drafted in a way that allows it to adapt as technology advances. Including prescriptive requirements in the legislation reduces this flexibility and undermines the purpose of this clause and these powers to provide additional legal clarity when it is deemed necessary and appropriate in the light of the fast-moving advances in and adoption of technologies relevant to automated decision-making. I would like to reassure noble Lords that the powers can be used only to vary the existing safeguards, add additional safeguards and remove safeguards that have been added by regulations. They cannot remove any of the safeguards written into the legislation.
Amendments 53 to 55 and 69 to 71 concern the Secretary of State powers relating to the terms “significant decisions” and “meaningful human involvement”. These powers enable the Secretary of State to provide a description of decisions that do or do not have a significant effect on data subjects, and describe cases that can be taken to have, or not to have, meaningful human involvement. As technology adoption grows and new technologies emerge, these powers will enable the Government to provide legal clarity, if and when deemed necessary, to ensure that people are protected and have access to safeguards when they matter most. In respect of Amendment 59A, Clause 50 already provides for an overarching requirement for the Secretary of State to consult the ICO and other persons the Secretary of State considers appropriate before making regulations under the UK GDPR, including for the measures within Article 22.
Also, as has been observed—I take the point about the limitations of this, but I would like to make the point anyway—any changes to the regulations are subject to the affirmative procedure and so must be approved by both Houses. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, I would ask the noble Lord, Lord Clement-Jones, and my noble friend Lord Holmes—were he here—not to press their amendments.
Amendment 57 in the name of the noble Baroness, Lady Kidron, seeks to ensure that, when exercising regulation-making powers in relation to the safeguards in Article 22 of the UK GDPR, the Secretary of State should uphold the level of protection that children are entitled to in the Data Protection Act 2018. As I have said before, Clause 50 requires the Secretary of State to consult the ICO and other persons he or she considers appropriate. The digital landscape and its technologies evolve rapidly, presenting new challenges in safeguarding children. Regular consultations with the ICO and stakeholders ensure that regulations remain relevant and responsive to emerging risks associated with solely automated decision-making. The ICO has a robust position on the protection of children, as evidenced through its guidance and, in particular, the age-appropriate design code. As such, I ask the noble Baroness not to press her amendment.
Amendments 58, 72 and 73 seek to prevent the Secretary of State varying any of the safeguards mentioned in the reformed clauses. As I assured noble Lords earlier, the powers in this provision can be used only to vary the existing safeguards, add additional safeguards and remove additional safeguards added by regulation in future; there is not a power to remove any of the safeguards.
I apologise for breaking the Minister’s flow, especially as he had moved on a little, but I have a number of questions. Given the time, perhaps he can write to me to answer them specifically. They are all designed to show the difference between what children now have and what they will have under the Bill.
I have to put on the record that I do not accept what the Minister just said—that, without instruction, the ICO can use its old instruction to uphold the current safety for children—if the Government are taking the instruction out of the Bill and leaving it with the old regulator. I ask the Minister to tell the Committee whether it is envisaged that the ICO will have to rewrite the age-appropriate design code to marry it with the new Bill, rather than it being the reason why it is upheld. I do not think the Government can have it both ways where, on the one hand, the ICO is the keeper of the children, and, on the other, they take out things that allow the ICO to be the keeper of the children in this Bill.
I absolutely recognise the seriousness and importance of the points made by the noble Baroness. Of course, I would be happy to write to her and meet her, as I would be for any Member in the Committee, to give—I hope—more satisfactory answers on these important points.
As an initial clarification before I write, it is perhaps worth me saying that the ICO has a responsibility to keep guidance up to date but, because it is an independent regulator, it is not for the Government to prescribe this, only to allow it to do so for flexibility. As I say, I will write and set out that important point in more detail.
Amendment 59 relates to workplace rights. I reiterate that the existing data protection legislation and our proposed reforms—
Has the Minister moved on from our Amendments 58 and 59? He was talking about varying safeguards. I am not quite sure where he is.
It is entirely my fault; when I sit down and stand up again, I lose my place.
We would always take the views of the DPRRC very seriously on that. Clearly, the Bill is being designed without the idea in mind of losing or diminishing any of those safeguards; otherwise, it would have simply said in the Bill that we could do that. I understand the concern that, by varying them, there is a risk that they would be diminished. We will continue to find a way to take into account the concerns that the noble Baroness has set out, along with the DPRRC. In the interim, let me perhaps provide some reassurance that that is, of course, not the intention.
I feel under amazing pressure to get the names right, especially given the number of hours we spend together.
I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for tabling Amendments 74 to 78, 144 and 252 in this group. I also extend my thanks to noble Lords who have signed the amendments and spoken so eloquently in this debate.
Amendments 74 to 78 would place a legislative obligation on public authorities and all persons in the exercise of a public function to publish reports under the Algorithmic Transparency Recording Standard—ATRS—or to publish algorithmic impact assessments. These would provide information on algorithmic tools and algorithm-assisted decisions that process personal data in the exercise of a public function or those that have a direct or indirect public effect or directly interact with the general public. I remind noble Lords that the UK’s data protection laws will continue to apply throughout the processing of personal data.
The Government are already taking action to establish the necessary guard-rails for AI, including to promote transparency. In the AI regulation White Paper response, we announced that the use of the ATRS will now become a requirement for all government departments and the broader public sector. The Government are phasing this in as we speak and will check compliance accordingly, as DSIT has been in contact with every department on this issue.
In making this policy, the Government are taking an approach that provides increasing degrees of mandation of the ATRS, with appropriate exemptions, allowing them to monitor compliance and effectiveness. The announcement in the White Paper response has already led to more engagement from across government, and more records are under way. The existing process focuses on the importance of continuous improvement and development. Enshrining the standard into law prematurely, amid exponential technological change, could hinder its adaptability.
More broadly, our AI White Paper outlined a proportionate and adaptable framework for regulating AI. As part of that, we expect AI development and use to be fair, transparent and secure. We set out five key principles for UK regulators to interpret and apply within their remits. This approach reflects the fact that AI systems are not unregulated and need to be compliant with existing regulatory frameworks, including employment, human rights, health and safety and data protection law.
For instance, the UK’s data protection legislation imposes obligations on data controllers, including providers and users of AI systems, to process personal data fairly, lawfully and transparently. Our reforms in this Bill will ensure that, where solely automated decision-making is undertaken—that is, ADM without any meaningful human involvement that has significant effects on data subjects—data subjects will have a right to the relevant safeguards. These safeguards include being provided with information on the ADM that has been carried out and the right to contest those decisions and seek human review, enabling controllers to take suitable measures to correct those that have produced wrongful outcomes.
My Lords, I wonder whether the Minister can comment on this; he can write if he needs to. Is he saying that, in effect, the ATRS is giving the citizen greater rights than are ordinarily available under Article 22? Is that the actual outcome? If, for instance, every government department adopted ATRS, would that, in practice, give citizens a greater degree of what he might put as safeguards but, in this context, he is describing as rights?
I am very happy to write to the noble Lord, but I do not believe that the existence of an ATRS-generated report in and of itself confers more rights on anybody. Rather, it makes it easier for citizens to understand how their rights are being used, what rights they have, or what data about them is being used by the department concerned. The existence of data does not in and of itself confer new rights on anybody.
I understand that, but if he rewinds the reel he will find that he was talking about the citizen’s right of access, or something of that sort, at that point. Once you know what data is being used, the citizen has certain rights. I do not know whether that follows from the ATRS or he was just describing that at large.
As I said, I will write. I do not believe that follows axiomatically from the ATRS’s existence.
On Amendment 144, the Government are sympathetic to the idea that the ICO should respond to new and emerging technologies, including the use of children’s data in the development of AI. I assure noble Lords that this area will continue to be a focus of the ICO’s work and that it already has extensive powers to provide additional guidance or make updates to the age-appropriate design code, to ensure that it reflects new developments, and a responsibility to keep it up to date. The ICO has a public task under Article 57(1)(b) of the UK GDPR to
“promote public awareness and understanding of the risks, rules, safeguards and rights in relation to processing”.
It is already explicit that:
“Activities addressed specifically to children shall receive specific attention”.
That code already includes a chapter on profiling and provides guidance on fairness and transparency requirements around automated decision-making.
Taking the specific point made by the noble Baroness, Lady Kidron, on the contents of the ICO’s guidance, while I cannot speak to the ICO’s decisions about the drafting of its guidance, I am content to undertake to speak to it about this issue. I note that it is important to be careful to avoid a requirement for the ICO to duplicate work. The creation of an additional children’s code focused on AI could risk fragmenting approaches to children’s protections in the existing AADC—a point made by the noble Baroness and by my noble friend Lady Harding.
I have a question on this. If the Minister is arguing that this should be by way of amendment of the age-appropriate design code, would there not be an argument for giving that code some statutory effect?
On that point, I think that the Minister said—forgive me if I am misquoting him —risk, rules and rights, or some list to that effect. While the intention of what he said was that we have to be careful where children are using it, and the ICO has to make them aware of the risks, the purpose of a code—whether it is part of the AADC or stand-alone—is to put those responsibilities on the designers of service products and so on by default. It is upstream where we need the action, not downstream, where the children are.
Yes, I entirely agree with that, but I add that we need it upstream and downstream.
For the reasons I have set out, the Government do not believe that it would be appropriate to add these provisions to the Bill at this time without further detailed consultation with the ICO and the other organisations involved in regulating AI in the United Kingdom. Clause 33—
Can we agree that there will be some discussions with the ICO between now and Report? If those take place, I will not bring this point back on Report unnecessarily.
Yes, I am happy to commit to that. As I said, we look forward to talking with the noble Baroness and others who take an interest in this important area.
Clause 33 already includes a measure that would allow the Secretary of State to request the ICO to publish a code on any matter that she sees fit, so this is an issue that we could return to in the future, if the evidence supports it, but, as I said, we consider the amendments unnecessary at this time.
Finally, Amendment 252 would place a legislative obligation on the Secretary of State regularly to publish address data maintained by local authorities under open terms—that is, accessible by anyone for any purpose and for free. High-quality, authoritative address data for the UK is currently used by more than 50,000 public and private sector organisations, which demonstrates that current licensing arrangements are not prohibitive. This data is already accessible for a reasonable fee from local authorities and Royal Mail, with prices starting at 1.68p per address or £95 for national coverage.
Some 50,000 organisations access that information, but do the Government have any data on it? I am not asking for it now, but maybe the Minister could go away and have a look at this. We have heard that other countries have opened up this data. Are they seeing an increase in use? That is just a number; it does not tell us how many people are denied access to the data.
We have some numbers that I will come to, but I am very happy to share deeper analysis of that with all noble Lords.
There is also free access to this data for developers to innovate in the market. The Government also make this data available for free at the point of use to more than 6,000 public sector organisations, as well as postcode, unique identifier and location data available under open terms. The Government explored opening address data in 2016. At that time, it became clear that the Government would have to pay to make this data available openly or to recreate it. That was previously attempted, and the resulting dataset had, I am afraid, critical quality issues. As such, it was determined at that time that the changes would result in significant additional cost to taxpayers and represent low value for money, given the current widespread accessibility of the data. For the reasons I have set out, I hope that the noble Lords will withdraw their amendments.
My Lords, I thank the Minister for his response. There are a number of different elements to this group.
The one bright spot in the White Paper consultation is the ATRS. That was what the initial amendments in this group were designed to give a fair wind to. As the noble Lord, Lord Bassam, said, this is designed to assist in the adoption of the ATRS, and I am grateful for his support on that.
My Lords, I thank all noble Lords who have contributed to this very wide-ranging debate. Our amendments cover a lot of common ground, and we are in broad agreement on most issues, so I hope noble Lords will bear with me if I primarily focus on the amendments that I have tabled, although I will come back to other points.
We have given notice of our intention to oppose Clause 16 standing part of the Bill. This is similar to Amendment 80, tabled by the noble Lord, Lord Clement-Jones, which probes why the Government have found it necessary to remove the requirement that companies outside the UK should appoint a representative within the UK. The current GDPR rules apply to all those active in the UK market, regardless of whether their organisation is based or located in the UK. The intention is that the representative will ensure UK compliance and act as a primary source of contact for data subjects. Without this requirement, data subjects will be forced to deal with overseas data handlers, with all the cultural and language barriers that might ensue. There is no doubt that this will limit their ability to rely on UK data standards.
In addition, as my colleagues in the Commons identified, the removal of the provisions in Clause 16 was not included in the Government’s consultation, so stakeholders have not had the chance to register some of the many practical concerns that they feel will arise from this change. There is also little evidence that compliance with Article 27 is an unnecessary barrier to responsible data use by reputable overseas companies. Again, this was a point made by the noble Lord, Lord Clement-Jones. In fact, the international trend is for more countries to add a representative obligation to their data protection laws, so we are becoming outriders on the global stage.
Not only is this an unnecessary change but, compared to other countries, it will send a signal that data protection rights are being eroded in the UK. Of course, this raises the spectre of the EU revisiting whether our UK adequacy status should be retained. It also has implications for the different rules that might apply north and south of the border in Ireland, so, again, if we are moving away from the standard rules applied by other countries, this has wider implications that we need to consider.
For many reasons, I challenge the Government to explain why this change was felt to be necessary. The noble Lord, Lord Clement-Jones, talked about whether the cost was really a factor. It did not seem that there were huge costs, compared to the benefits of maintaining the current system, and I would like to know in more detail why the Government are doing this.
Our Amendments 81 and 90 seek to ensure that there is a definition of “high-risk processing” in the Bill. The changes in Clauses 17 and 20 have the effect of watering down data controllers’ responsibilities, from carrying out data protection impact assessments to assessing high-risk processing on the basis of whether it is necessary and what risks are posed. But nowhere does the Bill say what constitutes high-risk processing—it is left to individual organisations to make that judgment—and nowhere does it explain what “necessary” means in this context. Is it also expected to be proportionate, as in the existing standards? This lack of clarity has caused some consternation among stakeholders.
The Equality and Human Rights Commission argues that the proposed wording means that
“data controllers are unlikely to go beyond minimum requirements”,
so the wording needs to be more explicit. It also recommends that
“the ICO be required to provide detailed guidance on how ‘the rights and freedoms of individuals’ are to be considered in an Assessment of High Risk Processing”.
More crucially, the ICO has written to Peers, saying that the Bill should contain a list of
“activities that government and Parliament view as high-risk processing, similar to the current list set out at Article 35(3) of the UK GDPR”.
This is what our Amendments 81 and 90 aim to achieve. I hope the Minister can agree to take these points on board and come back with amendments to achieve this.
The ICO also makes the case for future-proofing the way in which high-risk processing is regulated by making a provision in the Bill for the ICO to further designate high-risk processing activities with parliamentary approval. This would go further than the current drafting of Clause 20, which contains powers for the ICO to give examples of high-risk profiling, but only for guidance. Again, I hope that the Minister can agree to take these points on board and come back with suitable amendments.
Our Amendments 99, 100 and 102 specify the need for wider factors in the proposed risk assessment list to ensure that it underpins our equality laws. Again, this was an issue about which stakeholders have raised concerns. The TUC and the Institute for the Future of Work make the point that data protection impact assessments are a crucial basis for consultation with workers and trade unions about the use of technology at work, and this is even more important as the complexities of AI come on stream. The Public Law Project argues that, without rigorous risk and impact analysis, disproportionate and discriminatory processes could be carried out before the harm comes to light.
The Equality and Human Rights Commission argues that data protection impact assessments
“provide a key mechanism for ensuring equality impacts are assessed when public and private sector organisations embed AI systems in their operations”.
It specifically recommends that express references in Article 35(7) of GDPR to “legitimate interests” and
“the rights and freedoms of data subjects”,
as well as the consultation obligations in Article 35(2), should be retained. I hope that the Minister can agree to take these recommendations on board and come back with suitable amendments to ensure that our equalities legislation is protected.
Our Amendments 106 and 108 focus on the particular responsibilities of data controllers to handle health data with specific obligations. This is an issue that we know, from previous debates, is a major cause for concern among the general public, who would be alarmed if they thought that the protections were being weakened.
The BMA has raised concerns that Clauses 20 and 21 will water down our high standards of data governance, which are necessary when organisations are handling health data. As it says,
“Removing the requirement to conduct a thorough assessment of risks posed to health data is likely to lead to a less diligent approach to data protection for individuals”.
It also argues that removing the requirement for organisations to consult the ICO on high-risk processing is,
“a backward step from good governance … when organisations are processing large quantities of sensitive health data”.
Our amendments aim to address these concerns by specifying that, with regard to specific cases, such as the handling of health data, prior consultation with the ICO should remain mandatory. I hope that the Minister will see the sense in these amendments and recognise that further action is needed in this Bill to maintain public trust in how health data is managed for individual care and systemwide scientific development.
I realise that we have covered a vast range of issues, but I want to touch briefly on those raised by the noble Baroness, Lady Kidron. She is right that, in particular, applications of risk assessments by public bodies should be maintained, and we agree with her that Article 35’s privacy-by-design requirements should be retained. She once again highlighted the downgrading of children’s rights in this Bill, whether by accident or intent, and we look forward to seeing the exchange of letters with the Minister on this. I hope that we will all be copied in and that the Minister will take on board the widespread view that we should have more engagement on this before Report, because there are so many outstanding issues to be resolved. I look forward to the Minister’s response.
I thank the noble Baronesses, Lady Kidron and Lady Jones, and the noble Lord, Lord Clement-Jones, for their amendments, and I look forward to receiving the letter from the noble Baroness, Lady Kidron, which I will respond to as quickly as I can. As everybody observed, this is a huge group, and it has been very difficult for everybody to do justice to all the points. I shall do my best, but these are points that go to the heart of the changes we are making. I am very happy to continue engaging on that basis, because we need plenty of time to review them—but, that said, off we go.
The changes the Government are making to the accountability obligations are intended to make the law clearer and less prescriptive. They will enable organisations to focus on areas that pose high risks to people, resulting, the Government believe, in improved outcomes. The new provisions on assessments of high-risk processing are less prescriptive about the precise circumstances in which a risk assessment would be required, as we think organisations are best placed to judge whether a particular activity poses a high risk to individuals in the context of the situation.
However, the Government are still committed to high standards of data protection, and there are many similarities between our new risk assessment measures and the previous provisions. When an organisation is carrying out processing activities that are likely to pose a high risk to individuals, it will still be expected to document that processing, assess risks and identify mitigations. As before, no such document would be required where organisations are carrying out low-risk processing activities.
One of the main aims of the Bill is to remove some of the UK GDPR’s unnecessary compliance burdens. That is why organisations will be required to designate senior responsible individuals, keep records of processing and carry out the risk assessments above only when their activities pose high risks to individuals.
The noble Viscount is very interestingly unpacking a risk-based approach to data protection under the Bill. Why are the Government not taking a risk-based approach to their AI regulation? After all, the AI Act approaches it in exactly that way.
That is a very interesting question, but I am not sure that there is a read-across between the AI Act and our approach here. The fundamental starting point was that, although the provisions of the original GDPR are extremely important, the burdens of compliance were not proportionate to the results. The overall foundation of the DPDI is, while at least maintaining existing levels of protection, to reduce the burdens of demonstrating or complying with that regulation. That is the thrust of it—that is what we are trying to achieve—but noble Lords will have different views about how successful we are being at either of those. It is an attempt to make it easier to be safe and to comply with the regulations of the DPDI and the other Acts that govern data protection. That is where we are coming from and the thrust of what we are trying to achieve.
I note that, as we have previously discussed, children need particular protection when organisations are collecting and processing their personal data.
I did not interrupt before because I thought that the Minister would say more about the difference between high-risk and low-risk processing, but he is going on to talk about children. One of my points was about the request from the Information Commissioner—it is very unusual for him to intervene. He said that a list of high-risk processing activities should be set out in the Bill. I do not know whether the Minister was going to address that important point.
I will briefly address it now. Based on that letter, the Government’s view is to avoid prescription, and I believe that the ICO’s view—I cannot speak for it—is generally the same, except for a few examples where prescription needs to be specified in the Bill. I will continue to engage with the ICO on where exactly to draw that line.
My Lords, I can see that there is a difference of opinion, but it is unusual for a regulator to go into print with it. Not only that, but he has set it all out in an annexe. What discussion is taking place directly between the Minister and his team and the ICO? There seems to be quite a gulf between them. This is number 1 among his “areas of ongoing concern”.
I do not know whether it is usual or unusual for the regulator to engage in this way, but the Bill team engages with the Information Commissioner frequently and regularly, and, needless to say, it will continue to do so on this and other matters.
Children need particular protection when organisations are collecting and processing their personal data, because they may be less aware of the risks involved. If organisations process children’s personal data, they should think about the need to protect them from the outset and design their systems and processes with this in mind.
Before I turn to the substance of what the Bill does with the provisions on high-risk processing, I will deal with the first amendment in this group: Amendment 79. It would require data processors to consider data protection-by-design requirements in the same way that data controllers do, because there is a concern that controllers may not always be able to foresee what processors do with people’s data for services such as AI and cloud computing.
However, under the current legislation, it should not be for the processor to determine the nature or purposes of the processing activity, as it will enter a binding controller-processor agreement or contract to deliver a specific task. Processors also have specific duties under the UK GDPR to keep personal data safe and secure, which should mean that this amendment is not necessary.
I turn to the Clause 16 stand part notice, which seeks to remove Clause 16 from the Bill and reinstate Article 27, and Amendment 80, which seeks to do the same but just in respect of overseas data controllers, not processors. I assure the noble Lord, Lord Clement-Jones, that, even without the Article 27 representative requirement, controllers and processors will still have to maintain contact and co-operation with UK data subjects and the ICO to comply with the UK GDPR provisions. These include Articles 12 to 14, which, taken together, require controllers to provide their contact details in a concise, transparent, intelligible and easily accessible form, using clear and plain language, particularly for any information addressed specifically to a child.
By offering firms a choice on whether to appoint a representative in the UK to help them with UK GDPR compliance and no longer mandating organisations to appoint a representative, we are allowing organisations to decide for themselves the best way to comply with the existing requirements for effective communication and co-operation. Removing the representative requirement will also reduce unnecessary burdens on non-UK controllers and processors while maintaining data subjects’ safeguards and rights. Any costs associated with appointing a representative are a burden on and a barrier to trade. Although the variety of packages made available by representative provider organisations differ, our assessments show that the cost of appointing representatives increases with the size of a firm. Furthermore, there are several jurisdictions that do not have a mandatory or equivalent representative requirement in their data protection law, including other countries in receipt of EU data adequacy decisions.
Nevertheless, does the Minister accept that quite a lot of countries have now begun the process of requiring representatives to be appointed? How does he account for that? Does he accept that what the Government are doing is placing the interests of business over those of data subjects in this context?
No, I do not accept that at all. I would suggest that we are saying to businesses, “You must provide access to the ICO and data subjects in a way that is usable by all parties, but you must do so in the manner that makes the most sense to you”. That is a good example of going after outcomes but not insisting on any particular process or methodology in a one-size-fits-all way.
The Minister mentioned the freedom to choose the best solution. Would it be possible for someone to be told that their contact was someone who spoke a different language to them? Do they have to be able to communicate properly with the data subjects in this country?
Yes—if the person they were supposed to communicate with did not speak English or was not available during reasonable hours, that would be in violation of the requirement.
I apologise if we briefly revisit some of our earlier discussion here, but Amendment 81 would reintroduce a list of high-risk processing activities drawn from Article 35 of the UK GDPR, with a view to helping data controllers comply with the new requirements around designating a senior responsible individual.
The Government have consulted closely with the ICO throughout the development of all the provisions in the Bill, and we welcome its feedback as it upholds data subjects’ rights. We recognise and respect that the ICO’s view on this issue is different to the Government’s, but the Government feel that adding a prescriptive list to the legislation would not be appropriate for the reasons we have discussed. However, as I say, we will continue to engage with it over the course of the passage of the Bill.
Some of the language in Article 35 of the UK GDPR is unclear and confusing, which is partly why we removed it in the first place. We believe organisations should have the ability to make a judgment of risk based on the specific nature, scale and context of their own processing activities. We do not need to provide prescriptive examples of high-risk processing on the face of legislation because any list could quickly become out of date. Instead, to help data controllers, Clause 20 requires the ICO to produce a document with examples of what the commissioner considers to be high-risk processing activities.
I turn to Clause 17 and Amendment 82. The changes we are making in the Bill will reduce prescription by removing the requirement to appoint a data protection officer in certain circumstances. Instead, public bodies and other organisations carrying out high-risk processing activities will have to designate a senior responsible individual to ensure that data protection risks are managed effectively within their organisations. That person will have flexibility about how they manage data protection risks. They might decide to delegate tasks to independent data protection experts or upskill existing staff members, but they will not be forced to appoint data protection officers if suitable alternatives are available.
The primary rationale for moving to a senior responsible individual model is to embed data protection at the heart of an organisation by ensuring that someone in senior management takes responsibility and accountability for it if the organisation is a public body or is carrying out high-risk processing. If organisations have already appointed data protection officers and want to keep an independent expert to advise them, they will be free to do so, providing that they also designate a senior manager to take overall accountability and provide sufficient support, including resources.
Amendment 83, tabled by the noble Baroness, Lady Kidron, would require the senior responsible individual to specifically consider the risks to children when advising the controller on its responsibilities. As drafted, Clause 17 of the Bill requires the senior responsible individual to perform a number of tasks or, if they cannot do so themselves, to make sure that they are performed by another person. They include monitoring the controller’s compliance with the legislation, advising the controller of its obligations and organising relevant training for employees who carry out the processing of personal data. Where the organisation is processing children’s data, all these requirements will be relevant. The senior responsible individual will need to make sure that any guidance and training reflects the type of data being processed and any specific obligations the controller has in respect of that data. I hope that this goes some way to convincing the noble Baroness not to press her amendment.
The Minister has not really explained the reason for the switch from the DPO to the new system. Is it another one of his “We don’t want a one-size-fits-all approach” arguments? What is the underlying rationale for it? Looking at compliance costs, which the Government seem to be very keen on, we will potentially have a whole new cadre of people who will need to be trained in compliance requirements.
The data protection officer—I speak as a recovering data protection officer—is tasked with certain specific outcomes but does not necessarily have to be a senior person within the organisation. Indeed, in many cases, they can be an external adviser to the organisation. On the other hand, the senior responsible individual is a senior or board-level representative within the organisation and can take overall accountability for data privacy and data protection for that organisation. Once that accountable person is appointed, he or she can of course appoint a DPO or equivalent role or separate the role among other people as they see fit. That gives everybody the flexibility to meet the needs of privacy as they see fit, but not necessarily in a one-size-fits-all way. That is the philosophical approach.
Does the Minister accept that the SRI will have to cope with having at least a glimmering of an understanding of what will be a rather large Act?
Yes, the SRI will absolutely have to understand all the organisation’s obligations under this Act and indeed other Acts. As with any senior person in any organisation responsible for compliance, they will need to understand the laws that they are complying with.
Amendment 84, tabled by the noble Lord, Lord Clement-Jones, is about the advice given to senior responsible individuals by the ICO. We believe that the commissioner should have full discretion to enforce data protection in an independent, flexible, risk-based and proportionate manner. The amendment would tie the hands of the regulator and force them to give binding advice and proactive assurance without full knowledge of the facts, undermining their regulatory enforcement role.
The Minister has reached his 20 minutes. We nudged him at 15 minutes.
My Lords, just for clarification, because a number of questions were raised, if the Committee feels that it would like to hear more from the Minister, it can. It is for the mood of the Committee to decide.
As long as that applies to us on occasion as well.
I apologise for going over. I will try to be as quick as possible.
I turn now to the amendments on the new provisions on assessments of high-risk processing in Clause 20. Amendments 87, 88, 89, 91, 92, 93, 94, 95, 97, 98 and 101 seek to reinstate requirements from Article 35 of the UK GDPR on data protection impact assessments and, in some areas, make them even more onerous for public authorities. Amendment 90 seeks to reintroduce a list of high-risk processing activities drawn from Article 35, with a view to helping data controllers comply with the new requirements on carrying out assessments of high-risk processing.
Amendment 96, tabled by the noble Baroness, Lady Kidron, seeks to amend Clause 20, so that, where an internet service is likely to be accessed by children, the processing is automatically classed as high risk and the controller must do a children’s data protection impact assessment. Of course, I fully understand why the noble Baroness would like those measures to apply automatically to organisations processing children’s data, and particularly to internet services likely to be accessed by children. It is highly likely that many of the internet services that she is most concerned about will be undertaking high-risk activities, and they would therefore need to undertake a risk assessment.
Under the current provisions in Clause 20, organisations will still have to undertake risk assessments where their processing activities are likely to pose high risks to individuals, but they should have the ability to assess the level of risk based on the specific nature, scale and context of their own processing activities. Data controllers do not need to be directed by government or Parliament about every processing activity that will likely require a risk assessment, but the amendments would reintroduce a level of prescriptiveness that we were seeking to remove.
Clause 20 requires the ICO to publish a list of examples of the types of processing activities that it considers would pose high risks for the purposes of these provisions, which will help controllers to determine whether a risk assessment is needed. This will provide organisations with more contemporary and practical help than a fixed list of examples in primary legislation could. The ICO will be required to publish a document with a list of examples that it considers to be high-risk processing activities, and we fully expect the vulnerability and age of data subjects to be a feature of that. The commissioner’s current guidance on data protection impact assessments already describes the use of the personal data of children or other vulnerable individuals for marketing purposes, profiling or offering internet services directly to children as examples of high-risk processing, although the Government cannot of course tell the ICO what to include in its new guidance.
Similarly, in relation to Amendments 99, 100 and 102 from the noble Baroness, Lady Jones, it should not be necessary for this clause to specifically require organisations to consider risks associated with automated decision-making or obligations under equalities legislation. That is because the existing clause already requires controllers to consider any risks to individuals and to describe
“how the controller proposes to mitigate those risks”.
I am being asked to wrap up and so, in the interests of time, I shall write with my remaining comments. I have no doubt that noble Lords are sick of the sound of my voice by now.
My Lords, I hope that no noble Lord expects me to pull all that together. However, I will mention a couple of things.
With this group, the Minister has finally set out all the reasons why everything will be different and less. Those responsible for writing the Minister’s speeches should be more transparent about the Government’s intention, because “organisations are best placed to determine what is high-risk”—not the ICO, not Parliament, not existing data law. Organisations also act in their own interests. They are “best placed to decide on their representation”, whether it is here or there and whether it speaks English or not, and they “get to decide whether they have a DPO or a senior responsible individual”. Those are three quotes from the Minister’s speech. If organisations set both the bar for data protection and the definition of data protection, I do believe that this is a weakening of the data protection regime. He also said that organisations are responsible for the quality of their risk assessment. That makes four instances in this group alone.
At the beginning, the noble Baroness, Lady Harding, talked about the trust of consumers and citizens. I do not think that this engenders trust. The architecture is so keen to get rid of ways of accessing rights that some organisations may end up having to have both a DPO and a DPIA—a doubling rather than a reduction of the burden. Very early on—it feels a long time ago—a number of noble Lords talked about the granular detail. I tried in my own contribution to show how very different it is in detail. So I ask the Minister to reflect on the assertion that you can take out the detail and have the same outcome. All the burden being removed is on one side of the equation, just as we enter a world in which AI, which is built on people’s data, is coming in the other direction.
I will of course withdraw my amendment, but I believe that Clauses 20, 18 and the other clauses we just discussed are deregulation measures. That should be made clear from the Dispatch Box, and that is a choice that the House will have to make.
Before I sit down, I do want to recognise one thing, which is that the Minister said that he would work alongside us between now and Report; I thank him for that, and I accept that. I also noted that he said that it was a responsibility to take care of children by default. I agree with him; I would like to see that in the Bill. I beg leave to withdraw my amendment.
As the noble Lord, Lord Clement-Jones, explained, his intention to oppose the question that Clause 19 stand part seeks to retain the status quo. As I read Section 62 of the Data Protection Act 2018, it obliges competent authorities to keep logs of their processing activities, whether collection, alteration, consultation, disclosure, combination or erasure of personal data. The primary purpose is self-monitoring, largely linked to disciplinary proceedings, as the noble Lord said, where an officer has become a suspect by virtue of inappropriately accessing PNC-held data.
Clause 19 removes the requirement for a competent authority to record a justification in the logs of consultation and disclosure of personal data. The Explanatory Notes to the Bill explain this change as follows:
“It is … technologically challenging for systems to automatically record the justification without manual input”.
That is not a sufficiently strong reason for removing the requirement, not least because the remaining requirements of Section 62 of the Data Protection Act 2018 relating to the logs of consultation and disclosure activity will be retained, including the need to record the date and time and the identity of the person accessing the log. Presumably those details can still be input manually, so why remove the one piece of data that might, in an investigation into abuse or misuse of the system, be useful as evidence or for self-incrimination? I do not understand the logic behind that at all.
I rather think the noble Lord, Lord Clement-Jones, has an important point. He has linked it to those who have been unfortunate enough to suffer from AIDS, and I am sure that there are other victims on whose behalf cases could be brought forward. I am not convinced that the clause should stand part, and we support the noble Lord in seeking its deletion.
This is a mercifully short group on this occasion. I thank the noble Lord, Lord Clement-Jones, for the amendment, which seeks to remove Clause 19 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record when personal data has been accessed and why. Clause 19 does not remove the need for police to justify their processing; it simply removes the ineffective administrative requirement to record that justification in a log.
The justification entry was intended to help to monitor and detect unlawful access. However, the reality is that anyone accessing data unlawfully is very unlikely to record an honest justification, making this in practice an unreliable means of monitoring misconduct or unlawful processing. Records of when data was accessed and by whom can be automatically captured and will remain, thereby continuing to ensure accountability.
In addition, the National Police Chiefs’ Council’s view is that this change will not hamper any investigations to identify the unlawful processing of data. That is because it is unlikely that an individual accessing data unlawfully would enter an honest justification, so capturing this information is unlikely to be useful in any investigation into misconduct. The requirements to record the time, date and, as far as possible, the identity of the person accessing the data will remain, as will the obligation for there to be a lawful reason for the access, ensuring that accountability and protection for data subjects are maintained.
Police officers inform us that the current requirement places an unnecessary burden on them as they have to update the log manually. The Government estimate that the clause could save approximately 1.5 million policing hours, representing a saving in the region of £46.5 million per year.
I understand that the amendment relates to representations made by the National AIDS Trust concerning the level of protection for people’s HIV status. As I believe I said on Monday, the Government agree that the protection of people’s HIV status is vital. We have met the National AIDS Trust to discuss the best solutions to the problems it has raised. For these reasons, I hope the noble Lord will not oppose Clause 19 standing part.
I thank the Minister for his response, but he has left us tantalised about the outcome of his meeting. What is the solution that he has suggested? We are none the wiser as a result of his response.
This pudding has been well over-egged by the National Police Chiefs’ Council. Already, only certain senior officers and the data protection leads in police forces have access to this functionality. There will continue to be a legal requirement to record the time and date of access. They are required to follow a College of Policing code of practice. Is the Minister really saying that recording a justification for accessing personal data is such an onerous requirement that £46.5 million in police time will be saved as a result of this? Over what period? That sounds completely disproportionate.
The fact is that the recording of the justification, even where it is false and cannot be relied upon as evidence of the truth, is rather useful, because it is itself evidence of police misconduct in relation to inappropriately accessing personal data: officers are actually saying, “We did it for this purpose”, when it clearly was not for that purpose. I am not at all surprised that the National AIDS Trust is worried about this. The College of Policing code of practice does not mention logging requirements in detail. It references them just once, in relation to automated systems that process data.
I am extremely grateful to the noble Lord, Lord Bassam, for what he had to say. It seems to me that we do not have any confidence on this side of the House that removing this requirement provides enough security that officers will be held to account if they share an individual’s special category data inappropriately. I do not think the Minister has really answered the concerns, but I beg leave to withdraw my objection to the clause standing part.
My Lords, I, too, will be relatively brief. I thank the noble Baroness, Lady Kidron, for her amendments, to which I was very pleased to add my name. She raised an important point about the practice of web scrapers, who take data from a variety of sources to construct large language models without the knowledge or permission of web owners and data subjects. This is a huge issue that should have been a much more central focus of the Bill. Like the noble Baroness, I am sorry that the Government did not see fit to use the Bill to bring in some controls on this increasingly prevalent practice, because that would have been a more constructive use of our time than debating the many unnecessary changes that we have been debating so far.
As the noble Baroness said, large language models are built by capturing text, data and images from innumerable sources without the permission of the original creator of the material. As she also said, this makes a mockery of our existing data rights. It raises issues around copyright and intellectual property, and around personal information that is provided for one purpose and commandeered by web scrapers for another. That process often happens in the shadows, with the owner of the information finding out only much later that their content has been repurposed.
What is worse is that the application of AI means that material provided in good faith can be distorted or corrupted by the bots scraping the internet. The current generation of LLMs is notorious for hallucinations, in which good-quality research or journalistic copy is misrepresented or misquoted in its new incarnation. There are also numerous examples of bias creeping into LLM output, which includes personal data. As the noble Baroness rightly said, the casual scraping of children’s images and data is undermining the very essence of our existing data protection legislation.
It is welcome that the Information Commissioner has intervened on this. He has argued that LLMs should be compliant with the Data Protection Act and should evidence how they are complying with their legal obligations, including by enabling individuals to exercise their information rights. Currently, we are a long way from that being a reality in practice. This is about enforcement as much as about giving guidance.
I am pleased that the noble Baroness tabled these amendments. They raise important issues about individuals giving prior permission for their data to be used unless there is an easily accessible opt-out mechanism. I would like to know what the Minister thinks about all this. Does he think that the current legislation is sufficient to regulate the rise of LLMs? If it is not, what are the Government doing to address the increasingly widespread concerns about the legitimacy of web scraping? Have the Government considered using the Bill to introduce additional powers to protect against the misuse of personal and creative output?
In the meantime, does the Minister accept the amendments in the name of the noble Baroness, Lady Kidron? As we have said, they are only a small part of a much bigger problem, but they are a helpful initiative to build in some basic protections in the use of personal data. This is a real challenge to the Government to step up to the mark and be seen to address these important issues. I hope the Minister will say that he is happy to work with the noble Baroness and others to take these issues forward. We would be doing a good service to data citizens around the country if we did so.
I thank the noble Baroness, Lady Kidron, for tabling these amendments. I absolutely recognise their intent. I understand that they are motivated by a concern about invisible types of processing or repurposing of data when it may not be clear to people how their data is being used or how they can exercise their rights in respect of the data.
On the specific points raised by noble Lords about intellectual property rather than personal data, I note that, in their response to the AI White Paper consultation, the Government committed to providing a public update soon on their approach to AI and intellectual property, noting the importance of greater transparency in the use of copyrighted material to train models, as well as the labelling and attribution of outputs.
Amendment 103 would amend the risk-assessment provisions in Clause 20 so that any assessment of high-risk processing would always include an assessment of how the data controller would comply with the purpose limitation principle and how any new processing activity would be designed so that people could exercise their rights in respect of the data at the time it was collected and at any subsequent occasion.
I respectfully submit that this amendment is not necessary. The existing provisions in Clause 20, on risk assessments, already require controllers to assess the potential risks their processing activities pose to individuals and to describe how those risks would be mitigated. This would clearly include any risk that the proposed processing activities would not comply with the data protection principles—for example, because they lacked transparency—and would make it impossible for people to exercise their rights.
Similarly, any assessment of risk would need to take account of any risks related to difficulties in complying with the purpose limitation principle—for example, if the organisation had no way of limiting who the data would be shared with as a result of the proposed processing activity.
According to draft ICO guidance on generative AI, the legitimate interests lawful ground under Article 6(1)(f) of the UK GDPR can be a valid lawful ground for training generative AI models on web-scraped data, but only when the model’s developer can pass the three-part test—that is, they identify a legitimate interest, demonstrate that the processing is necessary for that purpose and demonstrate that the individual’s interests do not override the interest being pursued by the controller.
Controllers must consider the balancing test particularly carefully when they do not or cannot exercise meaningful control over the use of the model. The draft guidance further notes that it would be very difficult for data controllers to carry out their processing activities in reliance on the legitimate interests lawful ground if those considerations were not taken into account.
My Lords, UK law enforcement authorities processing personal data for law enforcement purposes currently use internationally based companies for data processing services, including cloud storage. The use of international processors is critical for modern organisations and law enforcement is no exception. The use of these international processors enhances law enforcement capabilities and underpins day-to-day functions.
Transfers from a UK law enforcement authority to an international processor are currently permissible under the Data Protection Act 2018. However, there is currently no bespoke mechanism for these transfers in Part 3, which has led to confusion and ambiguity as to how law enforcement authorities should approach the use of such processors. The aim of these amendments is to provide legal certainty to law enforcement authorities in the UK, as well as transparency to the public, so that they can use internationally based processors with confidence.
I have therefore tabled Amendments 110, 117 to 120, 122 to 129 and 131 to provide a clear, bespoke mechanism in Part 3 of the Data Protection Act 2018 for UK law enforcement authorities to use when transferring data to their contracted processors based outside the UK. This will bring Part 3 into line with the UK GDPR while clarifying the current law, and give UK law enforcement authorities greater confidence when making such transfers to their contracted processors for law enforcement purposes.
We have amended Section 73—the general principles for transfer—to include a specific reference to processors, ensuring that international processors can be a recipient of data transfers. In doing so, we have ensured that the safeguards within Chapter 5 that UK law enforcement authorities routinely apply to transfers of data to their international operational equivalents are equally applicable to transfers to processors. We are keeping open all the transfer mechanisms so that data can be transferred on the basis of an applicable adequacy regulation, the appropriate safeguards or potentially the special circumstances.
We have further amended Section 75—the appropriate safeguards provision—to include a power for the ICO to create, specifically for Part 3, an international data transfer agreement, or IDTA, to complement the IDTA which it has already produced to facilitate transfers using Article 46(2)(d) of the UK GDPR.
In respect of transfers to processors, we have disapplied the duty to inform the Information Commissioner about international transfers made subject to appropriate safeguards. Such a requirement would be out of line with equivalent provisions in the UK GDPR. There is no strong rationale for the provision, given that processors are limited in what they can do with data because of the nature of their contracts, and it would be unlikely to contribute to the effective functioning of the ICO.
Likewise, we have also disapplied the duty to document such transfers and to provide the documentation to the commissioner on request. This is because extending these provisions would duplicate requirements that already exist elsewhere in legislation, including in Section 61, which has extensive recording requirements that enable full accountability to the ICO.
We have also disapplied the majority of Section 78. While it provides a useful function in the context of UK law enforcement authorities transferring to their international operational equivalents, in the law enforcement to international processor context it is not appropriate, because processors cannot decide to transfer data onwards of their own volition. They can do so only under instruction from the UK law enforcement authority controller.
Instead, we have retained the general prohibition on any further transfers to processors based in a separate third country by requiring UK law enforcement authority controllers to make it a condition of a transfer to its processor that data is only to be further transferred in line with the terms of the contract with or authorisation given by the controller, and where the further transfer is permitted under Section 73. We have also taken the opportunity to tidy up Section 77 which governs transfers to non-relevant authorities, relevant international organisations or international processors.
In respect of Amendment 121, tabled by the noble Lord, Lord Clement-Jones, on consultation with the Information Commissioner, I reassure the noble Lord that there is a memorandum of understanding between the Home Office and the Information Commissioner regarding international transfers approved by regulations, which sets out the role and responsibilities of the ICO. As part of this, the Home Office consults the Information Commissioner at various stages in the process. The commissioner, in turn, provides independent assurance and advice on the process followed and on the factors taken into consideration.
I understand that this amendment also relates to representations made by the National AIDS Trust. Perhaps the simplest thing is merely to reference my earlier remarks and my ongoing commitment to engage with the National AIDS Trust. I beg to move that the government amendments which lead this group stand part of the Bill.
My Lords, very briefly, I thank the Minister for unpacking his amendments with some care, and for giving me the answer to my amendment before I spoke to it—that saves time.
Obviously, we all understand the importance of transfers of personal data between law enforcement authorities, but perhaps the crux of this, and the one question in our minds, is: what is the process—perhaps the Minister could remind us—for making sure that the country we are sending data to is data adequate? Amendment 121 was tabled as a way of probing that. It would be extremely useful if the Minister could answer it. This should apply to transfers between law enforcement authorities just as much as it does to other, more general transfers under Schedule 5. If the Minister can give me the answer, that would be useful but, if he does not have it to hand, I am very happy to suspend my curiosity until after Easter.
I thank the noble Lord, Lord Clement-Jones, for his amendment and his response, and I thank the noble Lord, Lord Bassam. The mechanism for monitoring international transfers was intended to be the subject for the next group in any case, and I would have hoped to give a full answer. I know we are all deeply disappointed that it looks as if we may not get to that group but, if the noble Lord is not willing to wait until we have that debate, I am very happy to write.
(8 months, 3 weeks ago)
Grand Committee
My Lords, I rise to speak to Amendments 11, 12, 13, 14, 15, 16, 17 and 18 and to whether Clauses 5 and 7 should stand part of the Bill. In doing so, I thank the noble Lord, Lord Clement-Jones, and the noble Baronesses, Lady Jones and Lady Kidron, for their amendments. The amendments in this group, as we have heard, relate to Clauses 5 and 7, which make some important changes to Article 6 of the UK GDPR on the lawfulness of processing.
The first amendment in the group, Amendment 11, would create a new lawful ground under Article 6(1) of the UK GDPR to enable the use of personal data published by public bodies with a person’s consent and to enable processing by public bodies for the benefit of the wider public. The Government do not believe it necessary to create additional lawful grounds for processing in these circumstances. The collection and publication of information on public databases, such as the list of company directors published by Companies House, should already be permitted by existing lawful grounds under either Article 6(1)(c), in the case of a legal requirement to publish information, or Article 6(1)(e), in the case of a power.
Personal data published by public bodies can already be processed by other non-public body controllers where their legitimate interests outweigh the rights and interests of data subjects. However, they must comply with the requirements in relation to that personal data, including the requirements to process personal data fairly and transparently. I am grateful to the noble Lord, Lord Clement-Jones, for setting out where he thinks the gaps are, but I hope he will accept my reassurance that such processing should already be possible under the existing legislation and will agree to withdraw the amendment.
Clause 5’s main objective is to introduce a new lawful ground under Article 6(1) of the UK GDPR, known as “recognised legitimate interests”. It also introduces a new annexe to the UK GDPR, in Schedule 1 to the Bill, that sets out an exhaustive list of processing activities that may be undertaken by data controllers under this new lawful ground. If an activity appears on the list, processing may take place without a person’s consent and without balancing the controller’s interests against the rights and interests of the individual: the so-called legitimate interests balancing test.
The activities in the annexe are all of a public interest nature: for example, processing where necessary to prevent crime, safeguard national security, protect children, respond to emergencies or promote democratic engagement. They also include situations where a public body requests a non-public body to share personal data with it to help deliver a public task sanctioned by law.
The clause was introduced as a result of stakeholders’ concerns raised in response to the public consultation Data: A New Direction in 2021. Some informed us that they were worried about the legal consequences of getting the balancing test in Article 6(1)(f) wrong. Others said that undertaking the balancing test can lead to delays in some important processing activities taking place.
As noble Lords will be aware, many data controllers have important roles in supporting activities that have a public interest nature. It is vital that data is shared without delay where necessary in areas such as safeguarding, prevention of crime and responding to emergencies. Of course, controllers who share data while relying on this new lawful ground would still have to comply with wider requirements of data protection legislation where relevant, such as data protection principles which ensure that the data is used fairly, lawfully and transparently, and is collected and used for specific purposes.
In addition to creating a new lawful ground of recognised legitimate interests, Clause 5 also clarifies the types of processing activities that may be permitted under the existing legitimate interests lawful ground under Article 6(1)(f) of the UK GDPR. Even if a processing activity does not appear on the new list of recognised legitimate interests, data controllers may still have grounds for processing people’s data without consent if their interests in processing the data are not outweighed by the rights and freedoms that people have in relation to privacy. The new paragraphs (9) and (10) that Clause 5 inserts into Article 6 make it clear that this might be the case in relation to many common commercial activities, such as intragroup transfers.
My Lords, may I just revisit that with the Minister? I fear that he is going to move on to another subject. The Delegated Powers Committee said that it thought that the Government had not provided strong enough reasons for needing this power. The public interest list being proposed, which the Minister outlined, is quite broad, so it is hard to imagine the Government wanting something not already listed. I therefore return to what the committee said. Normally, noble Lords like to listen to recommendations from such committees. There is no strong reason for needing that extra power, so, to push back a little on the Minister, why, specifically, is it felt necessary? If it were a public safety interest, or one of the other examples he gave, it seems to me that that would come under the existing list of public interests.
Indeed. Needless to say, we take the recommendations of the DPRRC very seriously, as they deserve. However, because this is an exhaustive list, and because the technologies and practices around data are likely to evolve very rapidly in ways we are unable currently to predict, it is important to retain as a safety measure the ability to update that list. That is the position the Government are coming from. We will obviously continue to consider the DPRRC’s recommendations, but that has to come with a certain amount of adaptiveness as we go. Any addition to the list would of course be subject to parliamentary debate, via the affirmative resolution procedure, as well as the safeguards listed in the provision itself.
Clause 50 requires that the ICO and any other interested persons be consulted before such regulations are made.
Amendments 15, 16, 17 and 18 would amend the part of Clause 5 that is concerned with the types of activities that might be carried out under the current legitimate interest lawful ground, under Article 6(1)(f). Amendment 15 would prevent direct marketing organisations relying on the legitimate interest lawful ground under Article 6(1)(f) if the personal data being processed related to children. However, the age and general vulnerability of data subjects are already important factors for direct marketing organisations when considering whether the processing is justified. The ICO already provides specific guidance for controllers carrying out this balancing test in relation to children’s data. The fact that a data subject is a child, and the age of the child in question, will still be relevant factors to take into account in this process. For these reasons, the Government consider this amendment unnecessary.
My Lords, am I to take it from that that none of the changes currently in the Bill will expose children on a routine basis to direct marketing?
As is the case today and will be going forward, direct marketing organisations will be required to perform the balancing test; and as in the ICO guidance today and, no doubt, going forward—
I am sorry if I am a little confused—I may well be—but the balancing test that is no longer going to be there allows a certain level of processing, which was the subject of the first amendment. The suggestion now is that children will be protected by a balancing test. I would love to know where that balancing test exists.
The balancing test remains there for legitimate interests, under Article 6(1)(f).
Amendment 16 seeks to prevent organisations that undertake third-party marketing relying on the legitimate interest lawful ground under Article 6(1)(f) of the UK GDPR. As I have set out, organisations can rely on that ground for processing personal data without consent when they are satisfied that they have a legitimate interest to do so and that their commercial interests are not outweighed by the rights and interests of data subjects.
Clause 5(4) inserts in Article 6 new paragraph (9), which provides some illustrative examples of activities that may constitute legitimate interests, including direct marketing activities, but it does not mean that they will necessarily be able to process personal data for that purpose. Organisations will need to assess on a case-by-case basis where the balance of interest lies. If the impact on the individual’s privacy is too great, they will not be able to rely on the legitimate interest lawful ground. I should emphasise that this is not a new concept created by this Bill. Indeed, the provisions inserted by Clause 5(4) are drawn directly from the recitals to the UK GDPR, as incorporated from the EU GDPR.
I recognise that direct marketing can be a sensitive—indeed, disagreeable—issue for some, but direct marketing information can be very important for businesses as well as individuals and can be dealt with in a way that respects people’s privacy. The provisions in this Bill do not change the fact that direct marketing activities must be compliant with the data protection and privacy legislation and continue to respect the data subject’s absolute right to opt out of receiving direct marketing communications.
Amendment 17 would make sure that the processing of employee data for “internal administrative purposes” is subject to heightened safeguards, particularly when it relates to health. I understand that this amendment relates to representations made by the National AIDS Trust concerning the level of protection afforded to employees’ health data. We agree that the protection of people’s HIV status is vital and that it is right that it is subject to extra protection, as is the case for all health data and special category data. We have committed to further engagement and to working with the National AIDS Trust to explore solutions in order to prevent data breaches of people’s HIV status, which we feel is best achieved through non-legislative means given the continued high data protection standards afforded by our existing legislation. As such, I hope that the noble Lord, Lord Clement-Jones, will agree not to press this amendment.
Amendment 18 seeks to allow businesses more confidently to rely on the existing legitimate interest lawful ground for the transmission of personal data within a group of businesses affiliated by contract for internal administrative purposes. In Clause 5, the list of activities in proposed new paragraphs (9) and (10) is intended to be illustrative of the types of activities that may be legitimate interests for the purposes of Article 6(1)(f). It focuses on processing activities that are currently listed in the recitals to the EU GDPR, but they are simply examples. Many other processing activities may be legitimate interests for the purposes of Article 6(1)(f) of the UK GDPR. It is possible that the transmission of personal data for internal administrative purposes within a group affiliated by contract may constitute a legitimate interest, as may many other commercial activities. It would be for the controller to determine this on a case-by-case basis after carrying out a balancing test to assess the impact on the individual.
Finally, I turn to the clause stand part debate that seeks to remove Clause 7 from the Bill. I am grateful to the noble Lord, Lord Clement-Jones, for this amendment because it allows me to explain why this clause is important to the success of the UK-US data access agreement. As noble Lords will know, that agreement helps the law enforcement agencies in both countries tackle crime. Under the UK GDPR, data controllers can process personal data without consent on public interest grounds if the basis for the processing is set out in domestic law. Clause 7 makes it clear that the processing of personal data can also be carried out on public interest grounds if the basis for the processing is set out in a relevant international treaty such as the UK-US data access agreement.
The agreement permits telecommunications operators in the UK to disclose data about serious crimes to law enforcement agencies in the US, and vice versa. The DAA has been operational since October 2022, and disclosures made by UK organisations under it are already lawful under the UK GDPR. Recent ICO guidance confirms this, but the Government want to remove any doubt in the minds of UK data controllers that disclosures under the DAA are permitted by the UK GDPR. Clause 7 makes it absolutely clear to telecoms operators in the UK that disclosures under the DAA can be made in reliance on the UK GDPR’s public tasks processing grounds; the clause therefore contributes to the continued, effective functioning of the agreement and to keeping the public in both the UK and the US safe.
For these reasons, I hope that the noble Lord, Lord Clement-Jones, will agree to withdraw his amendment.
My first reaction is “Phew”, my Lords. We are all having to keep to time limits now. The Minister did an admirable job within his limit.
I wholeheartedly support what the noble Baronesses, Lady Kidron and Lady Harding, said about Amendments 13 and 15 and what the noble Baroness, Lady Jones, said about her Amendment 12. I do not believe that we have yet got to the bottom of children’s data protection; there is still quite some way to go. It would be really helpful if the Minister could bring together the elements of children’s data about which he is trying to reassure us and write to us saying exactly what needs to be done, particularly in terms of direct marketing directed towards children. That is a real concern.
My Lords, it is a pleasure to follow the noble Baronesses, Lady Harding and Lady Bennett, after the excellent introduction to the amendments in this group by the noble Baroness, Lady Jones. The noble Baroness, Lady Harding, used the word “trust”, and this is another example of a potential hidden agenda in the Bill. Again, it is destructive of any public trust in the way their data is curated. This is a particularly egregious example, without, fundamentally, any explanation. Sir John Whittingdale said that a future Government
“may want to encourage democratic engagement in the run up to an election by temporarily ‘switching off’ some of the direct marketing rules”.—[Official Report, Commons, 29/11/2023; col. 885.]
Nothing to see here—all very innocuous; but, as we know, in the past the ICO has been concerned about even the current rules on the use of data by political parties. It seems to me that, without being too Pollyannaish about this, we should be setting an example in the way we use the public’s data for campaigning. The ICO, understandably, is quoted as saying during the public consultation on the Bill that this is
“an area in which there are significant potential risks to people if any future policy is not implemented very carefully”.
That seems an understatement, but that is how regulators talk. It is entirely right to be concerned about these provisions.
Of course, they are hugely problematic, but they are particularly problematic given that it is envisaged that young people aged 14 and older should be able to be targeted by political parties when they cannot even vote, as we have heard. This would appear to contravene one of the basic principles of data protection law: that you should not process more personal data than you need for your purposes. If an individual cannot vote, it is hard to see how targeting them with material relating to an election is a proportionate interference with their privacy rights, particularly when they are a child. The question is, should we be soliciting support from 14 to 17 year-olds during elections when they do not have votes? Why do the rules need changing so that people can be targeted online without having consented? One of the consequences of these changes would be to allow a Government to switch off—the words used by Sir John Whittingdale—direct marketing rules in the run-up to an election, allowing candidates and parties to rely on “soft” opt-in to process data and make other changes without scrutiny.
Exactly as the noble Baroness, Lady Jones, said, respondents to the original consultation on the Bill wanted political communications to be covered by existing rules on direct marketing. Responses were very mixed on the soft opt-in, and there were worries that people might be encouraged to part with more of their personal data. More broadly, why are the Government changing the rules on democratic engagement if they say they will not use these powers? What assessment have they made of the impact of the use of the powers? Why are the powers not being overseen by the Electoral Commission? If anybody is going to have the power to introduce the ability to market directly to voters, it should be the Electoral Commission.
All this smacks of taking advantage of financial asymmetry. We talked about competition asymmetry with big tech when we debated the digital markets Bill; similarly, this seems a rather sneaky way of taking advantage of the financial resources one party might have versus others. It would allow it to do things other parties cannot, because it has granted itself permission to do that. The provisions should not be in the hands of any Secretary of State or governing party; if anything, they should be in entirely independent hands; but, even then, they are undesirable.
My Lords, I thank the noble Baroness, Lady Jones, for tabling her amendments. Amendment 19 would remove processing which is necessary for the purposes of democratic engagement from the list of recognised legitimate interests. It is essential in a healthy democracy that registered political parties, elected representatives and permitted participants in referendums can engage freely with the electorate without being impeded unnecessarily by data protection legislation.
The provisions in the Bill will mean that these individuals and organisations do not have to carry out legitimate interest assessments or look for a separate legal basis. They will, however, still need to comply with other requirements of data protection legislation, such as the data protection principles and the requirement for processing to be necessary.
On the question posed by the noble Baroness about the term “democratic engagement”, it is intended to cover a wide range of political activities inside and outside election periods. These include but are not limited to democratic representation; communicating with electors and interested parties; surveying and opinion gathering; campaigning activities; activities to increase voter turnout; supporting the work of elected representatives, prospective candidates and official candidates; and fundraising to support any of these activities. This is reflected in the drafting, which incorporates these concepts in the definition of democratic engagement and democratic engagement activities.
The ICO already has guidance on the use of personal data by political parties for campaigning purposes, which the Government anticipate it will update to reflect the changes in the Bill. We will of course work with the ICO to make sure it is familiar with our plans for commencement and that it does not benefit any party over another.
On the point made about the appropriate age for the provisions, the age of 14 reflects the variations in voting age across the nations of the UK: in some parts, such as Scotland, the voting age is 16 for some elections and a person can join the electoral register at 14 as an attainer. An attainer is someone who is registered to vote in advance of being able to do so, to allow them to be on the electoral roll as soon as they turn the required age. Children aged 14 and over are often politically engaged and are approaching voting age. The Government consider it important that political parties and elected representatives can engage freely with this age group—
I am interested in what the Minister says about the age of attainers. Surely it would be possible to remove attainers from those who could be subject to direct marketing. Given how young attainers could be, it would protect them from the unwarranted attentions of campaigning parties and so on. I do not see that as a great difficulty.
Indeed. It is certainly worth looking at, but I remind noble Lords that such communications have to be necessary, and the test of their being necessary for someone of that age is obviously more stringent.
But what is the test of necessity at that age?
The processor has to determine whether it is necessary to the desired democratic engagement outcome to communicate with someone at that age. But I take the point: for the vast majority of democratic engagement communications, 14 would be far too young to make that a worthwhile or necessary activity.
As I recall, the ages are on the electoral register.
I am not aware one way or the other, but I will happily look into that to see what further safeguards we can add so that we are not bombarding people who are too young with this material.
May I make a suggestion to my noble friend the Minister? It might be worth asking the legal people to get the right wording, but if there are different ages at which people can vote in different parts of the United Kingdom, surely it would be easier just to relate it to the age at which they are able to vote in those elections. That would address a lot of the concerns that many noble Lords are expressing here today.
I agree with the noble Baroness, but with one rider. We will keep coming back to the need for children to have a higher level of data protection than adults, and this is but one of many examples we will debate. However, I agree with her underlying point. The reason why I support removing both these clauses is the hubris of believing that you will engage the electorate by bombarding them with things they did not ask to receive.
A fair number of points were made there. I will look at ages under 16 and see what further steps, in addition to being necessary and proportionate, we can think about to provide some reassurance. Guidance would need to be in effect before any of this is acted on by any of the political parties. I and my fellow Ministers will continue to work with the ICO—
I am sorry to press the Minister, but does the Bill state that guidance will be in place before this comes into effect?
I am not sure whether it is written in the Bill. I will check, but the Bill would not function without the existence of the guidance.
I am sorry to drag this out but, on the guidance, can we be assured that the Minister will involve the Electoral Commission? It has a great deal of experience here; in fact, it has opined in the past on votes for younger cohorts of the population. It seems highly relevant to seek out its experience and benefit from it.
I would of course be very happy to continue to engage with the Electoral Commission.
We will continue to work with the ICO to make sure that it is familiar with the plans for commencement and that its plans for guidance fit into that. In parts of the UK where the voting age is 18 and the age of attainment is 16, it would be more difficult for candidates and parties to show that it was necessary or proportionate to process the personal data of 14 and 15 year-olds in reliance on the new lawful ground. In this context, creating an arbitrary distinction between children at or approaching voting age and adults may not be appropriate; in particular, many teenagers approaching voting age may be more politically engaged than some adults. These measures will give parties and candidates a clear lawful ground for engaging them in the process. Accepting this amendment would remove the benefits of greater ease of identification of a lawful ground for processing by elected representatives, candidates and registered political parties, which is designed to improve engagement with the electorate. I therefore hope that the noble Baroness, Lady Jones, will withdraw her amendment.
I now come to the clause stand part notice that would remove Clause 114, which gives the Secretary of State a power to make exceptions to the direct marketing rules for communications sent for the purposes of democratic engagement. As Clause 115 defines terms for the purposes of Clause 114, the noble Baroness, Lady Jones, is also seeking for that clause to be removed. Under the current law, many of the rules applying to electronic communications sent for commercial marketing apply to messages sent by registered political parties, elected representatives and others for the purposes of democratic engagement. It is conceivable that, after considering the risks and benefits, a future Government might want to treat communications sent for the purposes of democratic engagement differently from commercial marketing. For example, in areas where voter turnout is particularly low or there is a need to increase engagement with the electoral process, a future Government might decide that the direct marketing rules should be modified. This clause stand part notice would remove that option.
We have incorporated several safeguards that must be met prior to regulations being laid under this clause. They include the Secretary of State having specific regard to the effect the exceptions could have on an individual’s privacy; a requirement to consult the Information Commissioner and other interested parties, as the Secretary of State considers appropriate; and the regulations being subject to parliamentary approval via the affirmative procedure.
For these reasons, I hope that the noble Baroness will agree to withdraw or not press her amendments.
My Lords, I am pleased that I have sparked such a lively debate. When I tabled these amendments, it was only me and the noble Lord, Lord Clement-Jones, so I thought, “This could be a bit sad, really”, but it has not been. Actually, it has been an excellent debate and we have identified some really good issues.
As a number of noble Lords said, the expression “democratic engagement” is weasel words: what is not to like about democratic engagement? We all like it. Only when you drill down into the proposals do you realise the traps that could befall us. As noble Lords and the noble Baroness, Lady Bennett, rightly said, we have to see this in the context of some of the other moves the Government are pursuing in trying to skew the electoral rules in their favour. I am not convinced that this is as saintly as the Government are trying to pretend.
The noble Baroness, Lady Harding, is absolutely right: this is about trust. It is about us setting an example. Of all the things we can do on data protection that we have control over, we could at least show the electorate how things could be done, so that they realise that we, as politicians, understand how precious their data is and that we do not want to misuse it.
I hope we have all knocked on doors, and I must say that I have never had a problem engaging with the electorate, and actually they have never had a problem engaging with us. This is not filling a gap that anybody has identified. We are all out there and finding ways of communicating that, by and large, I would say the electorate finds perfectly acceptable. People talk to us, and they get the briefings through the door. That is what they expect an election campaign to be about. They do not expect, as the noble Baroness, Lady Harding, said, to go to see their MP about one thing and then suddenly find that they are being sent information about something completely different or that assumptions are being made about them which were never the intention when they gave the information in the first place. I just feel that there is something slightly seedy about all this. I am sorry that the Minister did not pick up a little more on our concerns about all this.
There are some practical things that I think it was helpful for us to have talked about, such as the Electoral Commission. I do not think that it has been involved up to now. I would like to know in more detail what its views are on all this. It is also important that we come back to the Information Commissioner and check in more detail what his view is on all this. It would be nice to have guidance, but I do not think that that will be enough to satisfy us in terms of how we proceed with these amendments.
The Minister ultimately has not explained why this has been introduced at this late stage. He is talking about this as though conceivably, in the future, a Government might want to adopt these rules. If that is the case, I respectfully say that we should come back at that time with a proper set of proposals that go right through the democratic process that we have here in Parliament, scrutinise it properly and make a decision then, rather than being bounced into something at a very late stage.
I have to say that I am deeply unhappy at what the Minister has said. I will obviously look at Hansard, but I may well want to return to this.
My Lords, I rise to speak to a series of minor and technical, yet necessary, government amendments which, overall, improve the functionality of the Bill. I hope the Committee will be content if I address them together. Amendments 20, 42, 61 and 63 are minor technical amendments to references to special category data in Clauses 6 and 14. All are intended to clarify that references to special category data mean references to the scope of Article 9(1) of the UK GDPR. They are simply designed to improve the clarity of the drafting.
I turn now to the series of amendments that clarify how time periods within the data protection legal framework are calculated. For the record, these are Amendments 136, 139, 141, 149, 151, 152, 176, 198, 206 to 208, 212 to 214, 216, 217, 253 and 285. Noble Lords will be aware that the data protection legislation sets a number of time periods or deadlines for certain things to happen, such as responding to subject access requests; in other words, at what day, minute or hour the clock starts and stops ticking in relation to a particular procedure. The Data Protection Act 2018 expressly applies the EU-derived rules on how these time periods should be calculated, except in a few instances where it is more appropriate for the UK domestic approach to apply, for example time periods related to parliamentary procedures. I shall refer to these EU-derived rules as the time periods regulation.
In response to the Retained EU Law (Revocation and Reform) Act 2023, we are making it clear that the time periods regulation continues to apply to the UK GDPR and other regulations that form part of the UK’s data protection and privacy framework, for example, the Privacy and Electronic Communications (EC Directive) Regulations 2003. By making such express provision, our aim is to ensure consistency and continuity and to provide certainty for organisations, individuals and the regulator. We have also made some minor changes to existing clauses in the Bill to ensure that application of the time periods regulation achieves the correct effect.
Secondly, Amendment 197 clarifies that the requirement to consult before making regulations that introduce smart data schemes may be satisfied by a consultation before the Bill comes into force. The regulations must also be subject to affirmative parliamentary scrutiny to allow Members of both Houses to scrutinise legislation. This will facilitate the rapid implementation of smart data schemes, so that consumers and businesses can start benefiting as soon as possible. The Government are committed to working closely with business and wider stakeholders in the development of smart data.
Furthermore, Clause 96(3) protects data holders from the levy that may be imposed to meet the expenses of persons and bodies performing functions under smart data regulations. This levy cannot be imposed on data holders that do not appear capable of being directly affected by the exercise of those functions.
Amendment 196 extends that protection to authorised persons and third-party recipients on whom the levy may also be imposed. Customers will not have to pay to access their data, only for the innovative services offered by third parties. We expect that smart data schemes will deliver significant time and cost savings for customers.
The Government are committed to balancing the incentives for businesses to innovate and provide smart data services with ensuring that all customers are empowered through their data use and do not face undue financial barriers or digital exclusion. Any regulations providing for payment of the levy or fees will be subject to consultation and to the affirmative resolution procedure in Parliament.
Amendments 283 and 285 to Schedule 15 confer a general incidental power on the Information Commission. It will have the implied power to do things incidental to or consequential upon the exercise of its functions, for example, to hold land and enter into agreements. These amendments make those implicit powers explicit, for the avoidance of doubt and in line with standard practice. They do not give the commission substantive new powers. I beg to move.
My Lords, I know that these amendments were said to be technical amendments, so I thought I would just accept them, but when I saw the wording of Amendment 283 some alarm bells started ringing. It says:
“The Commission may do anything it thinks appropriate for the purposes of, or in connection with, its functions”.
I know that the Minister said that this is stating what the commission is already able to do, but I am concerned whenever I see those words anywhere. They give a blank cheque to any authority or organisation.
Many noble Lords will know that I have previously spoken about the principal-agent theory in politics, in which certain powers are delegated to an agency or regulator, but what accountability does it have? I worry when I see that it “may do anything … appropriate” to fulfil its tasks. I would like some assurance from the Minister that there is a limit to what the information commission can do and some accountability. At a time when many of us are asking who regulates the regulators and when we are looking at some of the arm’s-length bodies—need I mention the Post Office?—there is some real concern about accountability.
I understand the reason for wanting to clarify or formalise what the Minister believes the information commission is doing already, but I worry about this form of words. I would like some reassurance that it is not wide-ranging and that there is some limit and accountability to future Governments. I have seen this sentiment across the House; people are asking who regulates the regulators and to whom are they accountable.
My Lords, I have been through this large group and, apart from my natural suspicion that there might be something dastardly hidden away in it, I am broadly content, but I have a few questions.
On Amendment 20, can the Minister confirm that the new words “further processing” have the same meaning as the reuse of personal data? Can he confirm that Article 5(1)(b) will prohibit this further processing when it is not in line with the original purpose for which the data was collected? How will the data subject know that is the case?
On Amendment 196, to my untutored eye it looks like the regulation-making power is being extended beyond the data holder to include authorised persons and third-party recipients. My questions are simple enough: was this an oversight on the part of the original drafters of that clause? Is the amendment an extension of those captured by the effect of the clause? Is it designed to achieve consistency across the Bill? Finally, can I assume that an authorised person or third party would usually be someone acting on behalf of, or as an agent of, the data holder?
I presume that Amendments 198, 212 and 213 are needed because of a glitch in the drafting—similarly with Amendment 206. I can see that Amendments 208, 216 and 217 clarify when time periods begin, but why are the Government seeking to disapply time periods in Amendment 253 when surely some consistency is required?
Finally—I am sure the Minister will be happy about this—I am all in favour of flexibility, but Amendment 283 states that the Information Commissioner has the power to do things to facilitate the exercise of his functions. The noble Lord, Lord Kamall, picked up on this. We need to understand what those limits are. On the face of it, one might say that the amendment is sensible, but it seems rather general and broad in its application. As the noble Lord, Lord Kamall, rightly said, we need to see what the limits of accountability are. This is one of those occasions.
I thank the noble Lords, Lord Kamall and Lord Bassam, for their engagement with this group. On the questions from the noble Lord, Lord Kamall, these are powers that the ICO would already have in common law. As I am given to understand is now best practice with all Bills, they are put on a statutory footing in this one. This does not confer substantive new powers on the regulator but clarifies the powers that it already has. I can also confirm that the ICO was and remains accountable to Parliament.
I am sorry to intervene as I know that noble Lords want to move on to other groups, but the Minister said that the ICO remains accountable to Parliament. Will he clarify how it is accountable to Parliament for the record?
The Information Commissioner is directly accountable to Parliament in that he makes regular appearances in front of Select Committees that scrutinise the regulator’s work, including progress against objectives.
The noble Lord, Lord Bassam, made multiple important and interesting points. I hope he will forgive me if I undertake to write to him about those; there is quite a range of topics to cover. If there are any on which he requires answers right away, he is welcome to intervene.
I want to be helpful to the Minister. I appreciate that these questions are probably irritating but I carefully read through the amendments and aligned them with the Explanatory Notes. I just wanted some clarification to make sure that we are clear on exactly what the Government are trying to do. “Minor and technical” covers a multitude of sins; I know that from my own time as a Minister.
Indeed. I will make absolutely sure that we provide a full answer. By the way, I sincerely thank the noble Lord for taking the time to go through what is perhaps not the most rewarding of reads but is useful none the less.
On the question of the ICO being responsible to Parliament, in the then Online Safety Bill and the digital markets Bill we consistently asked for regulators to be directly responsible to Parliament. If that is something the Government believe they are, we would like to see an expression of it.
I would be happy to provide such an expression. I will be astonished if that is not the subject of a later group of amendments. I have not yet prepared for that group, I am afraid, but yes, that is the intention.
My Lords, it is a pleasure to follow the noble Lord, Lord Sikka. He raised even more questions about Clause 9 than I ever dreamed of. He has illustrated the real issues behind the clause and why it is so important to debate its standing part, because, in our view, it should certainly be removed from the Bill. It would seriously limit people’s ability to access information about how their personal data is collected and used. We are back to the dilution of data subject rights, within which the rights of data subject access are, of course, vital. This includes limiting access to information about automated decision-making processes to which people are subject.
A data subject is someone who can be identified directly or indirectly by personal data, such as a name, an ID number, location data, or information relating to their physical, economic, cultural or social identity. Under existing law, data subjects have a right to request confirmation of whether their personal data is being processed by a controller, to access that personal data and to obtain information about how it is being processed. The noble Lord, Lord Sikka, pointed out that there is ample precedent for how the controller can refuse a request from a data subject only if it is manifestly unfounded or excessive. The meaning of that phrase is well established.
There are three main ways in which Clause 9 limits people’s ability to access information about how their personal data is being collected and used. First, it would lower the threshold for refusing a request from “manifestly unfounded or excessive” to “vexatious or excessive”. This is an inappropriately low threshold, given the nature of a data subject access request—namely, a request by an individual for their own data.
Secondly, Clause 9 would insert a new mandatory list of considerations for deciding whether the request is vexatious or excessive. This includes vague considerations, such as
“the relationship between the person making the request (the ‘sender’) and the person receiving it (the ‘recipient’)”.
The very fact that the recipient holds data relating to the sender means that there is already some form of relationship between them.
Thirdly, the weakening of an individual’s right to obtain information about how their data is being collected, used or shared is particularly troubling given the simultaneous effect of the provisions in Clause 10, which means that data subjects are less likely to be informed about how their data is being used for additional purposes other than those for which it was originally collected, in cases where the additional purposes are for scientific or historical research, archiving in the public interest or statistical purposes. Together, the two clauses mean that an individual is less likely to be proactively told how their data is being used, while it is harder to access information about their data when requested.
In the Public Bill Committee in the House of Commons, the Minister, Sir John Whittingdale, claimed that:
“The new parameters are not intended to be reasons for refusal”,
but rather to give
“greater clarity than there has previously been”.—[Official Report, Commons, Data Protection and Digital Information Bill Committee, 16/5/23; cols. 113-14.]
But it was pointed out by Dr Jeni Tennison of Connected by Data in her oral evidence to the committee that the impact assessment for the Bill indicates that a significant proportion of the savings predicted would come from lighter burdens on organisations dealing with subject access requests as a result of this clause. This suggests that, while the Government claim that this clause is a clarification, it is intended to weaken obligations on controllers and, correspondingly, the rights of data subjects. Is that where the Secretary of State’s £10 billion of benefit from this Bill comes from? On these grounds alone, Clause 9 should be removed from the Bill.
We also oppose the question that Clause 12 stand part of the Bill. Clause 12 provides that, in responding to subject access requests, controllers are required only to undertake a
“reasonable and proportionate search for the personal data and other information”.
This clause also appears designed to weaken the right of subject access and will lead to confusion for organisations about what constitutes a reasonable and proportionate search in a particular circumstance. The right of subject access is central to individuals’ fundamental rights and freedoms, because it is a gateway to exercising other rights, either within the data subject rights regime or in relation to other legal rights, such as the rights to equality and non-discrimination. Again, the lowering of rights compared with the EU creates obvious risks, and this is a continuing theme of data adequacy.
Clause 12 does not provide a definition for reasonable and proportionate searches, but when introducing the amendment, Sir John Whittingdale suggested that a search for information may become unreasonable or disproportionate
“when the information is of low importance or of low relevance to the data subject”.—[Official Report, Commons, 29/11/23; col. 873.]
Those considerations diverge from those provided in the Information Commissioner’s guidance on the rights of access, which states that when determining whether searches may be unreasonable or disproportionate, the data controller must consider the circumstances of the request, any difficulties involved in finding the information and the fundamental nature of the right of access.
We also continue to be concerned about the impact assessment for the Bill and the Government’s claims that the new provisions in relation to subject access requests are for clarification only. Again, Clause 12 appears to have the same impact as Clause 9 in the kinds of savings that the Government seem to imagine will emerge from the lowering of subject access rights. This is a clear dilution of subject access rights, and this clause should also be removed from the Bill.
We always allow for belt and braces and if our urging does not lead to the Minister agreeing to remove Clauses 9 and 12, at the very least we should have the new provisions set out either in Amendment 26, in the name of the noble Baroness, Lady Jones of Whitchurch, or in Amendment 25, which proposes that a data controller who refuses a subject access request must give reasons for their refusal and tell the subject about their right to seek a remedy. That is absolutely the bare minimum, but I would far prefer to see the deletion of Clauses 9 and 12 from the Bill.
As ever, I thank noble Lords for raising and speaking to these amendments. I start with the stand part notices on Clauses 9 and 36, introduced by the noble Lord, Lord Clement-Jones. Clauses 9 and 36 clarify the new threshold to refuse or charge a reasonable fee for a request that is “vexatious or excessive”. Clause 36 also clarifies that the Information Commissioner may charge a fee for dealing with, or refuse to deal with, a vexatious or excessive request made by any persons and not just data subjects, providing necessary certainty.
I apologise for intervening, but the Minister referred to resources. By that, he means the resources for the controller but, as I said earlier, there is no consideration of what the social cost may be. If this Bill had already become law, how would the victims of the Post Office scandal have been able to secure any information? Under this Bill, the threshold for providing information will be much lower than it is under the current legislation. Can the Minister say something about how the controllers will take social cost into account or how the Government have taken that into account?
First, on the point made by the noble Lord, Lord Bassam, it is not to be argumentative—I am sure that there is much discussion to be had—but the intention is absolutely not to lower the standard for a well-intended request.
Sadly, a number of requests are made that are not well intended but are cynical and aimed at disruption. I can give a few examples. Some requests are deliberately made with minimal time between them. Some are made to circumvent the process of legal disclosure in a trial. Some are made for other reasons designed to disrupt an organisation. The intent of using “vexatious” is not in any way to restrict well-founded, or even partially well-founded, attempts to secure information; it is to filter out these less desirable, more cynical attempts.
But the two terms have a different legal meaning, surely.
The actual application of the terms will be set out in guidance by the ICO but the intention is to filter out the more disruptive and cynical ones. Designing these words is never an easy thing but there has been considerable consultation on this in order to achieve that intention.
My Lords—sorry; it may be that the Minister was just about to answer my question. I will let him do so.
I will have to go back to the impact assessment but I would be astonished if that was a significant part of the savings promised. By the way, the £10.6 billion—or whatever it is—in savings was given a green rating by the body that assesses these things; its name eludes me. It is a robust calculation. I will check and write to the noble Lord, but I do not believe that a significant part of that calculation leans on the difference between “vexatious” and “manifestly unfounded”.
It would be very useful to have the Minister respond on that but, of course, as far as the impact assessment is concerned, a lot of this depends on the Government’s own estimates of what this Bill will produce—some of which are somewhat optimistic.
The noble Baroness, Lady Jones, has given me an idea: if an impact assessment has been made, clause by clause, it would be extremely interesting to know just where the Government believe the golden goose is.
I am not quite sure what is being requested because the impact assessment has been not only made but published.
I see—so noble Lords would like an analysis of the different components of the impact assessment. It has been green-rated by the independent Regulatory Policy Committee. I have just been informed by the Box that the savings from these reforms to the wording of SARs are valued at less than 1% of the benefit of more than £10 billion that this Bill will bring.
That begs the question of where on earth the rest is coming from.
Which I will be delighted to answer. In the course of this interesting exchange, I have lost track of the specific questions that the noble Lord, Lord Sikka, asked, but I am coming on to some of his other ones; if I do not give satisfactory answers, no doubt he will intervene and ask again.
I appreciate the further comments made by the noble Lord, Lord Sikka, about the Freedom of Information Act. I hope he will be relieved to know that this Bill does nothing to amend that Act. On his accounting questions, he will be aware that most SARs are made by private individuals to private companies. The Government are therefore not involved in that process and do not collect the kind of information that he described.
Following the DPDI Bill, the Government will work with the ICO to update guidance on subject access requests. Guidance plays an important role in clarifying what a controller should consider when relying on the new “vexatious or excessive” provision. The Government are also exploring whether a code of practice on subject access requests can best address the needs of controllers and data subjects.
On whether Clause 12 should stand part of the Bill, Clause 12 is only putting on a statutory footing what has already been established—
My apologies. The Minister just said that the Government do not collect the data. Therefore, what is the basis for changing the threshold? No data, no reasonable case.
The Government do not collect details of private interactions between those raising SARs and the companies they raise them with. The business case is based on extensive consultation—
I hope that the Government have some data about government departments and the public bodies over which they have influence. Can he provide us with a glimpse of how many requests are received, how many are rejected at the outset, how many go to the commissioners, what the cost is and how the cost is computed? At the moment, it sounds like the Government want to lower the threshold without any justification.
As I say, I do not accept that the threshold is being lowered. On the other hand, I will undertake to find out what information can reasonably be provided. Again, as I said, the independent Regulatory Policy Committee gave the business case a green rating; that is a high standard and gives credibility to the business case calculations, which I will share.
The reforms keep reasonable requests free of charge and instead seek to ensure that controllers can refuse or charge a reasonable fee for requests that are “vexatious or excessive”, which can consume a significant amount of time and resources. However, the scope of the current provision is unclear and, as I said, there are a variety of circumstances where controllers would benefit from being able confidently to refuse or charge the fee.
The Minister used the phrase “reasonable fee”. Can he provide some clues on that, especially for the people who may request information? We have around 17.8 million individuals living on less than £12,570. So, from what perspective is the fee reasonable and how is it determined?
“Reasonable” would be set out in the guidance to be created by the ICO but it would need to reflect the costs and affordability. The right of access remains of paramount importance in the data protection framework.
Lastly, as I said before on EU data adequacy, the Government maintain an ongoing dialogue with the EU and believe that our reforms are compatible with maintaining our data adequacy decisions.
For the reasons I have set out, I am not able to accept these amendments. I hope that noble Lords will therefore agree to withdraw or not press them.
My Lords, I can also be relatively brief. I thank all noble Lords who have spoken and the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones, for their amendments, to many of which I have added my name.
At the heart of this debate is what constitutes a disproportionate effort or impossibility exemption for providing data to individuals when the data is not collected directly from data subjects. Amendments 29 to 33 provide further clarity on how exemptions on the grounds of disproportionate effort should be interpreted—for example, by taking into account whether there would be a limited impact on individuals, whether they would be caused any distress, what the exemptions were in the first place and whether the information had been made publicly available by a public body. All these provide some helpful context, which I hope the Minister will take on board.
I have also added my name to Amendments 27 and 28 from the noble Baroness, Lady Harding. They address the particular concerns about those using the open electoral register for direct marketing purposes. As the noble Baroness explained, the need for this amendment arises from the legal ruling that companies using the OER must first notify individuals at their postal addresses whenever their data is being used. As has been said, given that individuals already have an opt-out when they register on the electoral roll, it would seem unnecessary and impractical for companies using the register to follow up with individuals each time they want to access their data. These amendments seek to close that loophole and restore the arrangements to their previous incarnation, which seemed to work well.
All the amendments provide useful forms of words but, as the noble Baroness, Lady Harding, said, if the wording is not quite right, we hope that the Minister will help us to craft something that is right and that solves the problem. I hope that he agrees that there is a useful job of work to be done on this and that he provides some guidance on how to go about it.
I thank my noble friend Lady Harding for moving this important amendment. I also thank the cosignatories—the noble Lords, Lord Clement-Jones and Lord Black, and the noble Baroness, Lady Jones. As per my noble friend’s request, I acknowledge the importance of this measure and the difficulty of judging it quite right. It is a difficult balance and I will do my best to provide some reassurance, but I welcomed hearing the wise words of all those who spoke.
I turn first to the clarifying Amendments 27 and 32. I reassure my noble friend Lady Harding that, in my view, neither is necessary. Clause 11 amends the drafting of the list of cases when the exemption under Article 14(5) applies but the list closes with “or”, which makes it clear that you need to meet only one of the criteria listed in paragraph (5) to be exempt from the transparency requirements.
I turn now to Amendments 28 to 34, which collectively aim to expand the grounds of disproportionate effort to exempt controllers from providing certain information to individuals. The Government support the use of public data sources, such as the OER, which may be helpful for innovation and may have economic benefits. Sometimes, providing this information is simply not possible or is disproportionate. Existing exemptions apply when the data subject already has the information or in cases where personal data has been obtained from someone other than the data subject and it would be impossible to provide the information or disproportionate effort would be required to do so.
We must strike the right balance between supporting the use of these datasets and ensuring transparency for data subjects. We also want to be careful about protecting the integrity of the electoral register, open or closed, to ensure that it is used within the data subject’s reasonable expectations. The exemptions that apply when the data subject already has the information or when there would be a disproportionate effort in providing the information must be assessed on a case-by-case basis, particularly if personal data from public registers is to be combined with other sources of personal data to build a profile for direct marketing.
These amendments may infringe on transparency—a key principle in the data protection framework. The right to receive information about what is happening to your data is important for exercising other rights, such as the right to object. This could be seen as going beyond what individuals might expect to happen to their data.
The Government are not currently convinced that these amendments would be sufficient to prevent negative consequences to data subject rights and confidence in the open electoral register and other public registers, given the combination of data from various sources to build a profile—that was the subject of the tribunal case being referenced. Furthermore, the Government’s view is that there is no need to amend Article 14(6) explicitly to include the “reasonable expectation of the data subjects” as the drafting already includes reference to “appropriate safeguards”. This, in conjunction with the fairness principle, means that data controllers are already required to take this into account when applying the disproportionate effort exemption.
The above notwithstanding, the Government understand that the ICO may explore this question as part of its work on guidance in the future. That seems a better way of addressing this issue in the first instance, ensuring the right balance between the use of the open electoral register and the rights of data subjects. We will continue to work closely with the relevant stakeholders involved and monitor the situation.
I wonder whether I heard my noble friend correctly. He said “may”, “could” and “not currently convinced” several times, but, for the companies concerned, there is a very real, near and present deadline. How is my noble friend the Minister suggesting that deadline should be considered?
On the first point, I used the words carefully because the Government cannot instruct the ICO specifically on how to act in any of these cases. The question about the May deadline is important. With the best will in the world, none of the provisions in the Bill are likely to be in effect by the time of that deadline in any case. That being the case, I would feel slightly uneasy about advising the ICO on how to act.
My Lords, I am not quite getting from the Minister whether he has an understanding of and sympathy with the case that is being made or whether he is standing on ceremony on its legalities. Is he saying, “No, we think that would be going too far”, or that there is a good case and that guidance or some action by the ICO would be more appropriate? I do not get the feeling that somebody has made a decision about the policy on this. It may be that conversations with the Minister between Committee and Report would be useful, and it may be early days yet until he hears the arguments made in Committee; I do not know, but it would be useful to get an indication from him.
Yes. I repeat that I very much recognise the seriousness of the case. There is a balance to be drawn here. In my view, the best way to identify the most appropriate balancing point is to continue to work closely with the ICO, because I strongly suspect that, at least at this stage, it may be very difficult to draw a legislative dividing line that balances the conflicting needs. That said, I am happy to continue to engage with noble Lords on this really important issue between Committee and Report, and I commit to doing so.
On the question of whether Clause 11 should stand part of the Bill, Clause 11 extends the existing disproportionate effort exemption to cases where the controller collected the personal data directly from the data subject and intends to carry out further processing for research purposes, subject to the research safeguards outlined in Clause 26. This exemption is important to ensure that life-saving research can continue unimpeded.
Research holds a privileged position in the data protection framework because, by its nature, it is viewed as generally being in the public interest. The framework has various exemptions in place to facilitate and encourage research in the UK. During the consultation, we were informed of various longitudinal studies, such as those into degenerative neurological conditions, where it is impossible or nearly impossible to recontact data subjects. To ensure that this vital research can continue unimpeded, Clause 11 provides a limited exemption that applies only to researchers who are complying with the safeguards set out in Clause 26.
The noble Lord, Lord Clement-Jones, raised concerns that Clause 11 would allow unfair processing. I assure him that this is not the case, as any processing that uses the disproportionate effort exemption in Article 13 must comply with the overarching data protection principles, including lawfulness, fairness and transparency. Even where data controllers rely on this exemption, they should consider other ways to make the processing they undertake as fair and transparent as possible.
Finally, returning to EU data adequacy, the Government recognise its importance and, as I said earlier, are confident that the proposals in Clause 11 are complemented by robust safeguards, which reinforces our view that they are compatible with EU adequacy. For the reasons that I have set out, I am unable to accept these amendments, and I hope that noble Lords will not press them.
My Lords, I am not quite sure that I understand where my noble friend the Minister is on this issue. The noble Lord, Lord Clement-Jones, summed it up well in his recent intervention. I will try to take at face value my noble friend’s assurances that he is happy to continue to engage with us on these issues, but I worry that he sees this as two sides of an issue—I hear from him that there may be some issues and there could be some problems—whereas we on all sides of the Committee have set out a clear black and white problem. I do not think they are the same thing.
I appreciate that the wording might create some unintended consequences, but I have not really understood what my noble friend’s real concerns are, so we will need to come back to this on Report. If anything, this debate has made it even clearer to me that it is worth pushing for clarity on this. I look forward to ongoing discussions with a cross-section of noble Lords, my noble friend and the ICO to see if we can find a way through to resolve the very real issues that we have identified today. With that, and with thanks to all who have spoken in this debate, I beg leave to withdraw my amendment.
As ever, I thank the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, for their detailed consideration of Clause 14, and all other noble Lords who spoke so well. I carefully note the references to the DWP’s measure on fraud and error. For now, I reassure noble Lords that a human will always be involved in all decision-making relating to that measure, but I note that this Committee will have a further debate specifically on that measure later.
The Government recognise the importance of solely automated decision-making to the UK’s future success and productivity. These reforms ensure that it can be responsibly implemented, while any such decisions with legal or similarly significant effects have the appropriate safeguards in place, including the right to request a review and for that review to be carried out by a human. These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles. In doing so, they will provide confidence to organisations looking to use these technologies in a responsible way while driving economic growth and innovation.
The Government also recognise that AI presents huge opportunities for the public sector. It is important that AI is used responsibly and transparently in the public sector; we are already taking steps to build trust and transparency. Following a successful pilot, we are making the Algorithmic Transparency Reporting Standard—the ATRS—a requirement for all government departments, with plans to expand this across the broader public sector over time. This will ensure that there is a standardised way for government departments proactively to publish information about how and why they are using algorithms in their decision-making. In addition, the Central Digital and Data Office—the CDDO—has already published guidance on the procurement and use of generative AI for the UK Government and, later this year, DSIT will launch the AI management essentials scheme, setting a minimum good practice standard for companies selling AI products and services.
My Lords, could I just interrupt the Minister? It may be that he can get an answer from the Box to my question. One intriguing aspect is that, as the Minister said, the pledge is to bring the algorithmic recording standard into each government department and there will be an obligation to use that standard. However, what compliance mechanism will there be to ensure that that is happening? Does the accountable Permanent Secretary have a duty to make sure that that is embedded in the department? Who has the responsibility for that?
That is a fair question. I must confess that I do not know the answer. There will be mechanisms in place, department by department, I imagine, but one would also need to report on it across government. Either it will magically appear in my answer or I will write to the Committee.
The CDDO has already published guidance on the procurement and use of generative AI for the Government. We will consult on introducing this as a mandatory requirement for public sector procurement, using purchasing power to drive responsible innovation in the broader economy.
I turn to the amendments in relation to meaningful involvement. I will first take together Amendments 36 and 37, which aim to clarify that the safeguards mentioned under Clause 14 are applicable to profiling operations. New Article 22A(2) already clearly sets out that, in cases where profiling activity has formed part of the decision-making process, controllers have to consider the extent to which a decision about an individual has been taken by means of profiling when establishing whether human involvement has been meaningful. Clause 14 makes clear that a solely automated significant decision is one without meaningful human involvement and that, in these cases, controllers are required to provide the safeguards in new Article 22C. As such, we do not believe that these amendments are necessary; I therefore ask the noble Baroness, Lady Jones, not to press them.
Turning to Amendment 38, the Government are confident that the existing reference to “data subject” already captures the intent of this amendment. The existing definition of “personal data” makes it clear that a data subject is a person who can be identified, directly or indirectly. As such, we do not believe that this amendment is necessary; I ask the noble Lord, Lord Clement-Jones, whether he would be willing not to press it.
Amendments 38A and 40 seek to clarify that, for human involvement to be considered meaningful, the review must be carried out by a competent person. We feel that these amendments are unnecessary as meaningful human involvement may vary depending on the use case and context. The reformed clause already introduces a power for the Secretary of State to provide legal clarity on what is or is not to be taken as meaningful human involvement. This power is subject to the affirmative procedure in Parliament and allows the provision to be future-proofed in the wake of technological advances. As such, I ask the noble Baronesses, Lady Jones and Lady Bennett, not to press their amendments.
I am not sure I agree with that characterisation. The ATRS is a relatively new development. It needs time to bed in, and to do so on an agile basis, in order to ensure not only quality but speed of implementation. That said, I ask the noble Lord to withdraw his amendment.
The Minister has taken us through what Clause 14 does and rebutted the need for anything other than “solely”. He has gone through the sensitive data and the special category data aspects, and so on, but is he reiterating his view that this clause is purely for clarification; or is he saying that it allows greater use of automated decision-making, in particular in public services, so that greater efficiencies can be found and therefore it is freeing up the public sector at the expense of the rights of the individual? Where does he sit in all this?
As I said, the intent of the Government is: yes to more automated data processing to take advantage of emerging technologies, but also yes to maintaining appropriate safeguards. The safeguards in the present system consist—if I may characterise it in a slightly blunt way—of providing quite a lot of uncertainty, so that people do not take the decision to positively embrace the technology in a safe way. By bringing in this clarity, we will see an increase not only in the safety of their applications but in their use, driving up productivity in both the public and private sectors.
My Lords, I said at the outset that I thought this was the beginning of a particular debate, and I was right, looking at the amendments coming along. The theme of the debate was touched on by the noble Baroness, Lady Bennett, when she talked about these amendments, in essence, being about keeping humans in the loop and the need for them to be able to review decisions. Support for that came from the noble Baroness, Lady Kidron, who made some important points. The point the BMA made about risking eroding trust cut to what we have been talking about all afternoon: trust in these processes.
The noble Lord, Lord Clement-Jones, talked about this effectively being the watering down of Article 22A, and the need for some core ethical principles in AI use and for the Government to ensure a right to human review. Clause 14 reverses the presumption of that human reviewing process, other than where solely automated decision-making exists, where it will be more widely allowed, as the Minister argued.
However, I am not satisfied by the responses, and I do not think other Members of your Lordships’ Committee will be either. We need more safeguards. We have moved from one clear position to another, which can be described as watering down or shifting the goalposts; I do not mind which, but that is how it seems to me. Of course, we accept that there are huge opportunities for AI in the delivery of public services, particularly in healthcare and the operation of the welfare system, but we need to ensure that citizens in this country have a higher level of protection than the Bill currently affords them.
At one point I thought the Minister said that a solely automated decision was a rubber-stamped decision. To me, that gave the game away. I will have to read carefully what he said in Hansard, but that is how it sounded, and it really gets our alarm bells ringing. I am happy to withdraw my amendment, but we will come back to this subject from time to time and throughout our debates on the rest of the Bill.
My Lords, I will speak to my Amendment 48. By some quirk of fate, I failed to sign up to the amendments that the noble Lord, Lord Bassam, so cogently introduced. I would have signed up if I had realised that I had not, so to speak.
It is a pleasure to follow the noble Baroness, Lady Kidron. She has a track record of being extremely persuasive, so I hope the Minister pays heed in what happens between Committee and Report. I very much hope that there will be some room for manoeuvre and that there is not just permanent push-back, with the Minister saying that everything is about clarifying and us saying that everything is about dilution. There comes a point when we have to find some accommodation on some of these areas.
Amendments 48 and 49 are very similar—I was going to say, “Great minds think alike”, but I am not sure that my brain feels like much of a great mind at the moment. “Partly” or “predominantly” rather than “solely”, if you look at it the other way round, is really the crux of what I think many of us are concerned about. It is easy to avoid the terms of Article 22 just by slipping in some sort of token human involvement. Defining “meaningful” is so difficult in these circumstances. I am concerned that we are opening the door to something that could be avoided. Even then, the terms of the new clause—we will have a clause stand part debate on Wednesday, obviously—put all the onus on the data subject, whereas that was not the case previously under Article 22. The Minister has not really explained why that change has been made.
I conclude by saying that I very much support Amendment 41. This whole suite of amendments is well drafted. The point about the Equality Act is extremely well made. The noble Lord, Lord Holmes, also has a very good amendment here. It seems to me that involving the ICO right in the middle of this will be absolutely crucial—and we are back to public trust again. If nothing else, I would like explicitly to include that under Clause 14 in relation to Article 22 by the time this Bill goes through.
I thank noble Lords and the noble Baroness for their further detailed consideration of Clause 14.
Let me take first the amendments that deal with restrictions on and safeguards for ADM and the degree of ADM. Amendment 41 aims to make clear that solely automated decisions that contravene any part of the Equality Act 2010 are prohibited. We feel that this amendment is unnecessary for two reasons. First, this is already the case under the Equality Act, reinforced by the lawfulness principle under the present data protection framework, meaning that controllers are already required to adhere to the Equality Act 2010. Secondly, explicitly stating in the legislation that contravening one piece of legislation—in this case, the Equality Act 2010—is prohibited, without referring to other legislation whose contravention is equally prohibited, would lead to an inconsistent approach. As such, we do not believe that this amendment is necessary; I ask the noble Baroness, Lady Jones, to withdraw it.
Amendment 44 seeks to limit the conditions for special category data processing for this type of automated decision-making. Again, we feel that this is not needed given that a set of conditions already provides enhanced levels of protection for the processing of special category data, as set out in Article 9 of the UK GDPR. In order to lawfully process special category data, you must identify both a lawful basis under Article 6 of the UK GDPR and a separate condition for processing under Article 9. Furthermore, where an organisation seeks to process special category data under solely automated decision-making on the basis that it is necessary for contract, in addition to the Article 6 and Article 9 lawful bases, it would also have to demonstrate that the processing was necessary for substantial public interest.
Similarly, Amendment 45 seeks to apply safeguards when processing special category data; however, these are not needed as the safeguards in new Article 22C already apply to all forms of processing, including the processing of special category data, by providing sufficient safeguards for data subjects’ rights, freedoms and legitimate interests. As such, we do not believe that these amendments are necessary; I ask the noble Baroness, Lady Jones, not to press them.
Can the Minister give me an indication of the level at which that kicks in? For example, say there is a child in a classroom and a decision has been made about their ability in a particular subject. Is it automatic that the parent and the child get some sort of read-out on that? I would be curious to know where the Government feel that possibility starts.
In that example, where a child was subject to a solely ADM decision, the school would be required to inform the child of the decision and the reasons behind it. The child and their parent would have the right to seek a human review of the decision.
We may come on to this when we get to edtech but a lot of those decisions are happening automatically right now, without any kind of review. I am curious as to why it is on the school whereas the person actually doing the processing may well be a technology company.
It may be either the controller or the processor but for any legal or similarly significant decision right now—today—there is a requirement before the Bill comes into effect. That requirement is retained by the Bill.
In line with ICO guidance, children need particular protection when organisations collect and process their personal data because they may be less aware of the risks involved. If organisations process children’s personal data they should think about the need to protect them from the outset and should design their systems and processes with this in mind. This is the case for organisations processing children’s data during solely automated decision-making, just as it is for all processing of children’s data.
Building on this, the Government’s view is that automated decision-making has an important role to play in protecting children online, for example with online content moderation. The current provisions in the Bill will help online service providers understand how they can use these technologies and strike the right balance between enabling the best use of automated decision-making technology while continuing to protect the rights of data subjects, including children. As such, we do not believe that the amendment is necessary; I ask the noble Baroness if she would be willing not to press it.
Amendments 48 and 49 seek to extend the Article 22 provisions to “predominantly” and “partly” automated decision-making. These types of processing already involve meaningful human involvement. In such instances, other data protection requirements, including transparency and fairness, continue to apply and offer relevant protections. As such, we do not believe that these amendments are necessary; I ask the noble Baroness, Lady Jones, and the noble Lord, Lord Clement-Jones, if they would be willing not to press them.
Amendment 50 seeks to ensure that the Article 22C safeguards will apply alongside, rather than instead of, the transparency obligations in the UK GDPR. I assure the noble Baroness, Lady Jones, that the general transparency obligations in Articles 12 to 15 will continue to apply and thus will operate alongside the safeguards in the reformed Article 22. As such, we do not believe that this amendment is necessary; I ask the noble Baroness if she would be willing not to press it.
The changes proposed by Amendment 52A are unnecessary as Clause 50 already provides for an overarching requirement for the Secretary of State to consult the ICO and other persons that the Secretary of State considers appropriate before making regulations under the UK GDPR, including for the measures within Article 22. Also, any changes to the regulations are subject to the affirmative procedure so must be approved by both Houses of Parliament. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, we do not believe that this amendment is necessary and, if he were here, I would ask my noble friend Lord Holmes if he would be willing not to press it.
Amendments 98A and 104A relate to workplace rights. Existing data protection legislation and our proposed reforms provide sufficient safeguards for automated decision-making where personal data is being processed, including in workplaces. The UK’s human rights law, and existing employment and equality laws, also ensure that employees are informed and consulted about any workplace developments, which means that surveillance of employees is regulated. As such, we do not believe that these amendments are necessary and I ask the noble Baroness not to move them.
I hear what the Minister said about the workplace algorithmic assessment. However, if the Government believe it is right to have something like an algorithmic recording standard in the public sector, why is it not appropriate to have something equivalent in the private sector?
I would not say that it is not right but, if we want to make the ATRS a standard, we should make it a standard in the public sector first and then allow it to be taken up by private organisations using ADM and AI as a means of meeting the transparency principles that they are required to follow.
So would the Minister not be averse to it? It is merely that the public sector would be ahead of the game, allowing it to show the way, and then there might be a little regulation for the private sector.
I am not philosophically averse to such regulation. As to implementing it in the immediate future, however, I have my doubts about that possibility.
My Lords, this has been an interesting and challenging session. I hope that we have given the Minister and his team plenty to think about—I am sure we have. A lot of questions remain unanswered, and although the Committee Room is not full this afternoon, I am sure that colleagues reading the debate will be studying the responses that we have received very carefully.
I am grateful to the noble Baroness, Lady Kidron, for her persuasive support. I am also grateful to the noble Lord, Lord Clement-Jones, for his support for our amendments. It is a shame the noble Lord, Lord Holmes, was not here this afternoon, but I am sure we will hear persuasively from him on his amendment later in Committee.
The Minister is to be congratulated for his consistency. I think I heard the phrase “not needed” or “not necessary” pretty constantly this afternoon, but particularly with this group of amendments. He probably topped the lot with his response on the Equality Act on Amendment 41.
I want to go away with my colleagues to study the responses to the amendments very carefully. That being said, however, I am happy to withdraw Amendment 41 at this stage.
(8 months, 3 weeks ago)
Lords Chamber
I join others in thanking my noble friend Lord Holmes for bringing forward this Bill. I thank all noble Lords who have taken part in this absolutely fascinating debate of the highest standard. We have covered a wide range of topics today. I will do my best to respond, hopefully directly, to as many points as possible, given the time available.
The Government recognise the intent of the Bill and the differing views on how we should go about regulating artificial intelligence. For reasons I will now set out, the Government would like to express reservations about my noble friend’s Bill.
First, with the publication of our AI White Paper in March 2023, we set out proposals for a regulatory framework that is proportionate, adaptable and pro-innovation. Rather than designing a new regulatory system from scratch, the White Paper proposed five cross-sectoral principles, which include safety, transparency and fairness, for our existing regulators to apply within their remits. The principles-based approach will enable regulators to keep pace with the rapid technological change of AI.
The strength of this approach is that regulators can act now on AI within their own remits. This common-sense, pragmatic approach has won endorsement from leading voices across civil society, academia and business, as well as many of the companies right at the cutting edge of frontier AI development. Last month we published an update through the Government’s response to the consultation on the AI White Paper. The White Paper response outlines a range of measures to support existing regulators to deliver against the AI regulatory framework. This includes providing further support to regulators to deliver the regulatory framework through a boost of more than £100 million to upskill regulators and help unlock new AI research and innovation.
As part of this, we announced a £10 million package to jump-start regulators’ AI capabilities, preparing and upskilling regulators to address the risks and to harness the opportunities of this defining technology. It also includes publishing new guidance to support the coherent implementation of the principles. To ensure robust implementation of the framework, we will continue our work to establish the central function.
Let me reassure noble Lords that the Government take mitigating AI risks extremely seriously. That is why several aspects of the central function have already been established, such as the central AI risk function, which will shortly be consulting on its cross-economy AI risk register. Let me reassure the noble Lord, Lord Empey, that the AI risk function will maintain a holistic view of risks across the AI ecosystem, including misuse risks, such as where AI capabilities may be leveraged to undermine cybersecurity.
Specifically on criminality, the Government recognise that the use of AI in criminal activity is a very important issue. We are working with a range of stakeholders, including regulators and legal experts, to explore the ways in which liability, including criminal liability, is currently allocated through the AI value chain.
In the coming months we will set up a new steering committee, which will support and guide the activities of a formal regulator co-ordination structure within government. We also wrote to key regulators, requesting that they publish their AI plans by 30 April, setting out how they are considering, preparing for and addressing AI risks and opportunities in their domain.
As for the next steps for ongoing policy development, we are developing our thinking on the regulation of highly capable general-purpose models. Our White Paper consultation response sets out key policy questions related to possible future binding measures, which we are exploring with experts and our international partners. We plan to publish findings from this expert engagement and an update on our thinking later this year.
We also confirmed in the White Paper response that we believe legislative action will be required in every country once the understanding of risks from the most capable AI systems has matured. However, legislating too soon could easily result in measures that are ineffective against the risks, are disproportionate or quickly become out of date.
Finally, we make clear that our approach is adaptable and iterative. We will continue to work collaboratively with the US, the EU and others across the international landscape, both to influence and to learn from international developments.
I turn to the key proposals in the Bill that my noble friend has tabled. On the proposal to establish a new AI authority, it is crucial that we put in place agile and effective mechanisms that will support the coherent and consistent implementation of the AI regulatory framework and principles. We believe that a non-statutory central function is the most appropriate and proportionate mechanism for delivering this at present, as we observe a period of non-statutory implementation across our regulators and conduct our review of regulator powers and remits.
In the longer term, we recognise that there may be a case for reviewing how and where the central function has delivered, once its functions have become more clearly defined and established, including whether the function is housed within central government or in a different form. However, the Government feel that this would not be appropriate for the first stage of implementation. To that end, as I mentioned earlier, we are delivering the central function within DSIT, to bring coherence to the regulatory framework. The work of the central function will provide clarity and ensure that the framework is working as intended and that joined-up and proportionate action can be taken if there are gaps in our approach.
We recognise the need to assess the existing powers and remits of the UK’s regulators to ensure they are equipped to address AI risks and opportunities in their domains and to implement the principles consistently and comprehensively. We anticipate having to introduce a statutory duty on regulators requiring them to have due regard to the principles after an initial period of non-statutory implementation. For now, however, we want to test and iterate our approach. We believe this approach offers critical adaptability, but we will keep it under review; for example, by assessing the updates on strategic approaches to AI that several key regulators will publish by the end of April. We will also work with government departments and regulators to analyse and review potential gaps in existing regulatory powers and remits.
Like many noble Lords, we see approaches such as regulatory sandboxes as a crucial way of helping businesses navigate the AI regulatory landscape. That is why we have funded the four regulators in the Digital Regulation Cooperation Forum to pilot a new, multiagency advisory service known as the AI and digital hub. We expect the hub to launch in mid-May and will provide further details in the coming weeks on when this service will be open for applications from innovators.
One of the principles at the heart of the AI regulatory framework is accountability and governance. We said in the White Paper that a key part of implementation of this principle is to ensure effective oversight of the design and use of AI systems. We have recognised that additional binding measures may be required for developers of the most capable AI systems and that such measures could include requirements related to accountability. However, it would be too soon to mandate measures such as AI-responsible officers, even for these most capable systems, until we understand more about the risks and the effectiveness of potential mitigations. This could quickly become burdensome in a way that is disproportionate to risk for most uses of AI.
Let me reassure my noble friend Lord Holmes that we continue to work across government to ensure that we are ready to respond to the risks to democracy posed by deep fakes; for example, through the Defending Democracy Taskforce, as well as through existing criminal offences that protect our democratic processes. However, we should remember that AI labelling and identification technology is still at an early stage. No specific technology has yet been proven to be both technically and organisationally feasible at scale. It would not be right to mandate labelling in law until the potential benefits and risks are better understood.
Noble Lords raised the protection of intellectual property, a profoundly important subject. In the AI White Paper consultation response, the Government committed to provide an update on their approach to AI and copyright issues soon. I am confident that, when we do so, it will address many of the issues that noble Lords have raised today.
In summary, our approach, combining a principles-based framework, international leadership and voluntary measures for developers, is right for today, as it allows us to keep pace with rapid and uncertain advances in AI. The UK has successfully positioned itself as a global leader on AI, in recognition of the fact that AI knows no borders and that its complexity demands nuanced international governance. In addition to spearheading thought leadership through the AI Safety Summit, the UK has supported effective action through the G7, the Council of Europe, the OECD, the Global Partnership on AI, the G20 and the UN, among other bodies. We look forward to continuing to engage with all noble Lords on these critical issues as we continue to develop our regulatory approach.
(8 months, 4 weeks ago)
Grand Committee
As I was saying, it is important for the framework on data protection that we take a precautionary approach. I hope that the Minister will this afternoon be able to provide a plain English explanation of the changes, as well as giving us an assurance that those changes to definitions do not result in watering down the current legislation.
We broadly support Amendments 1 and 5 and the clause stand part notice, in the sense that they provide additional probing of the Government’s intentions in this area. We can see that the noble Lord, Lord Clement-Jones, is trying with Amendment 1 to bring some much-needed clarity to the anonymisation issue and, with Amendment 5, to secure that data remains personal data in any event. I suspect that the Minister will tell us this afternoon that that is already the case, but a significant number of commentators have questioned this, since the definition of “personal data” is seemingly moving away from the EU GDPR standard towards a definition that is more subjective from the perspective of the controller, processor or recipient. We must be confident that the new definition does not narrow the circumstances in which the information is protected as personal data. That will be an important standard for this Committee to understand.
Amendment 288, tabled by the noble Lord, Lord Clement- Jones, seeks a review and an impact assessment of the anonymisation and identifiability of data subjects. Examining that in the light of the EU GDPR seems to us to be a useful and novel way of making a judgment over which regime better suits and serves data subjects.
We will listen with interest to the Minister’s response. We want to be more than reassured that the previous high standards and fundamental principles of data protection will not be undermined and compromised.
I thank all noble Lords who have spoken in this brief, interrupted but none the less interesting opening debate. I will speak to the amendments tabled by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones; I note that I plan to use that form of words quite a lot in the next eight sessions on this Bill. I thank them for tabling these amendments so that we can debate what are, in the Government's view, the significant benefits of Clause 1.
In response to the points from the noble Lord, Lord Clement-Jones, on the appetite for the reforms in the Bill, we take very seriously the criticisms of the parties that he mentioned—the civil society groups—but it is important to note that, when the Government consulted on these reforms, we received almost 3,000 responses. At that time, we proposed to clarify when data would be regarded as anonymous and proposed legislating to confirm that the test for whether anonymous data can be reidentified is relative to the means available to the controller to reidentify the data. The majority of respondents agreed that greater clarity in legislation would indeed be beneficial.
As noble Lords will know, the UK’s data protection legislation applies only to personal data, which is data relating to an identified or identifiable living individual. It does not apply to non-personal, anonymous data. This is important because, if organisations can be sure that the data they are handling is anonymous, they may be able to more confidently put it to good use in important activities such as research and product development. The current data protection legislation is already clear that a person can be identified in a number of ways by reference to details such as names, identification numbers, location data and online identifiers, or via information about a person’s physical, genetic, mental, economic or cultural characteristics. The Bill does not change the existing legislation in this respect.
With regard to genetic information, which was raised by my noble friend Lord Kamall and the noble Lord, Lord Davies, any information that includes enough genetic markers to be unique to an individual is personal data and special category genetic data, even if names and other identifiers have been removed. This means that it is subject to the additional protections set out in Article 9 of the UK GDPR. The Bill does not change this position.
However, the existing legislation is unclear about the specific factors that a data controller must consider when assessing whether any of this information relates to an identifiable living person. This uncertainty is leading to inconsistent application of anonymisation and to anonymous data being treated as personal data out of an abundance of caution. This, in turn, reduces the opportunities for anonymous data to be used effectively for projects in the public interest. It is this difficulty that Clause 1 seeks to address by providing a comprehensive statutory test on identifiability. The test will require data controllers and processors to consider the likelihood of people within or outside their organisations reidentifying individuals using reasonable means. It is drawn from recital 26 of the EU GDPR and should therefore not be completely unfamiliar to most organisations.
I turn now to the specific amendments that have been tabled in relation to this clause. Amendment 1 in the name of the noble Lord, Lord Clement-Jones, would reiterate the position currently set out in the UK GDPR and its recitals: where individuals can be identified without the use of additional information because data controllers fail to put in place appropriate organisational measures, such as technical or contractual safeguards prohibiting reidentification, they would be considered directly identifiable. Technical and organisational measures put in place by organisations are factors that should be considered alongside others under new Section 3A of the Data Protection Act when assessing whether an individual is identifiable from the data being processed. Clause 1 sets out the threshold at which data is identifiable, and is therefore personal data, and clarifies when data is anonymous.
On the technical capabilities of a given data controller, these are already relevant factors under current law and ICO guidance in determining whether data is personal. This means that the test of identifiability is already a relative one today in respect of the data controller, the data concerned and the purpose of the processing. However, the intention of the data controller is not a relevant factor under current law, and nor does Clause 1 make it a factor. Clause 1 merely clarifies the position under existing law and follows very closely the wording of recital 26. Let me state this clearly: nothing in Clause 1 introduces the subjective intention of the data controller as a relevant factor in determining identifiability, and the position will remain the same as under the current law and as set out in ICO guidance.
In response to the points made by the noble Lord, Lord Clement-Jones, and others on pseudonymised personal data, noble Lords may be aware that the definition of personal data in Article 4(1) of the UK GDPR, when read in conjunction with the definition of pseudonymisation in Article 4(5), makes it clear that pseudonymised data is personal data, not anonymous data, and is thus covered by the UK’s data protection regime. I hope noble Lords are reassured by that. I also hope that, for the time being, the noble Lord, Lord Clement-Jones, will agree to withdraw his amendment and not press the related Amendment 5, which seeks to make it clear that pseudonymised data is personal data.
Amendment 4 would require the Secretary of State, two months after the Bill's passing, to assess the difference in meaning and scope between the current statutory definition of personal data and the new statutory definition that the Bill will introduce. Similarly, Amendment 288 seeks to review the impact of Clause 1 six months after the enactment of the Bill. The Government feel that neither of these amendments is necessary as the clause is drawn from recital 26 of the EU GDPR and case law and, as I have already set out, is not seeking to substantially change the definition of personal data. Rather, it is seeking to provide clarity in legislation.
I follow the argument, but what we are suggesting in our amendment is some sort of impact assessment for the scheme, including how it currently operates and how the Government wish it to operate under the new legislation. Have the Government undertaken a desktop exercise or any sort of review of how the two pieces of legislation might operate? Has any assessment of that been made? If they have done so, what have they found?
Obviously, the Bill has been in preparation for some time. I completely understand the point, which is about how we can be so confident in these claims. I suggest that I work with the Bill team to get an answer to that question and write to Members of the Committee, because it is a perfectly fair question to ask what makes us so sure.
In the future tense, I can assure noble Lords that the Department for Science, Innovation and Technology will monitor and evaluate the impact of this Bill as a whole in the years to come, in line with cross-government evaluation guidance and through continued engagement with stakeholders.
The Government feel that the first limb of Amendment 5 is not necessary given that, as has been noted, pseudonymised data is already considered personal data under this Bill. In relation to the second limb of the amendment, if the data being processed is actually personal data, the ICO already has powers to require organisations to address non-compliance. These include requiring it to apply appropriate protections to personal data that it is processing, and are backed up by robust enforcement mechanisms.
That said, it would not be appropriate for the processing of data that was correctly assessed as anonymous at the time of processing to retrospectively be treated as processing of personal data and subject to data protection laws, simply because it became personal data at a later point in the processing due to a change in circumstances. That would make it extremely difficult for any organisation to treat any dataset as anonymous and would undermine the aim of the clause, significantly reducing the potential to use anonymous data for important research and development activities.
My Lords, we on the Labour Benches have become co-signatories to the amendments tabled by the noble Baroness, Lady Kidron, and supported by the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Harding. The noble Baroness set out very clearly and expertly the overarching purpose of retaining the level of protection currently afforded by the Data Protection Act 2018. Amendments 2 and 3 specifically stipulate that, where data controllers know, or should reasonably know, that a user is a child, that user should be given the data protection codified in that Act. Amendment 9 takes it a stage further: it includes children's data in the definition of sensitive personal data and gives it the benefit of a heightened level of protection—quite rightly, too. Finally, Amendment 290—the favourite of the noble Lord, Lord Clement-Jones—attempts to hold Ministers to the commitment made by Paul Scully in the Commons to maintain existing standards of data protection carried over from that 2018 Act.
Why is all this necessary? I suspect that the Minister will argue that it is not needed because Clause 5 already provides for the Secretary of State to consider the impact of any changes to the rights and freedoms of individuals and, in particular, of children, who require special protection.
We disagree with that argument. In the interests of brevity and the spirit of the recent Procedure Committee report, which says that we should not repeat each other’s arguments, I do not intend to speak at length, but we have a principal concern: to try to understand why the Government want to depart from the standards of protection set out in the age-appropriate design code—the international gold standard—which they so enthusiastically signed up to just five or six years ago. Given the rising levels of parental concern over harmful online content and well-known cases highlighting the harms that can flow from unregulated material, why do the Government consider it safe to water down the regulatory standards at this precise moment in time? The noble Baroness, Lady Kidron, valuably highlighted the impact of the current regulatory framework on companies’ behaviour. That is exactly what legislation is designed to do: to change how we look at things and how we work. Why change that? As she has argued very persuasively, it is and has been hugely transformative. Why throw away that benefit now?
My attention was drawn to one example of what can happen by a briefing note from the 5Rights Foundation. As it argued, children are uniquely vulnerable to harm and risk online. I thought its set of statistics was really interesting. By the age of 13, 72 million data points have already been collected about a child. They are often not used in children's best interests; for example, the data is often used to feed recommender systems and algorithms that are designed to keep attention at all costs and have been found to push harmful content at children.
When this happens repeatedly over time, it can have catastrophic consequences, as we know. The coroner in the Molly Russell inquest found that she had been recommended a stream of depressive content by algorithms, leading the coroner to rule that she
“died from an act of self-harm whilst suffering from depression and the negative effects of online content”.
We do not want more Molly Russell cases. Progress has already been made in this field; we should consider dispensing with it at our peril. Can the Minister explain today the thinking and logic behind the changes that the Government have brought forward? Can he estimate the impact that the new lighter-touch regime, as we see it, will have on child protection? Have the Government consulted extensively with those in the sector who are properly concerned about child protection issues, and what sort of responses have the Government received?
Finally, why have the Government decided to take a risk with the sound framework that was already in place and built on during the course of the Online Safety Act? We need to hear very clearly from the Minister how they intend to engage with groups that are concerned about these child protection issues, given the apparent loosening of the current framework. The noble Baroness, Lady Harding, said that this is hard-fought ground; we intend to continue making it so because these protections are of great value to our society.
I am grateful to the noble Baroness, Lady Kidron, for her Amendments 2, 3, 9 and 290 and to all noble Lords who have spoken, as ever, so clearly on these points.
All these amendments seek to add protections for children to various provisions in the Bill. I absolutely recognise the intent behind them; indeed, let me take this opportunity to say that the Government take child safety deeply seriously and agree with the noble Baroness that all organisations must take great care, both when making decisions about the use of children’s data and throughout the duration of their processing activities. That said, I respectfully submit that these amendments are not necessary for three main reasons; I will talk in more general terms before I come to the specifics of the amendments.
First, the Bill maintains a high standard of data protection for everybody in the UK, including—of course—children. The Government are not removing any of the existing data protection principles in relation to lawfulness, fairness, transparency, purpose limitation, data minimisation, storage limitation, accuracy, data security or accountability; nor are they removing the provisions in the UK GDPR that require organisations to build privacy into the design and development of new processing activities.
The existing legislation acknowledges that children require specific protection for their personal data, as they may be less aware of the risks, consequences and safeguards concerned, and of their rights in relation to the processing of personal data. Organisations will need to make sure that they continue to comply with the data protection principles on children’s data and follow the ICO’s guidance on children and the UK GDPR, following the changes we make in the Bill. Organisations that provide internet services likely to be accessed by children will need to continue to comply with their transparency and fairness obligations and the ICO’s age-appropriate design code. The Government welcome the AADC, as Minister Scully said, and remain fully committed to the high standards of protection that it sets out for children.
Secondly, some of the provisions in the Bill have been designed specifically with the rights and safety of children in mind. For example, one reason that the Government introduced the new lawful ground of recognised legitimate interest in Clause 5, which we will debate later, was that some consultation respondents said that the current legislation can deter organisations, particularly in the voluntary sector, from sharing information that might help to prevent crime or protect children from harm. The same goes for the list of exemptions to the purpose limitation principle introduced by Clause 6.
There could be many instances where personal data collected for one purpose may have to be reused to protect children from crime or safeguarding risks. The Bill will provide greater clarity around this and has been welcomed by stakeholders, including in the voluntary sector.
While some provisions in the Bill do not specifically mention children or children’s rights, data controllers will still need to carefully consider the impact of their processing activities on children. For example, the new obligations on risk assessments, record keeping and the designation of senior responsible individuals will apply whenever an organisation’s processing activities are likely to result in high risks to people, including children.
Thirdly, the changes we are making in the Bill must be viewed in a wider context. Taken together, the UK GDPR, the Data Protection Act 2018 and the Online Safety Act 2023 provide a comprehensive legal framework for keeping children safe online. Although the data protection legislation and the age-appropriate design code make it clear how personal data can be processed, the Online Safety Act makes clear that companies must take steps to make their platforms safe by design. It requires social media companies to protect children from illegal, harmful and age-inappropriate content, to ensure they are more transparent about the risks and dangers posed to children on their sites, and to provide parents and children with clear and accessible ways to report problems online when they do arise.
After those general remarks, I turn to the specific amendments. The noble Baroness’s Amendments 2 and 3 would amend Clause 1 of the Bill, which relates to the test for assessing whether data is personal or anonymous. Her explanatory statement suggests that these amendments are aimed at placing a duty on organisations to determine whether the data they are processing relates to children, thereby creating a system of age verification. However, requiring data controllers to carry out widespread age verification of data subjects could create its own data protection and privacy risks, as it would require them to retain additional personal information such as dates of birth.
The test we have set out for reidentification is intended to apply to adults and children alike. If any person is likely to be identified from the data using reasonable means, the data protection legislation will apply. Introducing one test for adults and one for children is unlikely to be workable in practice and would fundamentally undermine the clarity that this clause seeks to bring to organisations. Whether a person is identifiable will depend on a number of objective factors, such as the resources and technology available to organisations, regardless of whether that person is an adult or a child. Creating wholly separate tests for adults and children, as set out in the amendment, would add unnecessary complexity to the clause and potentially lead to confusion.
As I understand it, the basis on which we currently operate is that children get a heightened level of protection. Is the Minister saying that that is now unnecessary and is captured by the way in which the legislation has been reframed?
I am saying, specifically on Clause 1, that separating the identifiability of children and the identifiability of adults would be detrimental to both but particularly, in this instance, to children.
Amendment 9 would ensure that children's data is included in the definition of special category data and is subject to the heightened protections afforded to this category of data by Article 9 of the UK GDPR. This could have unintended consequences, because the legal position would be that processing of children's data would be banned unless specifically permitted. This could create the need for considerable additional legislation to exempt routine and important processing from the ban; for example, banning a Girl Guides group from keeping a list of members unless specifically exempted would be disproportionate. Meanwhile, more sensitive data, such as records relating to children's health or safeguarding concerns, would already be subject to the heightened protections of the UK GDPR as soon as that data is processed.
I am grateful to the noble Baroness, Lady Kidron, for raising these issues and for the chance to set out why the Government feel that children’s protection is at least maintained, if not enhanced. I hope my answers have, for the time being, persuaded her of the Government’s view that the Bill does not reduce standards of protection for children’s data. On that basis, I ask her also not to move her Amendment 290 on the grounds that a further overarching statement on this is unnecessary and may cause confusion when interpreting the legislation. For all the reasons stated above, I hope that she will now reconsider whether her amendments in this group are necessary and agree not to press them.
Can I press the Minister more on Amendment 290 from the noble Baroness, Lady Kidron? All it does is seek to maintain the existing standards of data protection for children, as carried over from the 2018 Act. If that is all it does, what is the problem with that proposed new clause? In its current formulation, does it not put the intention of the legislation in a place of certainty? I do not quite get why it would be damaging.
I believe it restates what the Government feel is clearly implied or stated throughout the Bill: that children’s safety is paramount. Therefore, putting it there is either duplicative or confusing; it reduces the clarity of the Bill. In no way is this to say that children are not protected—far from it. The Government feel it would diminish the clarity and overall cohesiveness of the Bill to include it.
My Lords, not to put too fine a point on it, the Minister is saying that nothing in the Bill diminishes children’s rights, whether in Clause 1, Clause 6 or the legitimate interest in Clause 5. He is saying that absolutely nothing in the Bill diminishes children’s rights in any way. Is that his position?
Can I add to that question? Is my noble friend the Minister also saying that there is no risk of companies misinterpreting the Bill’s intentions and assuming that this might be some form of diminution of the protections for children?
In answer to both questions, what I am saying is that, first, any risk of misinterpreting the Bill with respect to children’s safety is diminished, rather than increased, by the Bill. Overall, it is the Government’s belief and intention that the Bill in no way diminishes the safety or privacy of children online. Needless to say, if over the course of our deliberations the Committee identifies areas of the Bill where that is not the case, we will absolutely be open to listening on that, but let me state this clearly: the intent is to at least maintain, if not enhance, the safety and privacy of children and their data.
My Lords, that creates another question, does it not? If that is the case, why amend the original wording from the 2018 Act?
Sorry, the 2018 Act? Or is the noble Lord referring to the amendments?
Why change the wording that provides the protection that is there currently?
Okay. The Government feel that, in terms of the efficient and effective drafting of the Bill, such a paragraph would diminish its clarity by being duplicative rather than add to it by making a declaration. For the same reason, we have chosen not to make a series of declarations about other intentions of the Bill overall, in the belief that the Bill's intent and outcome are protected without such a statement.
My Lords, before our break, the noble Baroness, Lady Harding, said that this is hard-fought ground; I hope the Minister understands from the number of questions he has just received during his response that it will continue to be hard-fought ground.
I really regret having to say this at such an early stage on the Bill, but I think that some of what the Minister said was quite disingenuous. We will get to it in other parts of the Bill, but the thing that we have all agreed to disagree on at this point is the statement that the Bill maintains data privacy for everyone in the UK. That is a point of contention between noble Lords and the Minister. I absolutely accept and understand that we will come to a collective view on it in Committee. However, the Minister appeared to suggest—I ask him to correct me if I have got this wrong—that the changes on legitimate interest and purpose limitation are child safety measures because some people are saying that they are deterred from sharing data for child protection reasons. I have to tell him that they are not couched or formed like that; they are general-purpose shifts. There is absolutely no question but that the Government could have made specific changes for child protection, put them in the Bill and made them absolutely clear. I find that very worrying.
I also find it worrying, I am afraid—this is perhaps where we are heading and the thing that many organisations are worried about—that bundling the AADC in with the Online Safety Act and saying, "I've got it over here so you don't need it over there" is not the same as maintaining a high level of data protection for children. It is not the same set of things. I specifically said that this was not an age-verification measure and would not require it; whatever response there was on that was therefore unnecessary because I made that quite clear in my remarks. The Committee can understand that, in order to set a high bar of data protection, you must either identify a child or give that protection to everyone. Those are your choices. You do not have to verify.
I will withdraw the amendment, but I must say that the Government may not have it both ways. The Bill cannot be different or necessary and at the same time do nothing. The piece that I want to leave with the Committee is that it is the underlying provisions that allow the ICO to take action on the age-appropriate design code. It does not matter what is in the code; if the underlying provisions change, so does the code. During Committee, I expect that there will be a report on the changes that have happened all around the world as a result of the code, and we will be able to measure whether the new Bill would be able to create those same changes. With that, I beg leave to withdraw my amendment.
My Lords, I am grateful to all noble Lords who have spoken on this group. Amendment 6 to Clause 2, tabled by the noble Lord, Lord Clement-Jones, rightly tests the boundaries on the use of personal data for scientific research and, as he says, begins to ask, “What is the real purpose of this clause? Is it the clarification of existing good practice or is it something new? Do we fully understand what that new proposition is?”
As he said, there is particular public concern about the use of personal health data where it seems that some private companies are stretching the interpretation of “the public good”, for which authorisation for the use of this data was initially freely given, to something much wider. Although the clause seeks to provide some reassurance on this, we question whether it goes far enough and whether there are sufficient protections against the misuse of personal health data in the way the clause is worded.
This raises the question of whether it is only public health research that needs to be in the public interest, which is the way the clause is worded at the moment, because it could equally apply to research using personal data from other public services, such as measuring educational outcomes or accessing social housing. There is a range of uses for personal data. In an earlier debate, we heard about the plethora of data already held on people, much of which individuals do not understand or know about and which could be used for research or to make judgments about them. So we need to be sensitive about the way this might be used. It would be helpful to hear from the Minister why public health research has been singled out for special attention when, arguably, it should be a wider right across the board.
Noble Lords have asked questions about the wider concerns around Clause 2, which could enable private companies to use personal data to develop new products for commercial benefit without needing to inform the data subjects. As noble Lords have said, this is not what people would normally expect to be described as “scientific research”. The noble Baroness, Lady Kidron, was quite right that it has the potential to be unethical, so we need some standards and some clear understanding of what we mean by “scientific research”.
That is particularly important for Amendments 7 and 132 to 134 in the name of the noble Lord, Lord Clement-Jones, which underline the need for data subjects to be empowered and given the opportunity to object to their data being used for a new purpose. Arguably, without these extra guarantees—particularly because there is a lack of trust about how a lot of this information is being used—data subjects will be increasingly reluctant to hand over personal data on a voluntary basis in the first place. It may well be that this is an area where the Information Commissioner needs to provide additional advice and guidance to ensure that we can reap the benefits of good-quality scientific research that is in the public interest and in which the citizens involved can have absolute trust. Noble Lords around the Room have stressed that point.
Finally, we have added our names to the amendments tabled by the noble Baroness, Lady Kidron, on the use of children's data for scientific research. As she rightly points out, the 2018 Act gave children a higher standard of protection on the uses for which their data is collected and processed. It is vital that this Bill, for all its intentions to simplify and water down preceding rights, does not accidentally put at risk the higher protection agreed for children. In the earlier debate, the Minister said that he believed it will not do so. I am not sure that "believe" is a strong enough word here; we need guarantees that go beyond that. I think that this is an issue we will come back to again and again in terms of what is in the Bill and what guarantees exist for that protection.
In particular, there is a concern that relaxing the legal basis on which personal data can be processed for scientific research, including privately funded research carried out by commercial entities, could open the door for children’s data to be exploited for commercial purposes. We will consider the use of children’s data collected in schools in our debate on a separate group but we clearly need to ensure that the handling of pupils’ data by the Department for Education and the use of educational apps by private companies do not lead to a generation of exploited children who are vulnerable to direct marketing and manipulative messaging. The noble Baroness’s amendments are really important in this regard.
I also think that the noble Baroness’s Amendment 145 is a useful initiative to establish a code of practice on children’s data and scientific research. It would give us an opportunity to balance the best advantages of children’s research, which is clearly in the public and personal interest, with the maintenance of the highest level of protection from exploitation.
I hope that the Minister can see the sense in these amendments. In particular, I hope that he will take forward the noble Baroness’s proposals and agree to work with us on the code of practice principles and to put something like that in the Bill. I look forward to his response.
I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones, for this series of amendments.
I will first address Amendment 6, which seeks to amend Clause 2. As the noble Lord said, the definitions created by Clause 2, including “scientific research purposes”, are based on the current wording in recital 159 to the UK GDPR. We are changing not the scope of these definitions but their legal status. This amendment would require individual researchers to assess whether their research should be considered to be in the public interest, which could create uncertainty in the sector and discourage research. This would be more restrictive than the current position and would undermine the Government’s objectives to facilitate scientific research and empower researchers.
We have maintained a flexible scope as to what is covered by “scientific research” while ensuring that the definition is still sufficiently narrow in that it can cover only what would reasonably be seen as scientific research. This is because the legislation needs to be able to adapt to the emergence of new areas of innovative research. Therefore, the Government feel that it is more appropriate for the regulator to add more nuance and context to the definition. This includes the types of processing that are considered—
I am sorry to interrupt but it may give the Box a chance to give the Minister a note on this. Is the Minister saying that recital 159 includes the word “commercial”?
I am afraid I do not have an eidetic memory of recital 159, but I would be happy to—
That is precisely why I ask this question in the middle of the Minister’s speech to give the Box a chance to respond, I hope.
Researchers must also comply with the required safeguards to protect individuals’ privacy. All organisations conducting scientific research, including those with commercial interests, must also meet all the safeguards for research laid out in the UK GDPR and comply with the legislation’s core principles, such as fairness and transparency. Clause 26 sets out several safeguards that research organisations must comply with when processing personal data for research purposes. The ICO will update its non-statutory guidance to reflect many of the changes introduced by this Bill.
Scientific research currently holds a privileged place in the data protection framework because, by its nature, it is already viewed as generally being in the public interest. As has been observed, the Bill already applies a public interest test to processing for the purpose of public health studies in order to provide greater assurance for research that is particularly sensitive. Again, this reflects recital 159.
In response to the noble Baroness, Lady Jones, on why public health research is being singled out: as she stated, this part of the legislation simply adds an additional safeguard to studies into public health, ensuring that they must be in the public interest. This does not limit the scope for other research unrelated to public health. Studies in the area of public health will usually be in the public interest. For the rare, exceptional times that a study is not, this requirement provides an additional safeguard to help prevent misuse of the various exemptions and privileges for researchers in the UK GDPR. "Public interest" is not defined in the legislation, so the controller needs to make a case-by-case assessment based on its purposes.
On the point made by the noble Lord, Lord Clement-Jones, about recitals and ICO guidance, although we of course respect and welcome ICO guidance, it does not have legislative effect and does not provide the certainty that legislation does. That is why we have done so via this Bill.
Amendment 7 to Clause 3 would undermine the broad consent concept for scientific research. Clause 3 places the existing concept of "broad consent", currently found in recital 33 to the UK GDPR, on a statutory footing with the intention of improving awareness and confidence for researchers. This clause applies only to scientific research processing that is reliant on consent. It already contains various safeguards. For example, broad consent can be used only where it is not possible to identify at the outset the full purposes for which personal data might be processed. Additionally, to give individuals greater agency, they will, where possible, have the option to consent to only part of the processing and can withdraw their consent at any time.
Clause 3 clarifies an existing concept of broad consent which outlines how the conditions for consent will be met in certain circumstances when processing for scientific research purposes. This will enable consent to be obtained for an area of scientific research when researchers cannot at the outset identify fully the purposes for which they are collecting the data. For example, the initial aim may be the study of cancer, but it later becomes the study of a particular cancer type.
Furthermore, as part of the reforms around the reuse of personal data, we have further clarified that when personal data is originally collected on the basis of consent, a controller would need to get fresh consent to reuse that data for a new purpose, unless a public interest exemption applies and it is unreasonable to expect the controller to obtain that consent. A controller cannot generally reuse personal data originally collected on the basis of consent for research purposes.
Turning to Amendments 132 and 133 to Clause 26, the general rule described in Article 13(3) of the UK GDPR is that controllers must inform data subjects about a change of purposes, which provides an opportunity to withdraw consent or object to the proposed processing where relevant. There are existing exceptions to the right to object, such as Article 21(6) of the UK GDPR, where processing is necessary for research in the public interest, and in Schedule 2 to the Data Protection Act 2018, when applying the right would prevent or seriously impair the research. Removing these exemptions could undermine life-saving research and compromise long-term studies so that they are not able to continue.
Regarding Amendment 134, new Article 84B of the UK GDPR already sets out the requirement that personal data should be anonymised for research, archiving and statistical—RAS—purposes unless doing so would mean the research could not be carried through. Anonymisation is not always possible as personal data can be at the heart of valuable research, archiving and statistical activities, for example, in genetic research for the monitoring of new treatments of diseases. That is why new Article 84C of the UK GDPR also sets out protective measures for personal data that is used for RAS purposes, such as ensuring respect for the principle of data minimisation through pseudonymisation.
The stand part notice in this group seeks to remove Clause 6 and, consequentially, Schedule 2. In the Government’s consultation on data reform, Data: A New Direction, we heard that the current provisions in the UK GDPR on personal data reuse are difficult for controllers and individuals to navigate. This has led to uncertainty about when controllers can reuse personal data, causing delays for researchers and obstructing innovation. Clause 6 and Schedule 2 address the existing uncertainty around reusing personal data by setting out clearly the conditions in which the reuse of personal data for a new purpose is permitted. Clause 6 and Schedule 2 must therefore remain to give controllers legal certainty and individuals greater transparency.
Amendment 22 seeks to remove the power to add to or vary the conditions set out in Schedule 2. These conditions currently constitute a list of specific public interest purposes, such as safeguarding vulnerable individuals, for which an organisation is permitted to reuse data without needing consent or to identify a specific law elsewhere in legislation. Since this list is strictly limited and exhaustive, a power is needed to ensure that it is kept up to date with future developments in how personal data is used for important public interest purposes.
I am interested that the safeguarding requirement is already in the Bill, so, in terms of children, which I believe the Minister is going to come to, the onward processing is not a question of safeguarding. Is that correct? As the Minister has just indicated, that is already a provision.
Just before we broke, I was on the verge of attempting to answer the question from the noble Baroness, Lady Kidron; I hope my coming words will do that, but she can intervene again if she needs to.
I turn to the amendments that concern the use of children’s data in research and reuse. Amendment 8 would also amend Clause 3; the noble Baroness suggests that the measure should not apply to children’s data, but this would potentially prevent children, or their parents or guardians, from agreeing to participate in broad areas of pioneering research that could have a positive impact on children, such as on the causes of childhood diseases.
On the point about safeguarding, the provisions on recognised legitimate interests and further processing are required for safeguarding children for compliance with, respectively, the lawfulness and purpose limitation principles. The purpose limitation provision in this clause is meant for situations where the original processing purpose was not safeguarding and the controller then realises that there is a need to further process it for safeguarding.
Research organisations are already required to comply with the data protection principles, including on fairness and transparency, so that research participants can make informed decisions about how their data is used; and, where consent is the lawful basis for processing, children, or their parents or guardians, are free to choose not to provide their consent, or, if they do consent, they can withdraw it at any time. In addition, the further safeguards that are set out in Clause 26, which I mentioned earlier, will protect all personal data, whether it relates to children or adults.
Amendment 21 would require data controllers to have specific regard to the fact that children's data requires a higher standard of protection when deciding whether reuse of their data is compatible with the original purpose for which it was collected. This is unnecessary because the situations in which personal data could be reused are limited to public interest purposes designed largely to protect the public and children, in so far as they are relevant to them. Controllers must also consider the possible consequences for data subjects and the relationship between the controller and the data subject. This includes taking into account that the data subject is a child, in addition to the need to consider the interests of children generally.
Amendment 23 seeks to limit use of the purpose limitation exemptions in Schedule 2 in relation to children’s data. This amendment is unnecessary because these provisions permit further processing only in a narrow range of circumstances and can be expanded only to serve important purposes of public interest. Furthermore, it may inadvertently be harmful to children. Current objectives include safeguarding children or vulnerable people, preventing crime or responding to emergencies. In seeking to limit the use of these provisions, there is a risk that the noble Baroness’s amendments might make data controllers more hesitant to reuse or disclose data for public interest purposes and undermine provisions in place to protect children. These amendments could also obstruct important research that could have a demonstrable positive impact on children, such as research into children’s diseases.
Amendment 145 would require the ICO to publish a statutory code on the use of children’s data in scientific research and technology development. Although the Government recognise the value that ICO codes can play in promoting good practice and improving compliance, we do not consider that it would be appropriate to add these provisions to the Bill without further detailed consultation with the ICO and the organisations likely to be affected by the new codes. Clause 33 of the Bill already includes a measure that would allow the Secretary of State to request the ICO to publish a code on any matter that it sees fit, so this is an issue that we could return to in the future if the evidence supports it.
I will read Hansard very carefully, because I am not sure that I absolutely followed the Minister, but we will undoubtedly come back to this. I will ask two questions. Earlier, before we had a break, in response to some of the early amendments in the name of the noble Lord, Lord Clement-Jones, the Minister suggested that several things were being taken out of the recital to give them solidity in the Bill; so I am using this opportunity to suggest that recital 38, which is the special consideration of children’s data, might usefully be treated in a similar way and that we could then have a schedule that is the age-appropriate design code in the Bill. Perhaps I can leave that with the Minister, and perhaps he can undertake to have some further consultation with the ICO on Amendment 145 specifically.
With respect to recital 38, that sounds like a really interesting idea. Yes, let us both have a look and see what the consultation involves and what the timing might look like. I confess to the Committee that I do not know what recital 38 says, off the top of my head. For the reasons I have set out, I am not able to accept these amendments. I hope that noble Lords will therefore not press them.
Returning to the questions by the noble Lord, Lord Clement-Jones, on the contents of recital 159: the current UK GDPR and EU GDPR are silent on the specific definition of scientific research. Neither precludes commercial organisations from performing scientific research; indeed, the ICO's own guidance on research and its interpretation of recital 159 already mention commercial activities. Scientific research can be done by commercial organisations—for example, much of the research done into vaccines, and the research into AI referenced by the noble Baroness, Lady Harding. The recital itself does not mention commercial research but, as the ICO's guidance is already clear on this, the Government feel that it is appropriate to put it on a statutory footing.
My Lords, that was intriguing. I thank the Minister for his response. It sounds as though, again, guidance would have been absolutely fine, but what is there not to like about the ICO bringing clarity? It was quite interesting that the Minister used the phrase “uncertainty in the sector” on numerous occasions and that is becoming a bit of a mantra as the Bill goes on. We cannot create uncertainty in the sector, so the poor old ICO has been labouring in the vineyard for the last few years to no purpose at all. Clearly there has been uncertainty in the sector of a major description, and all its guidance and all the work that it has put in over the years have been wholly fruitless, really. It is only this Government that have grabbed the agenda with this splendid 300-page data protection Bill that will clarify this for business. I do not know how much they will have to pay to get new compliance officers or whatever it happens to be, but the one thing that the Bill will absolutely not create is greater clarity.
I am a huge fan of making sure that we understand what the recitals have to say, and it is very interesting that the Minister is saying that the recital is silent but the ICO’s guidance is pretty clear on this. I am hugely attracted by the idea of including recital 38 in the Bill. It is another lightbulb moment from the noble Baroness, Lady Kidron, who has these moments, rather like with the age-appropriate design code, which was a huge one.
We are back to the concern, whether in the ICO guidance, the Bill or wherever, that scientific research needs to be in the public interest to qualify and not have all the consents that are normally required for the use of personal data. The Minister said, “Well, of course we think that scientific research is in the public interest; that is its very definition”. So why does only public health research need that public interest test and not the other aspects? Is it because, for instance, the opt-out was a bit of a disaster and 3 million people opted out of allowing their health data to be shared or accessed by GPs? Yes, it probably is.
Do the Government want a similar kind of disaster to happen, in which people get really excited about Meta or other commercial organisations getting hold of their data, a public outcry ensues and they therefore have to introduce a public interest test on that? What is sauce for the goose is sauce for the gander. I do not think that personal data should be treated in a particularly different way in terms of its public interest, just because it is in healthcare. I very much hope that the Minister will consider that.
My Lords, I am also pleased to support these amendments in the name of the noble Baroness, Lady Kidron, to which I have added my name. I am hugely enthusiastic about them, too, and think that this has been a lightbulb moment from the noble Baroness. I very much thank her for doing all of this background work because she has identified the current weakness in the data protection landscape: it is currently predicated on an arrangement between an individual and the organisation that holds their data.
That is an inherently unbalanced power construct. As the noble Baroness said, as tech companies become larger and more powerful, it is not surprising that many individuals feel overwhelmed by the task of questioning or challenging those that are processing their personal information. It assumes a degree of knowledge about their rights and a degree of digital literacy, which we know many people do not possess.
In the very good debate that we had on digital exclusion a few weeks ago, it was highlighted that around 2.4 million people are unable to complete a single basic task to get online, such as opening an internet browser, and that more than 5 million employed adults cannot complete essential digital work tasks. These individuals cannot be expected to access their digital data on their own; they need the safety of a larger group to do so. We need to protect the interests of an entire group that would otherwise be locked out of the system.
The noble Baroness referred to the example of Uber drivers who were helped by their trade union to access their data, sharing patterns of exploitation and subsequently strengthening their employment package, but this does not have to be about just union membership; it could be about the interests of a group of public sector service users who want to make sure that they are not being discriminated against, a community group that wants its bid for a local grant to be treated fairly, and so on. We can all imagine examples of where this would work in a group’s interest. As the noble Baroness said, these proposals would allow any group of people to assign their rights—rights that are more powerful together than apart.
There could be other benefits; if data controllers are concerned about the number of individual requests that they are receiving for data information—and a lot of this Bill is supposed to address that extra work—group requests, on behalf of a data community, could provide economies of scale and make the whole system more efficient.
Like the noble Baroness, I can see great advantages from this proposal; it could lay the foundation for other forms of data innovation and help to build trust with many citizens who currently see digitalisation as something to fear—this could allay those fears. Like the noble Lord, Lord Clement-Jones, I hope the Minister can provide some reassurance that the Government welcome this proposal, take it seriously and will be prepared to work with the noble Baroness and others to make it a reality, because there is the essence of a very good initiative here.
I thank the noble Baroness, Lady Kidron, for raising this interesting and compelling set of ideas. I turn first to Amendments 10 and 35 relating to data communities. The Government recognise that individuals need to have the appropriate tools and mechanisms to easily exercise their rights under the data protection legislation. It is worth pointing out that current legislation does not prevent data subjects authorising third parties to exercise certain rights. Article 80 of the UK GDPR also explicitly gives data subjects the right to appoint not-for-profit bodies to exercise certain rights, including their right to bring a complaint to the ICO, to appeal against a decision of the ICO or to bring legal proceedings against a controller or processor and the right to receive compensation.
The concept of data communities exercising certain data subject rights is closely linked with the wider concept of data intermediaries. The Government recognise the existing and potential benefits of data intermediaries and are committed to supporting them. However, given that data intermediaries are new, we need to be careful not to distort the sector at such an early stage of development. As in many areas of the economy, officials are in regular contact with businesses, and the data intermediary sector is no different. One such engagement is the DBT’s Smart Data Council, which includes a number of intermediary businesses that advise the Government on the direction of smart data policy. The Government would welcome further and continued engagement with intermediary businesses to inform how data policy is developed.
I am sorry, but the Minister used a pretty pejorative word: “distort” the sector. What does he have in mind?
I did not mean to be pejorative; I merely point out that before embarking on quite a far-reaching policy—as noble Lords have pointed out—we would not want to jump the gun prior to consultation and researching the area properly. I certainly do not wish to paint a negative portrait.
It is a moment at which I cannot set a firm date for a firm set of actions, but on the other hand I am not attempting to punt it into the long grass either. The Government do not want to introduce a prescriptive framework without assessing potential risks, strengthening the evidence base and assessing the appropriate regulatory response. For these reasons, I hope that for the time being the noble Baroness will not press these amendments.
The noble Baroness has also proposed Amendments 147 and 148 relating to the role of the Information Commissioner’s Office. Given my response just now to the wider proposals, these amendments are no longer necessary and would complicate the statute book. We note that Clause 35 already includes a measure that will allow the Secretary of State to request the Information Commissioner’s Office to publish a code on any matter that she or he sees fit, so this is an issue we could return to in future if such a code were deemed necessary.
My Lords, I am sorry to keep interrupting the Minister. Can he give us a bit of a picture of what he has in mind? He said that he did not want to distort things at the moment, that there were intermediaries out there and so on. That is all very well, but is he assuming that a market will be developed or is developing? What overview of this does he have? In a sense, we have a very clear proposition here, which the Government should respond to. I am assuming that this is not a question just of letting a thousand flowers bloom. What is the government policy towards this? If you look at the Hall-Pesenti review and read pretty much every government response—including to our AI Select Committee, where we talked about data trusts and picked up the Hall-Pesenti review recommendations—you see that the Government have been pretty much positive over time when they have talked about data trusts. The trouble is that they have not done anything.
Overall, as I say and as many have said in this brief debate, this is a potentially far-reaching and powerful idea with an enormous number of benefits. But the fact that it is far-reaching implies that we need to look at it further. I am afraid that I am not briefed on long-standing—
May I suggest that the Minister writes? On the one hand, he is saying that we will be distorting something—that something is happening out there—but, on the other hand, he is saying that he is not briefed on what is out there or what the intentions are. A letter unpacking all that would be enormously helpful.
I am very happy to write on this. I will just say that I am not briefed on previous government policy towards it, dating back many years before my time in the role.
It was even further. Yes, I am very happy to write on that. For the reasons I have set out, I am not able to accept these amendments for now. I therefore hope that the noble Baroness will withdraw her amendment.
(9 months ago)
Lords Chamber
To ask His Majesty’s Government what steps they are taking to promote the use of human-specific medical research techniques, such as “organ-on-a-chip” and computer modelling, in place of animal testing.
The Government provide significant funding for the development of these technologies through UKRI, primarily to the National Centre for the Replacement, Refinement and Reduction of Animals in Research. We are doubling our investment in this area to £20 million next year, and this summer the Government will publish a plan to accelerate the development, validation and uptake of methods to reduce reliance on the use of animals in science.
I thank the noble Lord the Minister for his Answer, but of course animal testing is not working well. Less than 6% of cancer drugs proceed past the first small phase 1 trials, and more than 99% of Alzheimer’s drugs have failed. There are some very exciting possibilities, such as the liver-on-a-chip device that correctly identified 87% of drugs that caused liver toxicity after they passed animal tests. Many other countries are racing ahead on this: the USA has passed the FDA Modernization Act, the Netherlands has a transition programme and India has new rules for drug trials. Do we not need to go much further and look towards legislative change and a much bigger injection of funds to see real progress if we are to be world-leading in the future in this biotechnology field?
That is a wide-ranging question, and I will do my best to cover some of those points. With respect to the effectiveness of clinical trials, on the whole they cannot take place without toxicology trials, and most of those, sadly, have to be done on animals. We very much welcome any technology that allows for in silico methods of assessing toxicology, and it is true that more of those are emerging, but they have to be validated before they can be deemed safe and usable in clinical trials.
My Lords, the Government produced a previous road map for non-animal technologies, from six UK government funders including the MRC, EPSRC and Innovate UK, way back in 2015. How will they ensure that this new road map does not get left on the shelf again? Will DSIT set up an independent strategic advisory board with the key stakeholders to provide direction and oversight, as suggested by the RSPCA?
DSIT continues to be led in its approach to non-animal methods in clinical trials, toxicology and other scientific research by the UK’s NC3Rs—the National Centre for the Replacement, Refinement and Reduction of Animals in Research. According to our most recent records, there was a 10% decrease in animal testing from the previous year, and we expect that progress to continue. DSIT, meanwhile, has no plans to add a new executive oversight body to those already in existence.
My Lords, I express an interest as a past chairman of the NC3Rs. During my time as chairman, we saw a marked reduction in the number of animals used in research, and that continues for certain types of animals, such as dogs and cats. It is essential, though, for new drugs to be tested on animals, and regulatory authorities rely on that. Is there anything we can do to help those authorities relax a little?
First, let me pay tribute to the work of the NC3Rs, which is an extremely important body. Nobody feels comfortable doing a lot of animal tests; they are simply necessary for human safety in too many cases. For example, UK REACH follows the last-resort principle, under which animal tests for chemicals are waived wherever possible. That kind of work will further accelerate the work of the NC3Rs.
My Lords, the noble Baroness, Lady Bennett, spoke about other countries that were looking at alternatives to animal testing. What conversations has my noble friend’s department had with other countries on how they can encourage more alternatives to animal testing?
DSIT continues to engage on life sciences research with a wide range of other countries, including those that have tried to move faster. The Netherlands and the United States, in particular, have not always succeeded in their goals of bringing forward the date by which non-animal methods become the only way research is done. On the other hand, steady progress towards the greater use of non-animal methods through the three Rs seems to be bearing fruit, albeit not as fast as anybody would like.
My Lords, we know that there is a fast-growing global market for human-specific technologies. The size of that global market in 2023 was around $2 billion, so it is huge. Does the Minister have any views on the economic potential of human-specific technologies for the UK as a leader in this field?
Yes, indeed; the economic potential is absolutely enormous. As with any medical devices, they need to be put through proper pharmacovigilance procedures, validation and testing, to make sure that, by the time we are ready for clinical trials, all the toxicology testing has been properly done. Where it is possible to find an alternative to animal testing, that should always be followed. We always aim to use the minimum number of animals needed to achieve the scientific benefit and to minimise the potential harm to animals in doing so.
My Lords, in responding to me the Minister referred to the apparent necessity of animals for toxicity tests. Of course, the case I had cited was one where drugs had passed animal toxicity tests and were then found, by a human-specific technology, to cause liver toxicity. Canada has passed a Bill to phase out animal-based chemical toxicity testing, and the European Commission is committed to developing a road map in that direction. As the noble Baroness on the Front Bench said, human-specific technologies have enormous potential. Will the Government look at bringing forward an Act to provide a framework so that the UK can get ahead in this area and end toxicology testing on animals, as other countries are looking to do?
The noble Baroness mentioned an Act—there are widespread protections under the Animals (Scientific Procedures) Act. We have the three-tier licensing system, including significant training and assessment for licensees, and a range of other safeguards. Different jurisdictions are taking a range of approaches to this; I am not aware of any jurisdiction that has yet been able to set a timeline for the absolute removal of animal tests because, sadly, they do remain critical for the development of medicines.
(10 months ago)
Lords Chamber
My Lords, I draw the attention of the House to my role as chair of Big Brother Watch and beg leave to ask the Question standing in my name on the Order Paper.
Preserving individuals’ rights to freedom of expression underpins all the Government’s work on tackling disinformation. This right is upheld by the Online Safety Act, which protects freedom of expression by addressing only the most egregious forms of disinformation, ensuring that people can engage in free debate and discussion online. Under the Act, when putting in place safety measures to fulfil their duties, companies are also required to consider and implement safeguards for freedom of expression.
I thank the Minister for his reply. Last year, Big Brother Watch exposed worrying overreach by the Counter Disinformation Unit in its attempts to prevent legitimate criticism of the Government by MPs, journalists and academics. Following the Government’s apology, could the Minister tell the House what, if anything, has changed, apart from the unit’s name? Could he please explain why the Government refuse to allow the Intelligence and Security Committee to oversee the work of what is now called the National Security Online Information Team?
First, the Counter Disinformation Unit has indeed changed its name to the National Security Online Information Team, to better reflect its role. I am not aware of the apology to which the noble Lord refers, and I am surprised by the suggestion, but I will look into it. The NSOIT, as it is now called, does not target individuals, particularly not politicians or journalists. It does not even go after individual pieces of content but looks for trends across all items of content online.
My Lords, the Question of the noble Lord, Lord Strasburger, requires a little further interrogation, because that report by Big Brother Watch suggested that during the pandemic, politicians, journalists and civil society campaigners from across the political spectrum were personally targeted for critiquing the Government’s handling of the pandemic. Given that report and these legitimate concerns, it would be very kind if the Minister and his colleagues would look into this further and write to the noble Lord, Lord Strasburger, and, indeed, to anyone else affected.
Yes, I am very happy to write any such letter. I confirm now in front of the House that the function of the NSOIT, formerly the Counter Disinformation Unit, is to analyse attempts to artificially manipulate the information environment for purposes of national security. It is not its function—and never has been its function, regardless of its name—to go after individuals, whether they are politicians, journalists, or anybody else. It looks for at-scale attempts to manipulate the information environment.
My Lords, it is clear we need to be assured that the concerning reported practice of the CDU, treating political criticism as disinformation, is no longer followed by NSOIT. Can the Minister explain where we can find a copy of NSOIT’s policies? Can he confirm whether it has a policy prohibiting it from flagging lawful domestic speech to social media companies as breaching their terms of service?
Information on NSOIT is posted on GOV.UK, and I am happy to share that location with the noble Lord. I can confirm not only that it is not the role of NSOIT or the CDU to go after any individuals, regardless of their political beliefs, but that it never has been. NSOIT looks for large-scale attempts to pollute the information environment, generally as a result of threats from foreign states. I am happy to say in front of the House that the idea that its purpose is to go after those who disagree politically with the Government is categorically false.
My Lords, the issue is much more complex than that. I am concerned that the unit to which the Minister referred now seems to deal only with security issues. In December, I asked the Minister about the rise of political deepfakes, which often originate from overseas and have the potential to undermine trust in political leaders and our wider democratic processes. With the Data Protection and Digital Information Bill currently before the House already containing measures on what the Government call “democratic engagement”, can I tempt the Minister to bring forward new anti-deepfake provisions to help preserve the integrity of our upcoming general election—and not just our election, in a year of big elections?
Indeed. It is worth reminding the House that close to 2 billion people will go to the polls over this calendar year. A great many of the elections in which they participate will come under attack from malign foreign influences. Therefore, we have implemented the Defending Democracy Taskforce, chaired by the Security Minister, which set up a new unit last year specifically dedicated to safeguarding our coming election, whenever it may be. It continues to engage with various committees of Parliament and with the Electoral Commission. We will look carefully at any proposals on deepfake provisions in the DPDI Bill. Deepfakes are already illegal today where they amount to either the foreign interference offence or the false communications offence.
My Lords, my noble friend Lord Strasburger asked about the parliamentary scrutiny of the unit. Does the Minister understand that, if there were to be proper scrutiny of the unit, some of the words that he uses to try to placate your Lordships’ House would have deeper resonance? Can he tell us why the ISC is not scrutinising the unit?
NSOIT is indeed scrutinised by Ministers; it sits within DSIT, and Ministers, as we see, come before this House to explain matters. As it is a national security team, I dare say that we would have some concerns about a standing report to Parliament on its activities, but I can continue to reassure the House on its role.
My Lords, can my noble friend the Minister explain how this very interesting unit is composed? Who are its members and where do they come from?
The unit comprises civil servants who sit within DSIT, and it occasionally makes use of external consulting services. It adjusts its size and membership from within the DSIT team according to the nature of the threat at any given moment.
My Lords, on transparency: we would not know about the Counter Disinformation Unit were it not for Big Brother Watch, to which we owe great thanks for its service. The Minister seems to know what disinformation is. Can the Government tell us how they identify what is to be labelled as disinformation? Who checks the fact checkers? For example, BBC Verify seems keen to expose everybody else’s disinformation but blind to its own egregious examples of inaccurate information.
Well, the Government are clear, as is NSOIT, that disinformation refers to the deliberate attempt to mislead by placing falsehoods into the information environment. As part of the Civil Service, NSOIT has robust internal measures to verify and check its own work, and indeed it reports regularly across government and to Ministers.
My Lords, can my noble friend the Minister explain what guidance is given to the unit to distinguish between disinformation and difference of opinion?
Disinformation is a deliberate falsehood. A difference of opinion is generally something of democratic, journalistic or pluralistic importance, which it is very important to protect and which the Online Safety Act took considerable measures to safeguard during its passage.
My Lords, does this unit check on government disinformation such as the Rwanda Bill?
I do not believe that this unit has been working on the Rwanda Bill.
My Lords, if this unit consists of civil servants and external advisers, why is it impermissible for its work to be supervised by a parliamentary committee composed of privy counsellors?
It was set up as an internal part of DSIT. It reports to Ministers, and Ministers provide the oversight. I take the point, but it is a national security institution and, as such, the Government have a strong preference against allowing it to share national security information openly, for fear of benefiting those who wish us harm.