Data (Use and Access) Bill [HL] Debate
Grand Committee

My Lords, in carrying on with this group, I will speak to the question that Clause 78 stands part, and to Amendments 107, 109, 125, 154, 155 and 156, but to start I support Amendment 87 in the name of the noble and learned Lord, Lord Thomas of Cwmgiedd. We had a masterclass from him last Tuesday and he made an extremely good case for that amendment, which is very elegant.
The previous Government deleted the EU Charter of Fundamental Rights from the statute book through the Retained EU Law (Revocation and Reform) Act 2023, and this Bill does nothing to restore it. Although references in the UK GDPR to fundamental rights and freedoms are now to be read as references to the ECHR as implemented through the Human Rights Act 1998, the Government’s ECHR memorandum states:
“Where processing is conducted by a private body, that processing will not usually engage convention rights”.
As the noble and learned Lord mentioned, this could leave a significant gap in protection for individuals whose data is processed by private organisations and will mean lower data protection rights in the UK compared with the EU, so these Benches strongly support his Amendment 87, which would apply the convention to private bodies where personal data is concerned. I am afraid we do not support Amendments 91 and 97 from the noble Viscount, Lord Camrose, which seem to hanker after the mercifully defunct DPDI.
We strongly support Amendments 139 and 140 from the noble Baroness, Lady Kidron. Data communities are one of the important omissions from the Bill. Where are the provisions that should be there to support data-sharing communities and initiatives such as Solid? We have been talking about data trusts and data communities since as long ago as the Hall-Pesenti review. Indeed, it is interesting that the Minister herself only this April said in Grand Committee:
“This seems to be an area in which the ICO could take a lead in clarifying rights and set standards”.
Indeed, she put forward an amendment:
“Our Amendment 154 would therefore set a deadline for the ICO to do that work and for those rights to be enacted. The noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, made a good case for broadening these rights in the Bill and, on that basis, I hope the Minister will agree to follow this up, and follow up his letter so that we can make further progress on this issue”.—[Official Report, 17/4/24; col. GC 322.]
I very much hope that, now the tables are turned, so to speak, the Minister will take that forward herself in government.
Amendments 154, 155 and 156 deal with the removal of the principle of the supremacy of EU law. They are designed to undo the lowering of the standard of data protection rights in the UK brought about by the REUL Act 2023. The amendments would apply the protections required in Article 23.2 of the UK GDPR to all the relevant exceptions in Schedules 2 to 4 to the Data Protection Act 2018. This is important because data adequacy will be lost if the standard of protection of personal data in the UK is no longer essentially equivalent to that in the EU.
The EU’s adequacy decision stated that it did not apply in the area of immigration and referred to the case of Open Rights Group v the Secretary of State for the Home Department in the Court of Appeal. This case was brought after the UK left the EU, but before the REULA came into effect. The case is an example of how the preservation of the principle of the supremacy of EU law continued to guarantee high data protection standards in the UK, before this principle was deleted from the statute book by the REULA. In broad terms, the Court of Appeal found that the immigration exception in Schedule 2 to the Data Protection Act 2018 conflicted with the safeguards in Article 23 of the UK GDPR. This was because the immigration exemption was drafted too broadly and failed to incorporate the safeguards prescribed for exemptions under Article 23.2 of the UK GDPR. It was therefore held to be unlawful and was disapplied.
The Home Office redrafted the exemption to make it more protective, but it took several attempts to bring forward legislation which provided sufficient safeguards for data subjects. The extent of the safeguards now set out in the immigration exemption underscores both what is needed for compatibility with Article 23.2 of the UK GDPR and the deficiencies in the rest of the Schedule 2 exemptions. It is clear when reading the judgment in the Open Rights case that the majority of the exemptions from data subject rights under Schedule 2 to the Data Protection Act fail to meet the standards set out in Article 23.2 of the UK GDPR. The deletion of the principle of the supremacy of EU law has removed the possibility of another Open Rights-style challenge to the other exemptions in Schedule 2 to the Data Protection Act 2018. I hope that, ahead of the data adequacy discussions with the Commission, the Government’s lawyers have had a good look at the amendments that I have tabled, drafted by a former MoJ lawyer.
The new clause after Clause 107 in Amendment 154 applies the new protections developed for the immigration exemption to the whole of Schedule 2 to the DPA 2018, with the exception of the exemptions that apply in the context of journalism or research, statistics and archiving. Unlike the other exemptions, those already contain detailed safeguards.
Amendment 155 is a new clause extending the new protections which apply to the immigration exemption to Schedule 3 to the DPA 2018, and Amendment 156 is another new clause applying those protections to Schedule 4 to the DPA 2018.
As regards Amendment 107, the Government need to clarify how data processing under recognised legitimate interests is compatible with the conditions for data processing under existing lawful bases, including for the special categories of personal data under Articles 5 and 9 of the UK GDPR. The Bill lowers the standard of the protection of personal data where data controllers have to provide personal data based only on
“a reasonable and proportionate search”.
The lack of clarity on what “reasonable and proportionate” mean in the context of data subject requests creates legal uncertainty for data controllers and organisations, specifically regarding whether the data subject’s views on the matter need to be taken into account when responding to requests. This is a probing amendment which requires the Secretary of State to explain why the existing lawful bases for data processing are inadequate for the processing of personal data when additional recognised legitimate interests are introduced. It also requires the Secretary of State to publish guidance within six months of the Act’s passing to clarify what constitute reasonable and proportionate protections of personal data.
Amendment 109 would insert a new clause, to ensure that data controllers assess the risk of collective and societal harms,
“including to equality and the environment”,
when carrying out data protection impact assessments. It requires them to consult affected people and communities while carrying out these assessments to improve their quality, and requires data controllers to publish their assessments to facilitate informed decision-making by data subjects and to enable data controllers to be held accountable.
Turning to whether Clause 78 should stand part, on top of Clause 77, Clause 78 would reduce the scope of transparency obligations and rights. Many AI systems are designed in a way that makes it difficult to retrieve personal data once ingested, or to understand how that data is being used. This is not principally due to technical limitations but to the decisions of AI developers, who do not prioritise transparency and explainability.
As regards Amendment 125, it is clear that there are still further major changes proposed to the GDPR on police duties, automated decision-making and recognised legitimate interests, which make the retention of data adequacy for the purposes of digital trade with the EU of the utmost priority in considering those changes. During the passage of the Data Protection and Digital Information Bill, I tabled an amendment to require the Government to publish an assessment of the impact of the Bill on EU/UK data adequacy within six months of the Act passing; I have tabled a similar amendment, with one change, to this Bill. As the next reassessment of data adequacy is set for June 2025, a six-month timescale may prove inconsequential to the overall adequacy decision. We therefore recommend stipulating that this assessment take place before that reassessment.
My Lords, I thank all noble Lords for their consideration of these clauses. First, I will address Amendment 87 tabled by the noble and learned Lord, Lord Thomas, and the noble and learned Lord—sorry, the noble Lord—Lord Clement-Jones.
We should take them while we can. Like the noble Lord, Lord Clement-Jones, I agree that the noble and learned Lord, Lord Thomas, made an excellent contribution. I appreciate this is a particularly technical area of legislation, but I hope I can reassure both noble Lords that the UK’s data protection law gives effect to convention rights and is designed to protect them. The Human Rights Act requires legislation to be interpreted compatibly with convention rights, whether processing is carried out by public or private bodies. ECHR rights are therefore a pervasive aspect of the rules that apply to public and private controllers alike. The noble and learned Lord is right that individuals generally cannot bring claims against private bodies for breaches of convention rights, but I reassure him that they can bring a claim for breaching the data protection laws giving effect to those rights.
I turn to Amendment 91, tabled by the noble Viscount, Lord Camrose, Amendment 107, tabled by the noble Lord, Lord Clement-Jones, and the question of whether Clause 78 should stand part, which all relate to data subject requests. The Government believe that transparency and the right of access are crucial. That is why we will not support a change to the language around the threshold for data subject requests, as this would undermine data subjects’ rights. Neither will the Bill change the current expectations placed on controllers. The Bill reflects the EU principle of proportionality, which has always underpinned this legislation, as well as existing domestic case law and current ICO guidance. I hope that reassures noble Lords.
Amendments 97 and 99, tabled by the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, relate to the notification exemption in Article 14 of the UK GDPR. I reassure noble Lords that the proportionality test provides an important safeguard for the existing exemption when data is collected from sources other than the data subject. The controller must always consider the impact on data subjects’ rights of not notifying. They cannot rely on the disproportionate effort exemption just because of how much data they are processing—even when there are many data subjects involved, such as there would be with web scraping. Moreover, a lawful basis is required to reuse personal data: a web scraper would still need to pass the balancing test to use the legitimate interest ground, as is usually the case.
The ICO’s recent outcomes report, published on 12 December, specifically referenced the process of web scraping. The report outlined:
“Web scraping for generative AI training is a high-risk, invisible processing activity. Where insufficient transparency measures contribute to people being unable to exercise their rights, generative AI developers are likely to struggle to pass the balancing test”.
The Minister said there is a power to amend, but she has not said whether she thinks that would be desirable. Is the power to be used only if we are found not to be data-adequate because the immigration exemption does not apply across the board? That is, will the power be used only if we are forced to use it?
I reassure the noble Lord that, as he knows, we are very hopeful that we will have data adequacy so that issue will not arise. I will write to him to set out in more detail when those powers would be used.
I thank the Minister for her offer of a meeting. I could tell from the nods of my co-signatories that that would indeed be very welcome and we would all like to come. I was interested in the quote from the ICO about scraping. I doubt the Minister has it to hand, but perhaps she could write to say what volume of enforcement action has been taken by the ICO on behalf of data rights holders against scraping on that basis.
Yes, it would be helpful if we could write and set that out in more detail. Obviously the ICO’s report is fairly recent, but I am sure he has considered how the enforcement would follow on from that. I am sure we can write and give more details.
My Lords, I thank the Minister for her response. I wish to make three points. First, the critical question is: are our laws adequate to pass the adequacy test? Normally, when you go in for a legal test, you check that your own house is in order. I am therefore slightly disappointed by the response to Amendment 125. Normally one has the full-scale medical first, rather than waiting until you are found to be ill afterwards.
Secondly, I listened to what the Minister said about my Amendment 87 and the difference between what rights are protected by the charter and the much greater limitation of the ECHR, normally simply to do with the extent to which they apply horizontally to private individuals. I will look at her answer, but at first sight it does not seem right to me that, where you have fundamental rights, you move to a second stage of rights—namely, the rights under the Data Protection Act.
Thirdly, I want to comment on the whole concept of data communities and data trusts. This is an important area, and it takes me back to what I said last time: this legislation really needs to be reduced to principles. I am going to throw out a challenge to the very learned people behind the Minister, particularly the lawyers: can they come up with something intelligible to the people who are going to do this?
This legislation is ghastly; I am sorry to say that, but it is. It imposes huge costs on SMEs—not to say on others, but they can probably afford it—and if you are going to get trust from people, you have to explain things in simple principles. My challenge to those behind the Minister is: can they draft a Clause 1 of the Bill to say, “The principles that underpin the Bill are as follows, and the courts are to interpret it in accordance with those principles”? That is my challenge—a challenge, as the noble Baroness, Lady Kidron, points out, to be ambitious and not to sit in a tepid bath. I beg leave to withdraw the amendment.
My Lords, I thank the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, for their amendments and consideration of this policy area. I hope noble Lords will bear with me if I save some of the points I shall make on web crawling and intellectual property for the later group, which is specifically on that topic.
Amendments 92 and 93 from the noble Viscount are about the new disproportionate effort exemption in Article 13. I can reassure noble Lords that this exemption applies only when data is collected directly from the data subject, so it cannot be used for web crawling, which is, if you like, a secondary activity. I think that answers that concern.
Amendments 101 and 105, also from the noble Viscount, are about the changes to the existing exemption in Article 14, where data is collected from other sources. Noble Lords debated this issue in the previous group, where Amendments 97 and 99 sought to remove this exemption. The reassurances I provided to noble Lords in that debate about the proportionality test being a case-by-case exercise also apply here. Disproportionate effort cannot be used as an excuse; developers must consider the rights of the data subject on each occasion.
I also draw noble Lords’ attention to another quote from the ICO itself, made when publishing its recent outcome reports. I know I have already said that I will share more information on this. It says:
“Generative AI developers, it’s time to tell people how you’re using their information”.
The ICO is on the case on this issue, and is pursuing it.
On Amendment 137 from the noble Baronesses, Lady Kidron and Lady Harding, and other noble Lords, I fully recognise the importance of organisations receiving clear guidance from regulators, especially on complex and technical issues. AI is one such issue. I know that noble Lords are particularly conscious of how it might affect children, and I am hearing the messages about that today.
As the noble Baroness will know, the Secretary of State already has the power to request statutory codes such as this from the regulator. The existing power will allow us to ensure the correct scope of any future codes, working closely with the ICO and stakeholders and including noble Lords here today, and I am happy to meet them to discuss this further. The Government are, naturally, open to evidence about whether new statutory codes should be provided for by regulations in future. Although I appreciate the signal this can send, at the moment I do not believe that a requirement for codes on this issue is needed in this legislation. I hope noble Lords are reassured that the Government are taking this issue seriously.
Amendment 211A from the noble Lord, Lord Holmes, is about prohibiting the processing of people’s names, facial images, voices or any physical characteristics for AI training without their consent. Facial images and other physical characteristics that can be used to identify a person are already protected by the data protection legislation. An AI developer processing such data would have to identify a lawful ground for this. Consent is not the only option available, but I can reassure the noble Lord that there are firm safeguards in place for all the lawful grounds. These include, among many other things, making sure that the processing is fair and transparent. Noble Lords will know that even more stringent conditions apply to the special categories of data, which cover race, sexual orientation and any biometric data that can be used to identify someone.
Noble Lords tried to tempt me once again on the timetable for the AI legislation. I said as much as I could on that when we debated this in the last session, so I cannot add any more at this stage.
I hope that reassures noble Lords that the Bill has strong protections in place to ensure responsible data use and reuse, and, as such, that they feel content not to press their amendments.
I understand the point that the Secretary of State has the power, but does he have the intention? We are seeking an instruction to the ICO to do exactly this thing. A statement of the Secretary of State’s intention to activate such a thing would be an excellent compromise all round, and seeing that in the Bill is the point here.
Discussions with the ICO are taking place at the moment about the scope and intention of a number of issues around AI, and this issue would be included in that. However, I cannot say at the moment that that intention is specifically spelled out in the way that the noble Baroness is asking.
This has been a wide-ranging debate, with important contributions from across the Committee. I take some comfort from the Minister’s declaration that the exemptions will not be used for web crawling, but I want to make sure that they are not used at the expense of the privacy and control of personal data belonging to the people of Britain.
That seems particularly so for Amendment 137 in the name of the noble Baroness, Lady Kidron. I was particularly taken by her pointing out that children’s data privacy had not been taken into account when it came to AI, reinforced by the noble Baroness, Lady Harding, telling us about the importance of the Bill. She said it was paramount to protect children in the digital age and reminded us that this is the biggest breakthrough of our lifetime and that children need protecting from it. I hope very much that there will be some successful meetings, and maybe a government amendment on Report, responding to these passionate and heartfelt demands. On that basis, I sincerely hope the Minister will meet us all and other noble Lords to discuss these matters of data privacy further. On that basis, I beg leave to withdraw my amendment.
I thank noble Lords for their comments and contributions. I shall jump to Amendments 159 and 159A, one of which is in my name and both of which are concerned with cookie paywalls. I am not sure I have properly understood the objection to cookie paywalls. Do they not simply offer users three choices: pay money and stay private; share personal data and read for free; or walk away? So many times, we have all complained about the fact that these websites harvest our data and now, for the first time, this approach sets a clear cash value on the data that they are harvesting and offers us the choice. The other day somebody sent me a link from the Sun. I had those choices. I did not want to pay the money or share my data, so I did not read the article. I feel this is a personal decision, supported by clear data, which it is up to the individual to take, not the Government. I do not think we should take away this choice.
Let me turn to some of the other amendments in this group. Amendment 161 in the name of my noble friend Lord Lucas is, if I may say so, a thoughtful amendment. It would allow pension providers to communicate information on their product. This may mean that the person who will benefit from that pension does not miss out on useful information that would benefit their saving for retirement. Given that pension providers already hold the saver’s personal data, it seems to be merely a question of whether this information is wanted; of course, if it is not, the saver can simply opt out.
Amendment 162 makes an important point: many charities rely on donations from the public. Perhaps we should consider bringing down the barriers to contacting people regarding fundraising activities. At the very least, I am personally not convinced that members of the public have different expectations around what kinds of organisation can and cannot contact them and in what circumstances, so I support any step that simplifies the—to my mind—rather arbitrary differences in the treatment of business and charity communications.
Amendment 104 certainly seems a reasonable addition to the list of what might constitute “unreasonable effort” if the information is already public. However, I have some concerns about Amendments 98 and 100 to 103. For Amendment 98, who would judge the impact on the individual? I suspect that the individual and the data controllers may have different opinions on this. In Amendment 100, the effort and cost of compliance are thorny issues that would surely be dictated by the nature of the data itself and the reason for providing it to data subjects. In short, I am concerned that the controllers’ view may be more subjective than we would want.
On Amendment 102, again, when it comes to providing information to them,
“the damage and distress to the data subjects”
is a phrase on which the subject and the controller will almost inevitably have differing opinions. How will these be balanced? Additionally, one might presume that information that is either damaging or distressing to the data subjects should not necessarily be withheld from them as it is likely to be extremely important.
My Lords, we have covered a range of issues in our debate on this grouping; nevertheless, I will try to address each of them in turn. I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Harding, for their Amendments 95, 96, 98, 100, 102 to 104 and 106 regarding notification requirements.
First, with regard to the amendments in the name of the noble Baroness, Lady Harding, I say that although the Government support the use of public data sources, transparency is a key data protection principle. We do not agree that such use of personal data should remove or undermine the transparency requirements. The ICO considers that the use and sale of open electoral register data alone is likely not to require notification. However, when the data is combined with data from other sources, in order to build an extensive profile to be sold on for direct marketing, notification may be proportionate since the processing may go beyond the individual’s reasonable expectations. When individuals are not notified about processing, it makes it harder for them to exercise their data subject rights, such as the right to object.
Adding other factors to the list of what constitutes a “disproportionate effort” for notification is unnecessary given that the list is already non-exhaustive. The “disproportionate effort” exemption must be applied according to the safeguards of the wider data protection framework. According to the fairness principle, controllers should already account for whether the processing meets the reasonable expectations of a data subject. The data minimisation and purpose limitation principles also act as an important consideration for data controllers. Controllers should continue to assess on a case-by-case basis whether they meet the threshold for the existing exemptions to notify; if not, they should notify. I hope that this helps clarify our position on that.
When does the Minister anticipate that the ICO will produce that report?
I do not have the detail of all that. Obviously, the call for views has only recently gone out and he will need time for consideration of the responses. I hope the noble Lord will accept that the ICO is on the case on this matter. If we can provide more information, we will.
May I ask the Minister a hypothetical question? If the ICO believes that these are not desirable, what instruments are there for changing the law? Can the ICO, under its own steam, so to speak, ban them; do we need to do it in primary legislation; or can it be done in secondary legislation? If the Minister cannot answer now, perhaps she can write to me.
Of course I will write to the noble Lord. It will be within the ICO’s normal powers to make changes where he finds that they are necessary.
I move to Amendment 160, tabled by the noble Lord, Lord Lucas, which seeks to create a new exemption for advertising performance cookies. There is a balance to strike between driving growth in the advertising, news and publishing sectors while ensuring that people retain choice and control over how their data is used. To exempt advertising measurement cookies, we would need to assess how intrusive these cookies are, including what they track and where data is sent. We have taken a delegated power so that exemptions to the prohibition can be added in future once evidence supports it, and we can devise appropriate safeguards to minimise privacy risks. In the meantime, we have been actively engaging with the advertising and publishing sectors on this issue and will continue to work with them to consider the potential use of the regulation-making power. I hope that the noble Lord will accept that this is work in progress.
Amendment 161, also from the noble Lord, Lord Lucas, aims to extend the soft opt-in rule under the privacy and electronic communications regulations to providers of auto-enrolment pension schemes. The soft opt-in rule removes the need for some commercial organisations to seek consent for direct marketing messages where there is an existing relationship between the organisation and the customer, provided the recipient did not object to receiving direct marketing messages when their contact details were collected.
The Government recognise that people auto-enrolled by their employers in workplace pension schemes may not have an existing relationship with their pension provider, so I understand the noble Lord’s motivations for this amendment. However, pension providers have opportunities to ask people to express their direct mail preferences, such as when the customer logs on to their account online. We are taking steps to improve the support available for pension holders through the joint Government and FCA advice guidance boundary review. The FCA will be seeking feedback on any interactions of proposals with direct marketing rules through that consultation process. Again, I hope the noble Lord will accept that this issue is under active consideration.
Amendment 162, tabled by the noble Lord, Lord Clement-Jones, would create an equivalent provision to the soft opt-in but for charities. It would enable a person to send electronic marketing without permission to people who have previously expressed an interest in their charitable objectives. The noble Lord will recall, and has done so, that the DPDI Bill included a provision similar to his amendment. The Government removed it from that Bill due to the concerns that it would increase direct marketing from political parties. I think we all accepted at the time that we did not want that to happen.
As the noble Lord said, his amendment is narrower because it focuses on communications for charitable purposes, but it could still increase the number of messages received by people who have previously expressed an interest in the work of charities. We are listening carefully to arguments for change in this area and will consider the points he raises, but I ask that he withdraws his amendment while we consider its potential impact further. We are happy to have further discussions on that.
I apologise to the Minister for intervening on her when I have not spoken earlier in this debate, but I was reassured by what she just said on Amendment 162. Remarks made by other noble Lords in this debate suggest both that members of the public might not object to charities having the same access rights as businesses and that the public do not necessarily draw a distinction between businesses and charities. As a former chairman of the Charity Commission, I can say that that is not what is generally found. People have an expectation of charities that differs from what they would expect by way of marketing from businesses. In considering this amendment, therefore, I urge the Minister to think carefully before deciding what action the Government should take.
I thank the noble Baroness very much for that very helpful intervention. If she has any more information about the view of the Charity Commission, we would obviously like to engage with that because we need to get this right. We want to make sure that individuals welcome and appreciate the information given to them, rather than it being something that could have a negative impact.
I think I have covered all the issues. I hope those explanations have been of some reassurance to noble Lords and that, as such, they are content not to press their amendments.
May I just follow up by asking one quick question? I may be clutching at straws here but, in responding to the amendments in my name, the Minister stated what the ICO believes rather than what the Government believe. She also said that the ICO may think that further permission is required to ensure transparency. I understand from the Data & Marketing Association that users of this data have four different ways of ensuring transparency. Would the Minister agree to a follow-up meeting to see whether there is a meeting of minds with what the Government think, rather than the ICO?
I am very happy to talk to the noble Baroness about this issue. She asked what the Government’s view is; we are listening very carefully to the Information Commissioner and the advice that he is putting together on this issue.
My Lords, I am very grateful for the answers the noble Baroness gave to my amendments. I will study carefully what she said in Hansard, and if I have anything further to ask, I will write to her.
My Lords, we have had a really profound and significant debate on these issues; it has been really helpful that they have been aired by a number of noble Lords in a compelling and articulate way. I thank everybody for their contributions.
I have to say at the outset that the Government want data protection rules fit for the age of emerging technologies. The noble Lord, Lord Holmes, asked whether we are addressing issues of the past or issues of the future. We believe that the balance we have in this Bill is exactly about addressing the issues of the future. Our reforms will reduce barriers to the responsible use of automation while clarifying that organisations must provide stringent safeguards for individuals.
I stress again how seriously we take these issues. A number of examples have been quoted as the debate has gone on. I say to those noble Lords that examples were given where there was no human involved. That is precisely what the new provisions in this Bill attempt to address, in order to make sure that there is meaningful human involvement and people’s futures are not being decided by an automated machine.
Amendment 110 tabled by the noble Lords, Lord Clement-Jones and Lord Knight, seeks to clarify that, for human involvement to be meaningful, it must be carried out by a competent person. Our reforms make clear that solely automated decisions lack meaningful human involvement. That goes beyond a tick-box exercise. The ICO guidance also clarifies that
“the human involvement has to be active and not just a token gesture”;
that right is absolutely underpinned by the wording of the regulations here.
I turn next to Amendment 111. I can assure—
My Lords, I was listening very carefully. Does “underpinned by the regulations” mean that it will be underpinned?
Yes. The provisions in this Bill cover exactly that concern.
The issue of meaningful human involvement is absolutely crucial. Is the Minister saying that regulations issued by the Secretary of State will define “meaningful human involvement”, or is she saying that it is already in the primary legislation, which is not my impression?
Sorry—it is probably my choice of language. I am saying that it is already in the Bill; it is not intended to be separate. I was talking about whether solely automated decisions lack meaningful human involvement. This provision is already set out in the Bill; that is the whole purpose of it.
On Amendment 111, I assure the noble Viscount, Lord Camrose, that controllers using solely automated processing are required to comply with the data protection principles. I know that he was anticipating this answer, but we believe that it captures the principles he proposes and achieves the same intended effect as his amendment. I agree with the noble Viscount that data protection is not the only lens through which AI should be regulated, and that we cannot address all AI risks through the data protection legislation, but the data protection principles are the right ones for solely automated decision-making, given its place in the data protection framework. I hope that that answers his concerns.
On Amendment 112, which seeks to prohibit solely automated decisions that contravene the Equality Act 2010, I assure the noble Lords, Lord Clement-Jones and Lord Knight, that the data protection framework is clear that controllers must adhere to the Equality Act.
Amendments 113 and 114 would extend solely automated decision-making safeguards to predominantly automated decision-making. I assure the noble and learned Lord, Lord Thomas, the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, that the safeguards in Clause 80 are designed to protect individuals where meaningful human involvement is lacking. Predominantly automated decision-making will already include meaningful human involvement and therefore does not require these additional safeguards.
On Amendments 114A and 115A, tabled by the noble Viscount, Lord Camrose, many noble Lords have spoken in our debates about the importance of future-proofing the legislation. These powers are an example of that: without them, the Government will not have the ability to act quickly to update protections for individuals in the light of rapid technology developments.
I assure noble Lords that the regulation powers are subject to a number of safeguards. The Secretary of State must consult the Information Commissioner and have regard to other relevant factors, which can include the impact on individuals’ rights and freedoms as well as the specific needs and rights of children. As with all regulations, the exercise of these powers must be rational; they cannot be used irrationally or arbitrarily. Furthermore, the regulations will be subject to the affirmative procedure and so must be approved by both Houses of Parliament.
I assure the noble Lord, Lord Clement-Jones, that one of the powers means that his Amendment 123 is not necessary, as it can be used to describe specifically what is or is not meaningful human involvement.
Amendment 115A, tabled by the noble Viscount, Lord Camrose, would remove the reforms to Parts 3 and 4 of the Data Protection Act, thereby putting them out of alignment with the UK GDPR. That would cause confusion and ambiguity for data subjects.
I am sorry to interrupt again as we go along but, a sentence or so ago, the Minister said that the definition in Amendment 123 of meaningful human involvement in automated decision-making was unnecessary. The amendment is designed to change matters. It would not be the Secretary of State who determined the meaning of meaningful human involvement; in essence, it would be initiated by the Information Commissioner, in consultation with the Secretary of State. So I do not quite understand why the Minister used “unnecessary”. It may be an alternative that is undesirable, but I do not understand why she has come to the conclusion that it is unnecessary. I thought it was easier to challenge the points as we go along rather than at the very end.
My Lords, we would say that a definition in the Bill is not necessary because it is dealt with case by case and is supplemented by these powers. The Secretary of State does not define meaningful human involvement; it is best done case by case, supported by the ICO guidance. I hope that that addresses the noble Lord’s point.
That is slightly splitting hairs. The noble Viscount, Lord Camrose, might want to comment because he wanted to delete the wording that says:
“The Secretary of State may by regulations provide that … there is, or is not, to be taken to be meaningful human involvement”.
He certainly will determine—or is able to determine, at least—whether or not there is human involvement. Surely, as part of that, there will need to be consideration of what human involvement is.
The Secretary of State can help describe specific cases in the future but, on the point made by my noble friend Lord Knight, the ICO guidance will clarify some of that. There will be prior consultation with the ICO before that guidance is finalised, but if noble Lords are in any doubt about this, I am happy to write and confirm that in more detail.
Amendment 115 in the names of the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Knight, and Amendment 123A in the name of the noble Lord, Lord Holmes, seek to ensure that individuals are provided with clear and accessible information about solely automated decision-making. The safeguards set out in Clause 80, alongside the wider data protection framework’s safeguards, such as the transparency principle, already achieve this purpose. The UK GDPR requires organisations to notify individuals about the existence of automated decision-making and provide meaningful information about the logic involved in a clear and accessible format. Individuals who have been subject to solely automated decisions must be provided with information about the decisions.
On Amendment 116 in the names of the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, I reassure noble Lords that Clause 69 already provides a definition of consent that applies to all processing under the law enforcement regime.
On Amendment 117 in the names of the noble Viscount, Lord Camrose, the noble Lord, Lord Markham, and my noble friend Lord Knight, I agree with them on the importance of law enforcement agencies protecting the sensitive personal data of children, and there is extensive guidance on this issue. However, consent is rarely used as the basis for processing law enforcement data. Other law enforcement purposes, such as the prevention, detection and investigation of crime, are quite often used instead.
I will address Amendment 118 in the name of the noble Viscount, Lord Camrose, and Amendment 123B in the name of the noble Lord, Lord Holmes, together, as they focus on obtaining human intervention for a solely automated decision. I agree that human intervention should be carried out competently and by a person with the authority to correct a wrongful outcome. However, the Government believe that there is currently no need to specify the qualifications of human reviewers as the ICO’s existing guidance explains how requests for human review should be managed.
Does the Minister agree that the crux of this machinery is solely automated decision-making as a binary thing—it is or it is not—and, therefore, that the absolute key to it is making sure that the humans involved are suitably qualified and finding some way to do so, whether by writing a definition or publishing guidelines?
On the question of qualification, the Minister may wish to reflect on the broad discussions we have had in the past around certification and the role it may play. I gently take her back to what she said on Amendment 123A about notification. Does she see notification as the same as a personalised response to an individual?
Noble Lords have asked several questions. First, in response to the noble Viscount, Lord Camrose, I think I am on the same page as him about binary rather than muddying the water by having degrees of meaningful intervention. The ICO already has guidance on how human review should be provided, and this will be updated after the Bill to ensure that it reflects what is meant by “meaningful human involvement”. Those issues will be addressed in the ICO guidance, but if it helps, I can write further on that.
I have forgotten the question that the noble Lord, Lord Holmes, asked me. I do not know whether I have addressed it.
In her response the Minister said “notification”. Does she see notification as the same as “personalised response”?
My understanding is that it would be. Every individual who was affected would receive their own notification rather than it just being on a website, for example.
Let me just make sure I have not missed anyone out. On Amendment 123B on addressing bias in automated decision-making, compliance with the data protection principles, including accuracy, transparency and fairness, will ensure that organisations take the necessary measures to address the risk of bias.
On Amendment 123C from the noble Lord, Lord Clement-Jones, I reassure him that the Government strongly agree that employment rights should be fit for a modern economy. The plan to make work pay will achieve this by addressing the challenges introduced by new trends and technologies. I agree very much with my noble friend Lord Knight that although we have to get this right, there are opportunities for a different form of work, and we should not just see this as a potentially negative impact on people's lives. However, we want to get the balance right with regard to the impact on individuals, to make sure that we draw out the best rather than the possible negative effects.
Employment rights law is more suitable for regulating the specific use of data and technology in the workplace rather than data protection law in isolation, as data protection law sets out general rules and principles for processing that apply in all contexts. Noble Lords can rest assured that we take the impact on employment and work very seriously, and as part of our plan to make work pay and the Employment Rights Bill, we will return to these issues.
On Amendments 119, 120, 121 and 122, tabled by the noble Lord, Lord Clement-Jones, the noble Viscount, Lord Colville, and my noble friend Lord Knight, the Government share the noble Lords’ belief in the importance of public sector algorithmic transparency, and, as the noble Lord, Lord Clement-Jones, reminded us, we had a very good debate on this last week. The algorithmic transparency recording standard is already mandatory for government departments and arm’s-length bodies. This is a cross-government policy mandate underpinned by digital spend controls, which means that when budget is requested for a relevant tool, the team in question must commit to publishing an ATRS record before receiving the funds.
As I said on Friday, we are implementing this policy accordingly, and I hope to publish further records imminently. I very much hope that when noble Lords see what I hope will be a significant number of new records on this, they will be reassured that the nature of the mandation and the obligation on public sector departments is working.
Policy routes also enable us to provide detailed guidance to the public sector on how to carry out its responsibilities and monitor compliance. Examples include the data ethics framework, the generative AI framework, and the guidelines for AI procurement. Additionally, the data protection framework already achieves some of the intended outcomes of these amendments. It requires organisations, including public authorities, to demonstrate how they have identified and mitigated risks when processing personal data. The ICO provides guidance on how organisations can audit their privacy management and ensure a high level of data protection compliance.
I know I have given a great deal of detail there. If I have not covered all the points that noble Lords have raised, I will write. In the meantime, given these assurances, I hope that the noble Lord will withdraw his amendment.
My Lords, I would be very grateful if the Minister wrote to me about Amendment 115. I have done my best before and after to study Clause 80 to understand how it provides the safeguards she describes, and have failed. If she or her officials could take the example of a job application and the responses expected from it, and take me through the clauses to understand what sort of response would be expected and how that is set out in the legislation, I would be most grateful.
My Lords, I thank the Minister for her very detailed and careful response to all the amendments. Clearly, from the number of speakers in this debate, this is one of the most important areas of the Bill and one that has given one of the greatest degrees of concern, both inside and outside the Committee. I think the general feeling is that there is still concern. The Minister is quite clear that the Government are taking these issues seriously, in terms of ADM itself and the impact in the workplace, but there are missing parts here. If you add all the amendments together—no doubt we will read Hansard and, in a sense, tick off the areas where we have been given an assurance about the interpretation of the Bill—there are still great gaps.
It was very interesting to hear what the noble Lord, Lord Kamall, had to say about how the computer said “no” as he reached the gate. A lot of this is about communications. I would be very interested if any letter to the noble Lord, Lord Lucas, was copied more broadly, because that is clearly one of the key issues. It was reassuring to hear that the ICO will be on top of this in terms of definitions, guidance, audit and so on, and that we are imminently to get the publication of the records of algorithmic systems in use under the terms of the algorithmic transparency recording standard.
We have had some extremely well-made points from the noble Viscounts, Lord Colville and Lord Camrose, the noble Lords, Lord Lucas, Lord Knight and Lord Holmes, and the noble Baroness, Lady Kidron. I am not going to unpack all of them, but we clearly need to take this further and chew it over before we get to Report. I very much hope that the Minister will regard a "will write" letter on stilts as required before we go very much further, because I do not think we will be fully satisfied by this debate.
The one area where I would disagree is on treating solely automated decision-making as the pure subject of the Clause 80 rights. Looking at it in the converse, it is perfectly proper to regard something that does not have meaningful human involvement as predominantly automated decision-making. I do not think, in the words of the noble Viscount, Lord Camrose, that this does muddy the waters. We need to be clearer about what we regard as being automated decision-making for the purpose of this clause.
There is still quite a lot of work to do in chewing over the Minister’s words. In the meantime, I beg leave to withdraw my amendment.
I thank the noble Lord, Lord Clement-Jones; let me consider it a marker for future discussion.
I thank the noble Lord, Lord Clement-Jones, for coming to my rescue there.
I turn to the Clause 81 stand part notice tabled by the noble Lord, Lord Clement-Jones, which would remove Clause 81 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record their processing activities, including their reasons for accessing and disclosing personal information. Entering a justification manually was intended to help detect unauthorised access. The noble Lord was right that the police do sometimes abuse their power; however, I agree with the noble Viscount, Lord Camrose, that the reality is that anyone accessing the system unlawfully is highly unlikely to record that, making this an ineffective safeguard.
Meanwhile, the position of the National Police Chiefs' Council is that this change will not impede any investigation concerning the unlawful processing of personal data. Clause 81 does not remove the strong safeguards that ensure accountability for data use by law enforcement, which include the requirement to record the time, the date and, where possible, who has accessed the data; these are far more effective in monitoring potential data misuse. We would argue that the requirement to manually record a justification every time case information is accessed places a considerable burden on policing. I think the noble Lord himself said that we estimate that this clause may save approximately 1.5 million policing hours, equivalent to a saving in the region of £42.8 million a year.
Yes, we could not see the noble Lord’s raised eyebrows.
Turning to Amendment 124, I thank the noble Baroness, Lady Morgan, for raising this important issue. While I obviously understand and welcome the intent, I do not think that the legislative change is what is required here. The Information Commissioner’s Office agrees that the Data Protection Act is not a barrier to the sharing of personal data between the police and the CPS. What is needed is a change in the operational processes in place between the police and the CPS that are causing this redaction burden that the noble Baroness spelled out so coherently.
We are very much aware that this is an issue and, as I think the noble Baroness knows, the Government are committed to reducing the burden on the police and the Home Office and to exploring with partners across the criminal justice system how this can best be achieved. We absolutely understand the point that the noble Baroness has raised, but I hope that she could agree to give space to the Home Office and the CPS to try to find a resolution so that we do not have the unnecessary burden of redaction when it is not necessary. It is an ongoing discussion—which I know the noble Baroness knows really—and I hope that she will not pursue it on that basis.
I will address Amendments 126 to 129 together. These amendments seek to remove parts of Schedule 8 to avoid divergence from EU legislation. The noble Lord, Lord Clement-Jones, proposes instead to remove existing parts of Section 73 of the Data Protection Act 2018. New Section 73(4)(aa), introduced by this Bill, with its bespoke path for personal data transfers from UK controllers to international processors, is crucial. In the modern age, where the use of such capabilities and the benefits they provide is increasing, we need to ensure that law enforcement can make effective use of them to tackle crime and keep citizens safe.
My Lords, I thank the Minister for her response on this group, which was, again, very detailed. There is a lot to consider in what she had to say, particularly about the clauses beyond Clause 81. I am rather surprised that the current Government are still going down the same track on Clause 81. It is as if, because the risk of abuse is so high, this Government, like the previous one, have decided that it is not necessary to have the safeguard of putting down the justification in the first place. Yet we have heard about the Sarah Everard police officers. It seems to me perverse not to require justification. I will read further what the Minister had to say but it seems quite extraordinary to be taking away a safeguard at this time, especially when the Minister says that, at the same time, they need to produce logs of the time of the data being shared and so on. I cannot see what is to be gained—I certainly cannot see £42 million being saved. It is a very precise figure: £42.8 million. I wonder where the £800,000 comes from. It seems almost too precise to be credible.
I emphasise that we believe the safeguards are there. This is not a watering down of provisions. We are just making sure that the safeguards are more appropriate for the sort of abuse that we think might happen in future from police misusing their records. I do not want it left on the record that we do not think that is important.
No. As I was saying, it seems that the Minister is saying that there will still be the necessity to log the fact that data has been shared. However, it seems extraordinary that, at the same time, it is not possible to say what the justification is. The justification could be all kinds of things, but it makes somebody think before they simply share the data. It seems to me that, given the clear evidence of abuse of data by police officers—data of the deceased, for heaven’s sake—we need to keep all the safeguards we currently have. That is a clear bone of contention.
I will read what else the Minister had to say about the other clauses in the group, which are rather more sensitive from the point of view of national security, data sharing abroad and so on.
I wanted to rise to my feet in time to stop the noble Viscount leaping forward as he gets more and more excited as we reach—I hope—possibly the last few minutes of this debate. I am freezing to death here.
I wish only to add my support to the points of the noble Baroness, Lady Kidron, on Amendment 145. It is a much-overused saw, but if it is not measured, it will not get reported.
My Lords, I thank noble Lords for their consideration of the issues before us in this group. I begin with Amendment 134 from the noble Lord, Lord Clement-Jones. I can confirm that the primary duty of the commissioner will be to uphold the principal objective: securing an appropriate level of data protection, carrying out the crucial balancing test between the interests of data subjects, controllers and wider public interests, and promoting public trust and confidence in the use of personal data.
The other duties sit below this objective and do not compete with it—they do not come at the expense of upholding data protection standards. The commissioner will have to consider these duties in his work but will have discretion as to their application. Moreover, the new objectives inserted by the amendment concerning monitoring, enforcement and complaints are already covered by legislation.
I thank the noble Lord, Lord Lucas, for Amendment 135A. The amendment was a feature of the previous DPDI Bill, but the Government have decided that a statement of strategic priorities for the ICO is not necessary in this Bill. The Government will of course continue to set out their priorities in relation to data protection and other related areas and discuss them with the Information Commissioner as appropriate.
Amendment 142 from the noble Viscount, Lord Camrose, would remove the ICO’s ability to serve notices by email. We would argue that email is a fast, accessible and inexpensive method for issuing notices. I can reassure noble Lords that the ICO can serve a notice via email only if it is sent to an email address published by the recipient or where the ICO has reasonable grounds to believe that the notice will come to the attention of the person, significantly reducing the risk that emails may be missed or sent to the wrong address.
Regarding the noble Viscount's Amendment 143, the assumption that an email notice will be received within 48 hours is reasonable and in line with the equivalent legislation for other regulators, such as the CMA and Ofcom.
I thank the noble Lord, Lord Clement-Jones, for Amendment 144 concerning the ICO’s use of reprimands. The regulator does not commonly issue multiple reprimands to the same organisation. But it is important that the ICO, as an independent regulator, has the discretion and flexibility in instances where there may be a legitimate need to issue multiple reprimands within a particular period without placing arbitrary limits on that.
Turning to Amendment 144A, the new requirements in Clause 101 will already lead to the publication of an annual report, which will include the regulator’s investigation and enforcement activity. Reporting will be categorised to ensure that where the detail of cases is not public, commercially sensitive investigations are not inadvertently shared. Splitting out reporting by country or locality would make it more difficult to protect sensitive data.
Turning to Amendment 145, with thanks to the noble Baroness, Lady Kidron, I agree with the importance of ensuring that the regulator can be held to account on this issue effectively. The new annual report in Clause 101 will cover all the ICO’s regulatory activity, including that taken to uphold the rights of children. Clause 90 also requires the ICO to publish a strategy and report on how it has complied with its new statutory duties. Both of these will cover the new duty relating to children’s awareness and rights, and this should include the ICO’s activity to support and uphold its important age-appropriate design code.
I thank the noble Lord, Lord Clement-Jones, for Amendments 163 to 192 to Schedule 14, which establishes the governance structure of the information commission. The approach, including the responsibilities conferred on the Secretary of State, at the core of the amendments follows standard corporate governance best practice and reflects the Government’s commitment to safeguarding the independence of the regulator. This includes requiring the Secretary of State to consult the chair of the information commission before making appointments of non-executive members.
Amendments 165 and 167A would require members of the commission to be appointed to oversee specific tasks and to be from prescribed fields of expertise. Due to the commission’s broad regulatory remit, the Government consider that it would not be appropriate or helpful for the legislation to set out specific areas that should receive prominence over others. The Government are confident that the Bill will ensure that the commission has the right expertise on its board. Our approach safeguards the integrity and independence of the regulator, draws clearly on established precedent and provides appropriate oversight of its activities.
Finally, Clauses 91 and 92 were designed to ensure that the ICO’s statutory codes are consistent in their development, informed by relevant expertise and take account of their impact on those likely to be affected by them. They also ensure that codes required by the Secretary of State have the same legal effect as pre-existing codes published under the Data Protection Act.
Considering the explanations I have offered, I hope that the noble Lords, Lord Clement-Jones and Lord Lucas, the noble Viscount, Lord Camrose, and the noble Baroness, Lady Kidron, will agree not to press their amendments.
My Lords, I thank the Minister for that response. If I speak for four minutes, that will just about fill the gap, but I hope to speak for less than that.
The Minister’s response was very helpful, particularly the way in which she put the clarification of objectives. Of course, this is shared with other regulators, where this new growth duty needs to be set in the context of the key priorities of the regulator. My earlier amendment reflected a nervousness about adding innovation and growth duties to a regulator, which may be seen to unbalance the key objectives of the regulator in the first place, but I will read carefully what the Minister said. I welcome the fact that, unlike in the DPDI Bill, there is no requirement for a statement of strategic priorities. That is why I did not support Amendment 135A.
It is somewhat ironic that, in discussing a digital Bill, the noble Viscount, Lord Camrose, decided to go completely analogue, but that is life. Maybe that is what happens to you after four and a half hours of the Committee.
I do not think the Minister covered the ground on the reprimands front. I will read carefully what she said about the annual report and the need for the ICO—or the commission, as it will be—to report on its actions. I hope, just by putting down these kinds of amendments on reprimands, that the ICO will take notice. I have been in correspondence with the ICO myself, as have a number of organisations. There is some dissatisfaction, particularly with companies such as Clearview, where it is felt that the ICO has not taken adequate action on scraping and building databases from the internet. We will see whether the ICO becomes more proactive in that respect. I was reassured, however, by what the Minister said about NED qualifications and the general objective on the independence of the regulator.
There is much to chew on in what the Minister said. In the meantime, I beg leave to withdraw my amendment.