My Lords, government Amendment 18—
I call on the noble Lord, Lord Clement-Jones, to speak to Amendment 17.
Amendment 17
I apologise for interrupting the Minister, in what sounded almost like full flow. I am sure that he was so eager to move his amendment.
In moving Amendment 17, I will speak also to Amendment 21. These aim to remove the Secretary of State’s power to override primary legislation and modify key aspects of the UK data protection law via statutory instruments. They are similar to those proposed by me to the previous Government’s Data Protection and Digital Information Bill, which the noble Baroness, Lady Jones of Whitchurch, then in opposition, supported. These relate to Clauses 70(4) and 71(5).
There are a number of reasons to support accepting these amendments. The Delegated Powers and Regulatory Reform Committee has expressed concerns about the broad scope of the Secretary of State’s powers, as it did previously in relation to the DPDI Bill. It recommended removing the power from the previous Bill, and in its ninth report it maintains this view for the current Bill. The Constitution Committee has said likewise; I will not read out what it said at the time, but I think all noble Lords know that both committees were pretty much on the same page.
The noble Baroness, Lady Jones, on the previous DPDI Bill, argued that there was no compelling reason for introducing recognised legitimate interests. On these Benches, we agree. The existing framework already allows for data sharing with the public sector and data use for national security, crime detection and safeguarding vulnerable individuals. However, the noble Baroness, in her ministerial capacity, argued that swift changes might be needed—hence the necessity for the Secretary of State’s power. Nevertheless, the DPRRC’s view is that the grounds for the lawful processing of personal data are fundamental and should not be subject to modification by subordinate legislation.
The letter from the Minister, the noble Lord, Lord Vallance, to the Constitution Committee and the DPRRC pretty much reiterates those arguments. I will not go through all of it again, but I note, in closing, that in his letter he said:
“I hope it will reassure the Committee that the power will be used only when necessary and in the public interest”.
He could have come forward with an amendment to that effect at any point in the passage of the Bill, but he has not. I hope that, on reflection—in the light of both committees’ repeated recommendations, the potential threats to individual privacy and data adequacy, and the lack of strong justification for these powers—the Minister will accept these two amendments. I beg to move.
My Lords, I must inform the House that if Amendment 17 is agreed to, I cannot call Amendment 18 for reasons of pre-emption.
My Lords, I thank the noble Lord, Lord Clement-Jones, for raising these significant issues. While I share some of the concerns expressed, I find myself unable—at least for the moment—to offer support for the amendments in their current form.
Amendment 17 seeks to remove the powers granted to the Secretary of State to override primary legislation and to modify aspects of UK data protection law via statutory instrument. I agree with the principle underpinning this amendment: that any changes to data protection law must be subject to appropriate scrutiny. It is essential that parliamentary oversight remains robust and meaningful, particularly when it comes to matters as sensitive and far-reaching as data protection.
However, my hesitation lies in the practical implications of the amendment. While I sympathise with the call for greater transparency, I would welcome more detail on how this oversight mechanism might work in practice. Would it involve enhanced scrutiny procedures or a stronger role for relevant parliamentary committees? I fear that, without this clarity, we risk creating uncertainty in an area that requires, above all, precision and confidence.
The Minister’s Amendment 18 inserts specific protections for children’s personal data into the UK GDPR framework. The Government have rightly emphasised the importance of safeguarding children in the digital age. I commend the intention behind the amendment and agree wholeheartedly that children deserve special protections when it comes to the processing of their personal data.
It is worth noting that this is a government amendment to their own Bill. While Governments amending their own legislation is not unprecedented—the previous Government may have indulged in the practice from time to time—it is a practice that can give rise to questions. I will leave my comments there; obviously it is not ideal, but these things happen.
Finally, Amendment 21, also tabled by the noble Lord, Lord Clement-Jones, mirrors Amendment 17 in seeking to curtail the Secretary of State’s powers to amend primary legislation via statutory instrument. My earlier comments on the importance of parliamentary oversight apply here. As with Amendment 17, I am of course supportive of the principle. The delegation of such significant powers to the Executive should not proceed without robust scrutiny. However, I would appreciate greater clarity on how this proposed mechanism would function in practice. As it stands, I fear that the amendment raises too many questions. If these concerns could be addressed, I would be most grateful.
In conclusion, these amendments raise important points about the balance of power between the Executive and Parliament, as well as the protection of vulnerable individuals in the digital sphere. I look forward to hearing more detail and clarity, so that we can move forward with confidence.
My Lords, government Amendment 18 is similar to government Amendment 40 in the previous group, which added an express reference to children meriting specific protection to the new ICO duty. This amendment will give further emphasis to the need for the Secretary of State to consider the fact that children merit specific protection when deciding whether to use powers to amend the list of recognised legitimate interests.
Turning to Amendment 17 from the noble Lord, Lord Clement-Jones, I understand the concerns that have been raised about the Secretary of State’s power to add or vary the list of recognised legitimate interests. This amendment seeks to remove the power from the Bill.
In response to some of the earlier comments, including from the committees, I want to make it clear that we have constrained these powers more tightly than they were in the previous data Bill. Before making any changes, the Secretary of State must consider the rights and freedoms of individuals, paying particular attention to children, who may be less aware of the risks associated with data processing. Furthermore, any addition to the list must meet strict criteria, ensuring that it serves a clear and necessary public interest objective as described in Article 23(1) of the UK GDPR.
The Secretary of State is required to consult the Information Commissioner and other stakeholders before making any changes, and any regulations must then undergo the affirmative resolution procedure, guaranteeing parliamentary scrutiny through debates in both Houses. Retaining this regulation-making power would allow the Government to respond quickly if future public interest activities are identified that should be added to the list of recognised legitimate interests. However, the robust safeguards and limitations in Clause 70 will ensure that these powers are used both sparingly and responsibly.
I turn now to Amendment 21. As was set out in Committee, there is already a relevant power in the current Data Protection Act to provide exceptions. We are relocating the existing exemptions, so the current power, so far as it relates to the purpose limitation principle, will no longer be relevant. The power in Clause 71 is intended to take its place. In seeking to reassure noble Lords, I want to reiterate that the power cannot be used for purposes other than the public interest objectives listed in Article 23(1) of the UK GDPR. It is vital that the Government can act quickly to ensure that public interest processing is not blocked. If an exemption is misused, the power will also ensure that action can be swiftly taken to protect data subjects by placing extra safeguards or limitations on it.
My Lords, I thank the Minister for that considered reply. It went into more detail than the letter he sent to the two committees, so I am grateful for that, and it illuminated the situation somewhat. But at the end of the day, the Minister is obviously intent on retaining the regulation-making power.
I thank the noble Viscount, Lord Camrose, for his support—sort of—in principle. I am not quite sure where that fitted; it was post-ministerial language. I think he needs to throw off the shackles of ministerial life and live a little. These habits die hard but in due course, he will come to realise that there are benefits in supporting amendments that do not give too much ministerial power.
Turning to one point of principle—I am not going to press either amendment—it is a worrying trend that both the previous Government and this Government seem intent on simply steamrollering through powers for Secretaries of State in the face of pretty considered comment by House of Lords committees. This trend has been noted, first for skeletal Bills and secondly for Bills that, despite not being skeletal, include a lot of regulation-making power for Secretaries of State, and Henry VIII powers. So I just issue a warning that we will keep returning to this theme and we will keep supporting and respecting committees of this House, which spend a great deal of time scrutinising secondary legislation and warning of overweening executive power. In the meantime, I beg leave to withdraw Amendment 17.
My Lords, I will speak to Amendment 24 in my name and in the names of the noble Lords, Lord Clement-Jones and Lord Stevenson, and my noble friend Lord Black of Brentwood, all of whom I want to thank for their support. I also welcome government Amendment 49.
Amendment 24 concerns the use of the open electoral register, an issue we debated last year in considering the Data Protection and Digital Information Bill, and through the course of this Bill. Noble Lords may think this a small, technical and unimportant issue—certainly at this time of the evening. I have taken it on because it is emblematic of the challenge we face in this country in growing our economy.
Everyone wants strong economic growth. We know that the Government do. We know that the Chancellor has been challenging all regulators to come up with ideas to create growth. This is an example of a regulator hampering growth, and we in this House have an opportunity to do something about it. Those of us who have run businesses know that often, it is in the detail of the regulation that the dead hand of the state does its greatest damage. Because each change is very detailed and affects only a tiny part of the economy, the changes get through the bureaucracy unnoticed and quietly stifle growth. This is one of those examples.
My Lords, I do not think the noble Baroness, Lady Harding, lost the audience at all; she made an excellent case. Before speaking in support of the noble Baroness, I should say, “Blink, and you lose a whole group of amendments”. We seem to have completely lost sight of the group starting with Amendment 19—I know the noble Lord, Lord Holmes, is not here—and including Amendments 23, 74 and government Amendment 76, which seems to have been overlooked. I suggest that we degroup next week and come back to Amendments 74 and 76. I do not know what will happen to Amendment 23; I am sure there is a cunning plan on the Opposition Front Bench to reinstate that in some shape or form. I just thought I would gently point that out, since we are speeding along and forgetting some of the very valuable amendments that have been tabled.
I very much support, as I did in Committee, what the noble Baroness, Lady Harding, said about Amendment 24, which aims to clarify the use of open electoral register data for direct marketing. The core issue is the interpretation of Article 14 of the GDPR, specifically regarding the disproportionate effort exemption. The current interpretation, influenced by recent tribunal rulings, suggests that companies using open electoral register—OER—data would need to notify every individual whose data is used, even if they have not opted out. As the noble Baroness, Lady Harding, implied, notifying millions of individuals who have not opted out is unnecessary and burdensome. Citizens are generally aware of the OER system, and those who do not opt out reasonably expect to receive direct marketing materials. The current interpretation leads to excessive, unhelpful notifications.
There are issues about financial viability. Requiring individual notifications for the entire OER would be financially prohibitive for companies, potentially leading them to cease using the register altogether. On respect for citizens’ choice, around 37% of voters choose not to opt out of OER use for direct marketing, indicating their consent to such use. The amendment upholds this choice by exempting companies from notifying those individuals, which aligns with the GDPR’s principle of respecting data subject consent.
On clarity and certainty, Amendment 24 provides clear exemptions for OER data use, offering legal certainty for companies while maintaining data privacy and adequacy. This addresses the concerns about those very important tribunal rulings creating ambiguity and potentially disrupting legitimate data use. In essence, Amendment 24 seeks to reconcile the use of OER data for direct marketing with the principles of transparency and data subject rights. On that basis, we on these Benches support it.
I turn to my amendment, which seeks a soft opt-in for charities. As we discussed in Committee, a soft opt-in in Regulation 22 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 allows organisations to send electronic mail marketing to existing customers without their consent, provided that the communication is for similar products and services and the messages include an “unsubscribe” link. The soft opt-in currently does not apply to non-commercial organisations such as charities and membership organisations. The Data & Marketing Association estimates that extending the soft opt-in to charities would
“increase … annual donations in the UK by £290 million”.
Extending the soft opt-in as proposed in both the Minister’s and my amendment would provide charities with a level playing field, as businesses have enjoyed this benefit since the introduction of the Privacy and Electronic Communications Regulations. Charities across the UK support this change. For example, the CEO of Mind stated:
“Mind’s ability to reach people who care about mental health is vital. We cannot deliver life changing mental health services without the financial support we receive from the public”.
Oxfam’s individual engagement director noted:
“It’s now time to finally level the playing field for charities too and to allow them to similarly engage their passionate and committed audiences”.
Topically, too, this amendment is crucial to help charities overcome the financial challenges they face due to the cost of living crisis and the recent increase in employer national insurance contributions. So I am delighted, as I know many other charities will be, that the Government have proposed Amendment 49, which achieves the same effect as my Amendment 50.
My Lords, I declare an interest that my younger daughter works for a charity which will rely heavily on the amendments that have just been discussed by the noble Lord, Lord Clement-Jones.
I want to explain that my support for the amendment moved by the noble Baroness, Lady Harding, was not inspired by any quid pro quo for earlier support elsewhere—certainly not. Looking through the information she had provided, and thinking about the issue and what she said in her speech today, it seemed there was an obvious injustice happening. It seemed wrong, in a period when we are trying to support growth, that we cannot see our way through it. It was in that spirit that I suggested we should push on with it and bring it back on Report, and I am very happy to support it.
I do not want to try the patience of the House at this late hour. I am unhappy about Clause 77 as a whole. Had I had the opportunity, we could have debated it in Committee; unfortunately, I was double-booked, so was unable. Now we are on Report, which does not really provide a platform for discussing the exclusion of the clause.
However, the noble Baroness has provided an opportunity for me to make the point that combining data is the weak point, the point at which we lose control. For that reason, I am unhappy about this amendment. We need to keep high levels of vigilance with regard to the ability to take data from one area and apply it in another, because that is when personal privacy disappears.
My Lords, as we reach the end of this important group, I thank particularly my noble friend Lady Harding for her contribution and detailed account of some of the issues being faced, which I found both interesting and valuable. I thought the example about the jazz concert requiring the combination of those different types of data was very illuminating. These proposed changes provide us with the opportunity to balance economic growth carefully against the fundamental right to data privacy, ensuring that the Bill serves all stakeholders fairly.
Amendment 24 introduces a significant consideration regarding the use of the open electoral register for direct marketing purposes. The proposal to include data from the OER, combined with personal data from other sources, to build marketing profiles creates a range of issues that require careful consideration.
Amendment 24 stipulates that transparency obligations must be fulfilled when individuals provide additional data to a data provider, and that this transparency should be reflected both in the privacy policy and via a data notification in a direct mail pack. While there is certainly potential to use the OER to enhance marketing efforts and support economic activity, we have to remain vigilant to the privacy implications. We need to make sure that individuals are informed of how and where their OER data is being processed, especially when it is combined with other data sources to build profiles.
The requirement for transparency is a positive step, but it is essential that these obligations are fully enforced and that individuals are not left in the dark about how their personal information is being used. I hope the Minister will explain a little more about how these transparency obligations will be implemented in practice and whether additional safeguards are proposed.
Amendment 49 introduces a change to Regulation 22, creating an exception for charities to use electronic mail for direct marketing in specific circumstances. This amendment enables charities to send direct marketing emails when the sole purpose is to further one or more of their charitable purposes, provided that certain conditions are met. These conditions include that the charity obtained the recipient’s contact details when the individual expressed interest in the charity or offered previous support for the charity. This provision recognises the role of charities in fundraising and that their need to communicate with volunteers, supporters or potential donors is vital for their work.
However, I understand the argument that we must ensure that the use of email marketing does not become intrusive or exploitative. The amendment requires that recipients are clearly informed about their right to refuse future marketing communications and that this option is available both when the data is first collected and with every subsequent communication. This helps strike the right balance between enabling charities to raise funds for their causes and protecting individuals from unwanted marketing.
I welcome the Government’s commitment to ensuring that charities continue to engage with their supporters while respecting individuals’ right to privacy. However, it is essential that these safeguards are robustly enforced to prevent exploitation. Again, I look forward to hearing from the Minister on how the Government plan to ensure that their provisions will be properly implemented and monitored.
Amendment 50 introduces the concept of soft opt-ins for email marketing by charities, allowing them to connect with individuals who have previously expressed interest in their charitable causes. This can help charities maintain and grow their supporter base but, again, we must strike the right balance with the broader impact this could have on people in receipt of this correspondence. It is crucial that any system put in place respects individuals’ right to privacy and their ability to opt out easily. We must ensure that charities provide a clear, simple and accessible way for individuals to refuse future communications, and that this option is consistently available.
Finally, we should also consider the rules governing the use of personal data by political parties. This is, of course, an area where we must ensure that transparency, accountability and privacy are paramount. Political parties, like any other organisation, must be held to the highest standards in their handling of personal data. I hope the Government can offer some clear guidance on improving and strengthening the rules surrounding data use by political parties to ensure that individuals’ rights are fully respected and protected.
My Lords, I now turn to government Amendment 49. I thank the noble Lord, Lord Clement-Jones, and other noble Lords for raising the concerns of the charity sector during earlier debates. The Government have also heard from charities and trade associations directly.
This amendment will permit charities to send marketing material—for example, promoting campaigns or fundraising activities—to people who have previously expressed an interest in their charitable purposes, without seeking express consent. Charities will have to provide individuals with a simple means of opting out of receiving direct marketing when their contact details are collected and with every subsequent message sent. The current soft opt-in rule for marketing products and services has similar requirements.
Turning to Amendment 24, I am grateful to the noble Baroness, Lady Harding, for our discussions on this matter. As was said in the debate in Grand Committee, the Government are committed to upholding the principles of transparency. I will try to outline some of that.
I understand that this amendment is about data brokers buying data from the open electoral register and combining it with data they have collected from other sources to build profiles on individuals with the intention of selling them for marketing. Despite what was said in the last debate on this, I am not convinced that all individuals registering on the open electoral register would reasonably expect this kind of profiling or invisible processing using their personal data. If individuals are unaware of the processing, this undermines their ability to exercise their other rights, such as to object to the processing. That point was well made by the noble Lord, Lord Davies.
With regard to the open electoral register, the Government absolutely agree that there are potential benefits to society through its use—indeed, economic growth has been mentioned. Notification is not necessary in all cases. There is, for example, an exemption if notifying the data subject would involve a disproportionate effort and the data was not collected directly from them. The impact on the data subject must be considered when assessing whether the effort is disproportionate. If notification is proportionate, the controller must notify.
The ICO considers that the use and sale of open electoral register data alone is unlikely to require notification. As was set out in Committee, the Government believe that controllers should continue to assess on a case-by-case basis whether cases meet the conditions for the existing disproportionate effort exemption. Moreover, I hope I can reassure the noble Baroness that in the event that the data subject already has the information—from another controller, for example—another exemption from notification applies.
The Government therefore do not see a case for a new exemption for this activity, but as requested by the noble Baroness, Lady Harding, I would be happy to facilitate further engagement between the industry and the ICO to improve a common understanding of how available exemptions are to be applied on a case-by-case basis. I understand that the ICO will use the Bill as an opportunity to take stock of how its guidance can address particular issues that organisations face.
Amendment 50, tabled by the noble Lord, Lord Clement-Jones, seeks to achieve a very similar thing to the government amendment and we studied it when designing our amendment. The key difference is that the government amendment defines which organisations can rely on the new measure and for what purposes, drawing on definitions of “charity” and “charitable purpose” in relevant charities legislation.
I trust that the noble Lord will be content with this government amendment and feel able not to press his own.
Before the Minister sits down, can I follow up and ask a question about invisible processing? I wonder whether he considers that a better way of addressing potential concerns about invisible processing is improving the privacy notices when people originally sign up for the open electoral register. That would mean making it clear how your data could be used when you say you are happy to be on the open electoral register, rather than creating extra work and potentially confusing communication with people after that. Can the Minister confirm that that would be in scope of potential options and further discussions with the ICO?
The further discussions with the ICO are exactly to try to get to these points about the right way to do it. It is important that people know what they are signing up for, and it is equally important that they are aware that they can withdraw at any point. Those points obviously need to be discussed with the industry to make sure that everyone is clear about the rules.
I thank noble Lords for having humoured me in the detail of this debate. I am very pleased to hear that response from the Minister and look forward to ongoing discussions with the ICO and the companies involved. As such, I beg leave to withdraw my amendment.
My Lords, I rise to speak to Amendments 26, 31 and 32 tabled in my name and that of my noble friend Lord Markham. I will address the amendments in reverse order.
Amendment 32 would ensure that, where a significant decision is taken by ADM, the data subject was able to request intervention by a human with sufficient competency and authority. While that is clearly the existing intent of the ADM provisions in the Bill, this amendment brings further clarity. I am concerned that, where data processors update their ADM procedures in the light of this Bill, it should be abundantly clear to them at every stage what the requirements are and that, as currently written, there may be a risk of misunderstanding. Given the significance of decisions that may be made by ADM, we should make sure this does not happen. Data subjects must have recourse to a person who both understands their problem and is able to do something about it. I look forward to hearing the Minister’s views on this.
Amendment 31 would require the Secretary of State to provide guidance on how consent should be obtained for ADM involving special category data. It would also ensure that this guidance was readily available and reviewed frequently. The amendment would provide guidance for data controllers who wish to use ADM, helping them to set clear processes for obtaining consent, thus avoiding complaints and potential litigation.
We all know that litigation can be slow, disruptive and sometimes prohibitively expensive. If we want to encourage the use of ADM so that customers and businesses can save both time and money, we should seek to ensure that the sector does not become a hotbed of litigation. The risk can be mitigated by providing ample guidance for the sector. For relatively minimal effort on the part of the Secretary of State, we may be able to facilitate substantial growth in the use and benefits of ADM. I would be most curious to hear the Minister’s opinions on this matter and, indeed, the opinions of noble Lords more broadly.
Amendment 26 would insert the five principles set out in the AI White Paper published by the previous Government, requiring all data controllers and processors who partake in AI-driven ADM to have due regard for them. In the event that noble Lords are not familiar with these principles, they are: safety, security and robustness; appropriate transparency and explainability; fairness; accountability and governance; and contestability and redress.
These principles for safe AI are based on those originally developed with the OECD and have been the subject of extensive consultation. They have been refined and very positively received by developers, public sector organisations, private sector organisations and civil society. They offer real, and popular, safeguards against the risks of AI while continuing to foster innovation.
My Lords, I will speak to Amendments 28, 29, 33, 34 and 36. I give notice that I will only speak formally to Amendment 33. For some reason, it seems to have escaped this group and jumped into the next one.
As we discussed in Committee, and indeed on its previous versions, the Bill removes the general prohibition on solely automated decisions and places the responsibility on individuals to enforce their rights rather than on companies to demonstrate why automation is permissible. The Bill also amends Article 22 of the GDPR so that protection against solely automated decision-making applies only to decisions made using sensitive data such as race, religion and health data. This means that decisions based on other personal data, such as postcode, nationality, sex or gender, would be subject to weaker safeguards, increasing the risk of unfair or discriminatory outcomes. This will allow more decisions with potentially significant impacts to be made without human oversight, even if they do not involve sensitive data. This represents a significant weakening of existing protection against unsafe automated decision-making. That is why I tabled Amendment 33 to leave out the whole clause.
However, the Bill replaces the existing Article 22 with Articles 22A to 22D, which redefine automated decisions and allow for solely automated decision-making in a broader range of circumstances. This change raises concerns about transparency and the ability of individuals to challenge automated decisions. Individuals may not be notified about the use of ADM, making it difficult to exercise their rights. Moreover, the Bill’s safeguards for automated decisions, particularly in the context of law enforcement, are weaker compared with the protections offered by the existing Article 22. This raises serious concerns about the potential for infringement of people’s rights and liberties in areas such as policing, where the use of sensitive data in ADM could become more prevalent. Additionally, the lack of clear requirements for personalised explanations about how ADM systems reach decisions further limits individuals’ understanding of and ability to challenge outcomes.
In the view of these Benches, the Bill significantly weakens safeguards around ADM, creates legal uncertainty due to vague definitions, increases the risk of discrimination, and limits transparency and redress for individuals—ultimately undermining public trust in the use of these technologies. I retabled Amendments 28, 29, 33 and 34 from Committee to address continuing concerns regarding these systems. The Bill lacks clear definitions of crucial terms such as “meaningful human involvement” and “similarly significant effect”, which are essential for determining the scope of protection. That lack of clarity could lead to varying interpretations and inconsistencies in application, creating legal uncertainty for individuals and organisations.
In Committee, the noble Baroness, Lady Jones, emphasised the Government’s commitment to responsible ADM and argued against defining meaningful human involvement in the Bill, preferring instead to allow the Secretary of State to define those terms through delegated legislation. However, that raises concerns about transparency and parliamentary oversight, as these are significant policy decisions. Predominantly automated decision-making should be included in Clause 80, as proposed in Amendment 28, because a decision may lack meaningful human involvement and still significantly impact individuals’ rights. The assertion by the noble Baroness, Lady Jones, that predominantly automated decisions inherently involve meaningful human oversight can be contested, particularly given the lack of a clear definition of such involvement in the Bill.
There are concerns that changes in the Bill will increase the risk of discrimination, especially for marginalised groups. The noble Baroness, Lady Jones, asserted in Committee that the data protection framework already requires adherence to the Equality Act. However, that is not enough to prevent algorithmic bias and discrimination in ADM systems. There is a need for mandatory bias assessments of all ADM systems, particularly those used in the public sector, as well as for greater transparency in how those systems are developed and deployed.
We have not returned to the fray on the ATRS, but it is clear that a statutory framework for the ATRS is necessary to ensure its effectiveness and build trust in public sector AI. Despite the assurance by the noble Baroness, Lady Jones, that the ATRS is mandatory for government departments, its implementation relies on a cross-government policy mandate that lacks statutory backing and may prove insufficient to ensure the consistent and transparent use of algorithmic tools.
My Amendment 34 seeks to establish requirements for public sector organisations using ADM systems. Its aim is to ensure transparency and accountability in the use of these systems by requiring public authorities to publish details of the systems they use, including the purpose of the system, the data used and any mitigating measures to address risks. I very much welcome Amendment 35 from the noble Baroness, Lady Freeman, which would improve it considerably and which I have also signed. Will the ATRS do as good a job as that amendment?
Concerns persist about the accessibility and effectiveness of this mechanism for individuals seeking redress against potentially harmful automated decisions. A more streamlined and user-friendly process for challenging automated decisions is needed in the age of increasing ADM. The lack of clarity and specific provisions in the Bill raises concerns about its effectiveness in mitigating the risks posed by automated systems, particularly in safeguarding vulnerable groups such as children.
My Amendment 36 would require the Secretary of State to produce a definition of “meaningful human involvement” in ADM in collaboration with the Information Commissioner’s Office, or to clearly set out their reasoning as to why that is not required within six months of the Act passing. The amendment is aimed at addressing the ambiguity surrounding “meaningful human involvement” and ensuring that there is a clear understanding of what constitutes appropriate human oversight in ADM processes.
I am pleased that the Minister has promised a code of practice, but what assurance can he give regarding the forthcoming ICO code of practice about automated decision-making? How will it provide clear guidance on how to implement and interpret the safeguards for ADM, and will it address the definition of meaningful human involvement? What forms of redress will it require to be established? What level of transparency will be required? A code of conduct offered by the Minister would be acceptable, provided that the Secretary of State did not have the sole right to determine the definition of meaningful human involvement. I therefore hope that my Amendment 29 will be accepted alongside Amendment 36, because it is important that the definition of such a crucial term should be developed independently, and with the appropriate expertise, to ensure that ADM systems are used fairly and responsibly, and that individual rights are adequately protected.
Amendments 31 and 32 from the Opposition Front Bench seem to me to have considerable merit, particularly Amendment 32, in terms of the nature of the human intervention. However, I confess to some bafflement as to the reasons for Amendment 26, which seeks to insert the OECD principles set out in the AI White Paper. Indeed, they were the G20 principles as well and are fully supportable in the context of an AI Bill, for instance, and I very much hope that they will form Clause 1 of a new AI Bill going forward. I am not going to go into great detail, but I wonder whether those principles are already effectively addressed in data protection legislation. If we are not careful, we are going to find a very confused regulator in these circumstances. So, although there is much to commend the principles as such, whether they are a practical proposition in a Bill of this nature is rather moot.
My Lords, I support Amendment 34 from the noble Lord, Lord Clement-Jones, and will speak to my own Amendment 35, which amends it. When an algorithm is being used to make important decisions about our lives, it is vital that everyone is aware of what it is doing and what data it is based on. On Amendment 34, I know from having had responsibility for algorithmic decision support tools that users are very interested in how recent the data it is based on is, and how relevant it is to them. Was the algorithm derived from a population that included people who share their characteristics? Subsection (1)(c)(ii) of the new clause proposed in Amendment 34 refers to regular assessment of the data used by the system. I would hope that this would be part of the meaningful explanation to individuals to be prescribed by the Secretary of State in subsection (1)(b).
Amendment 35 would add to this that it is vital that all users and procurers of such a system understand its real-world efficacy. I use the word “efficacy” rather than “accuracy” because it might be difficult to define accuracy with regard to some of these systems. The procurer of any ADM system should want to know how accurate it is using realistic testing, and users should also be aware of those findings. Does the system give the same outcome as a human assessor 95% or 60% of the time? Is that the same for all kinds of queries, or is it more accurate for some groups of people than others? The efficacy is really one of the most important aspects and should be public. I have added an extra line that ensures that this declaration of efficacy would be kept updated. One would hope that the performance of any such system would be monitored anyway, but this ensures that the outcomes of such monitoring are in the public domain.
In Committee, the Minister advised us to wait for publication of the algorithmic transparency records that were released in December. Looking at them, I think they make clear the much greater need for guidance and stringency in what should be mandated. I will give two short examples from those records. For the DBT: Find Exporters algorithm, under “Model performance” it merely says that it uses Brier scoring and other methods, without giving any actual results of that testing to indicate how well it performs. It suggests looking at the GitHub pages. I followed that link, and it did not allow me in. The public have no access to those pages. This is why these performance declarations need to be mandated and forced to be in the public domain.
In the second example, the Cambridgeshire trial of an externally supplied object detection system just cites the company’s test data, claiming average precision in a “testing environment” of 43.5%. This does not give the user a lot of information. Again, it links to GitHub pages produced by the supplier. Admittedly, this is a trial, so perhaps the Cambridgeshire Partnership will update it with its real-world trial data. But that is why we need to ensure annual updates of performance data and ensure that that data is not just a report of the supplier’s claims in a test environment.
The current model of algorithmic transparency records is demonstrably not fit for purpose, and these provisions would help put them on a much firmer footing. These systems, after all, are making life-changing decisions for all of us and we all need to be sure how well they are doing and put appropriate levels of trust in them accordingly.
My Lords, I have added my name to Amendment 36 tabled by the noble Lord, Lord Clement-Jones. I also support Amendments 26, 27, 28, 31, 32 and 35. The Government, in their AI Statement last week, said that ADM will be rolled out across the public sector in the coming months and years. It will increase productivity and provide better public services to the people of this country.
However, there are many people who are fearful of their details being taken by an advanced computer, and a decision which could affect their lives being made by that computer. Surely the days of “computer says no” must be over. People need to know that there is a possibility of a human being involved in the process, particularly when dealing with the public sector. I am afraid that my own interactions with public sector software in various government departments have not always been happy ones, and I have been grateful to be able to appeal to a human.
My Lords, I support what the noble Baroness, Lady Freeman, said. Her maiden speech was a forewarning of how good her subsequent speeches would be and how dedicated she is to openness, which is absolutely crucial in this area. We are going to have to get used to a lot of automatic processes and come to consider that they are by and large fair. Unless we are able to challenge it, understand it and see that it has been properly looked after, we are not going to develop that degree of trust in it.
Anyone who has used current AI programs will know about the capacity of AI for hallucination. The noble Lord, Lord Clement-Jones, uses them a lot. I have been looking, with the noble Lord, Lord Saatchi, at how we could use them in this House to deal with the huge information flows we have and to help us understand the depths of some of the bigger problems and challenges we are asked to get a grip on. But AI can just invent things, leaping at an answer that is easier to find, ignoring two-thirds of the evidence and not understanding the difference between reliable and unreliable witnesses.
There is so much potential, but there is so much that needs to be done to make AI something we can comfortably rely on. The only way to get there is to be absolutely open and allow and encourage challenge. The direction pointed out by the noble Lord, Lord Clement-Jones, and, most particularly by the noble Baroness, Lady Freeman, is one that I very much think we should follow.
My Lords, I will very briefly speak to Amendment 30 in my name. Curiously, it was in the name of the noble Viscount, Lord Camrose, in Committee, but somehow it has jumped.
On the whole, I have always advocated for age-appropriate solutions. The amendment refers to preventing children consenting to special category data being used in automated decision-making, simply because there are some things that children should not be able to consent to.
I am not sure that this exact amendment is the answer. I hope that the previous conversation that we had before the dinner break will produce some thought about this issue—about how automatic decision-making affects children specifically—and we can deal with it in a slightly different way.
While I am on my feet, I want to say that I was very struck by the words of my noble friend Lady Freeman, particularly about efficacy. I have seen so many things that have purported to work in clinical conditions that have failed to work in the complexity of real life, and I want to associate myself with her words and, indeed, the amendments in her name and that of the noble Lord, Lord Clement-Jones.
I start with Amendment 26, tabled by the noble Viscount, Lord Camrose. As he said in Committee, a principles-based approach ensures that our rules remain fit in the face of fast-evolving technologies by avoiding being overly prescriptive. The data protection framework achieves this by requiring organisations to apply data protection principles when personal data is processed, regardless of the technology used.
I agree with the principles that are present for AI, which are useful in the context in which they were put together, but introducing separate principles for AI could cause confusion around how data protection principles are interpreted when using other technologies. I note the comment that there is a significant overlap between the principles, and the comment from the noble Viscount that there are situations in which one would catch things and another would not. I am unable to see what those particular examples are, and I hope that the noble Viscount will agree with the Government’s rationale for seeking to protect the framework’s technology-neutral set of principles, rather than having two separate sets.
Amendment 28 from the noble Lord, Lord Clement-Jones, would extend the existing safeguards for decisions based on solely automated processing to decisions based on predominantly automated processing. These safeguards protect people when there is no meaningful human involvement in the decision-making. The introduction of predominantly automated decision-making, which already includes meaningful human involvement—and I shall say a bit more about that in a minute—could create uncertainty over when the safeguards are required. This may deter controllers from using automated systems that have significant benefits for individuals and society at large. However, the Government agree with the noble Lord on strengthening the protections for individuals, which is why we have introduced a definition of solely automated decision-making as one which lacks “meaningful human involvement”.
I thank noble Lords for Amendments 29 and 36 and the important points raised in Committee on the definition of “meaningful human involvement”. This terminology, introduced in the Bill, goes beyond the current UK GDPR wording to prevent cursory human involvement being used to rubber stamp decisions as not being solely automated. The point at which human involvement becomes meaningful is context specific, which is why we have not sought to be prescriptive in the Bill. The ICO sets out in its guidance its interpretation that meaningful human involvement must be active: someone must review the decision and have the discretion to alter it before the decision is applied. The Government’s introduction of “meaningful” into primary legislation does not change this definition, and we are supportive of the ICO’s guidance in this space.
As such, the Government agree on the importance of the ICO continuing to provide its views on the interpretation of terms used in the legislation. Our reforms do not remove the ICO’s ability to do this, or to advise Parliament or the Government if it considers that the law needs clarification. The Government also acknowledge that there may be a need to provide further legal certainty in future. That is why there are a number of regulation-making powers in Article 22D, including the power to describe meaningful human involvement or to add additional safeguards. These could be used, for example, to impose a timeline on controllers to provide human intervention upon the request of the data subject, if evidence suggested that this was not happening in a timely manner following implementation of these reforms. Any regulations must follow consultation with the ICO.
Amendment 30 from the noble Baroness, Lady Kidron, would prevent law enforcement agencies seeking the consent of a young person to the processing of their special category or sensitive personal data when using automated decision-making. I thank her for this amendment and agree about the importance of protecting the sensitive personal data of children and young adults. We believe that automated decision-making will continue to be rarely deployed in the context of law enforcement decision-making as a whole.
Likewise, consent is rarely used as a lawful basis for processing by law enforcement agencies, which are far more likely to process personal data for the performance of a task, such as questioning a suspect or gathering evidence, as part of a law enforcement process. Where consent is needed—for example, when asking a victim for fingerprints or something else—noble Lords will be aware that Clause 69 clearly defines consent under the law enforcement regime as
“freely given, specific, informed and unambiguous”
and
“as easy … to withdraw … as to give”.
So the tight restrictions on its use will be crystal clear to law enforcement agencies. In summary, I believe the taking of an automated decision based on a young person’s sensitive personal data, processed with their consent, to be an extremely rare scenario. Even when it happens, the safeguards that apply to all sensitive processing will still apply.
I thank the noble Viscount, Lord Camrose, for Amendments 31 and 32. Amendment 31 would require the Secretary of State to publish guidance specifying how law enforcement agencies should go about obtaining the consent of the data subject to process their data. To reiterate a point made by my noble friend Lady Jones in Committee, Clause 69 already provides a definition of “consent” and sets out the conditions for its use; they apply to all processing under the law enforcement regime, not just automated decision-making, so the Government believe this amendment is unnecessary.
Amendment 32 would require the person reviewing an automated decision to have sufficient competence and authority to amend the decision if required. In Committee, the noble Viscount also expressed the view that a person should be “suitably qualified”. Of course, I agree with him on that. However, as my noble friend Lady Jones said in Committee, the Information Commissioner’s Office has already issued guidance which makes it clear that the individual who reconsiders an automated decision must have the “authority and competence” to change it. Consequently, the Government do not feel that it is necessary to add further restrictions in the Bill as to the type of person who can carry out such a review.
The noble Baroness, Lady Freeman, raised extremely important points about the performance of automated decision-making. The Government already provide a range of products, but A Blueprint for Modern Digital Government, laid this morning, makes it clear that part of the new digital centre’s role will be to offer specialist assurance support, including, importantly in relation to this debate,
“a service to rigorously test models and products before release”.
That function will be in place and available to departments.
On Amendments 34 and 35, my noble friend Lady Jones previously advised the noble Lord, Lord Clement-Jones, that the Government would publish new algorithmic transparency recording standard records imminently. I am pleased to say that 14 new records were published on 17 December, with more to follow. I accept that these are not yet in the state in which we would wish them to be. Where these amendments seek to ensure that the efficacy of such systems is evaluated, A Blueprint for Modern Digital Government, as I have said, makes it clear that part of the digital centre’s role will be to offer such support, including this service. I hope that this provides reassurance.
My Lords, before the Minister sits down, I was given considerable assurance between Committee and Report that a code of practice, drawn up with the ICO, would be quite detailed in how it set out the requirements for those engaging in automated decision-making. The Minister seems to have given some kind of assurance that it is possible that the ICO will come forward with the appropriate provisions, but he has not really given any detail as to what that might consist of and whether that might meet some of the considerations that have been raised in Committee and on Report, not least Amendments 34 and 35, which have just been discussed as if the ATRS was going to cover all of that. Of course, any code would no doubt cover both the public and private sectors. What more can the Minister say about the kind of code that would be expected? We seem to be in somewhat of a limbo in this respect.
I apologise; I meant to deal with this at the end. I think I am dealing with the code in the next group.
Before the Minister sits down, he said that there will be evaluations of the efficacy of these systems but he did not mention whether those will have to be made public. Can he give me any assurance on that?
There is a requirement. Going back to the issue of principles, which was discussed earlier on, one of the existing principles—which I am now trying to locate and cannot—is transparency. I expect that we would make as much of the information public as we can in order to ensure good decision-making and assure people as to how the decisions have been reached.
I thank all noble Lords and the Minister for their comments and contributions to what has been a fascinating debate. I will start by commenting on the other amendments in this group before turning to those in my name.
First, on Amendments 28 and 29, I am rather more comfortable with the arrangements for meaningful human intervention set out in the Bill than the noble Lord, Lord Clement-Jones. For me, either a decision has meaningful human intervention or it does not. In the latter case, certain additional rights kick in. To me, that binary model is clear and straightforward, and could only be damaged by introducing some of the more analogue concepts such as “predominantly”, “principally”, “mainly” or “wholly”, so I am perfectly comfortable with that as it is.
However, I recognise that puts a lot of weight on to the precise meaning of “meaningful human involvement”. Amendment 36 in the name of the noble Lord, Lord Clement-Jones, which would require the Secretary of State to produce a definition of “meaningful human involvement” in ADM in collaboration with the ICO, seems to take on some value in those circumstances, so I am certainly more supportive of that one.
As for Amendments 34 and 35 in the names of the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Freeman, I absolutely recognise the value and potential of efficacy; I agree it is a very valuable term. I have more faith in the rollout and use of the ATRS but on a non-statutory basis, believing, as I do, that this would allow it to continue to develop in an agile and adaptive manner. I welcome the Minister’s words on this subject, and for now I remain comfortable that the ATRS is the direction forward for that.
I turn to the amendments in my name. I thank all noble Lords and, indeed, the Minister for their comments and contributions regarding Amendments 31 and 32. I very much take the Minister’s point that definitions of consent feature elsewhere in the Bill. That reduces my concern somewhat.
However, I continue to strongly commend Amendment 26 to the House. I believe it will foster innovation while protecting data rights. It is popular with the public and with private sector stakeholders. It will bring about outcomes that we all want to see in AI safety without stifling this new and exciting technology. In the absence of an AI Bill—and possibly even in the presence of one—it is the only AI-specific legislation that will be around. It is important somehow to get those AI principles in the Bill, at least until an AI Bill comes along. With this in mind, I wish to test the opinion of the House.
My Lords, we have waited with bated breath for the Minister to share his hand, and I very much hope that he will reveal the nature of his bountiful offer of a code of practice on the use of automated decision-making.
I will wear it as a badge of pride to be accused of introducing an analogue concept by the noble Viscount, Lord Camrose. I am still keen to see the word “predominantly” inserted into the Bill in reference to automated decision-making.
As the Minister can see, there is considerable unhappiness with the nature of Clause 80. There is a view that it does not sufficiently protect the citizen in the face of automated decision-making, so I hope that he will be able to elaborate further on the nature of those protections.
I will not steal any of the thunder of the noble Baroness, Lady Kidron. For some unaccountable reason, Amendment 33 is grouped with Amendment 41. The groupings on this Bill have been rather peculiar and at this time of night I do not think any long speeches are in order, but it is important that we at least have some debate about the importance of a code of conduct for the use of AI in education, because it is something that a great many people in the education sector believe is necessary. I beg to move.
My Lords, I shall speak to Amendment 41 in my name and in the names of my noble friend Lord Russell, the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones. The House can be forgiven if it is sensing a bit of déjà vu, since I have proposed this clause once or twice before. However, since Committee, a couple of things have happened that make the argument for the code more urgent. We have now heard that the Prime Minister thinks that regulating AI is “leaning out” when we should be, as the tech industry likes to say, leaning in. We have had Matt Clifford’s review, which does not mention children even once. In the meantime, we have seen the rollout of AI in almost all products and services that children use. At one of the companies—a household name that I will not mention—an employee was so concerned that they rang me to say that nothing had been checked except whether the platform would fall over.
Amendment 41 does not seek to solve what is a global issue: an industry arrogantly flying a little too close to the sun, without grasping how we could use this extraordinary technology and put it to work for humankind on a more equitable basis than the current extractive, winner-takes-all model. It is far more modest than that. It simply says that products and services that engage with kids should undertake a mandatory process that considers their specific vulnerabilities related to age. I want to stress this point. When we talk about AI, increasingly we imagine the spectre of diagnostic benefits or the multiple uses of generative models, but of course AI is neither new nor confined to these uses. It is all around us and, in particular, it is all around children.
In 2021, Amazon’s AI voice assistant, Alexa, instructed a 10-year-old to touch a live electrical plug with a coin. Last year, Snapchat’s My AI gave adult researchers posing as a 13-year-old girl tips on how to lose her virginity with a 31-year-old. Researchers were also able to obtain tips on how to hide the smell of alcohol and weed and how to conceal Snapchat conversations from their parents. Meanwhile, character.ai is being sued by the mother of a 14-year-old boy in Florida who died by suicide after becoming emotionally attached to a companion bot that encouraged him to take his own life.
In these cases, the companies in question responded by implementing safety measures after the fact, but how many children have to put their fingers in electrical sockets, injure themselves or take their own lives before we say that those measures should be mandatory? That is all that the proposed code does. It asks that companies consider the ways in which their products may impact on children and, having considered them, take steps to mitigate known risks and put procedures in place to deal with emerging risks.
One of the frustrating things about being an advocate for children in the digital world is how much time I spend articulating avoidable harms. The sorts of solutions that come after the event, or suggestions that we ban children from products and services, take away from the fact that the vast majority of products and services could, with a little forethought, be places of education, entertainment and personal growth for children. However, children are by definition not fully mature, which puts them at risk. They chat with smart speakers, disclosing details that grown-ups might consider private. One study found that three to six year-olds believed that smart speakers have thoughts, feelings and social abilities and are more reliable than human beings when it comes to answering fact-based questions.
I ask the Minister: should we ban children from the kitchen or living room in which the smart speaker lives, or demand, as we do of every other product and service, minimum standards of product safety based on the broad principle that we have a collective obligation to the safety and well-being of children? An AI code is not a stretch for the Bill. It is a bare minimum.
My Lords, I will speak very briefly, given the hour, just to reinforce three things that I have said as the wingman to the noble Baroness, Lady Kidron, many times, sadly, in this Chamber in child safety debates. The age-appropriate design code that we worked on together and which she championed a decade ago has driven real change. So we have evidence that setting in place codes of conduct that require technology companies to think in advance about the potential harms of their technologies genuinely drives change. That is point one.
Point two is that we all know that AI is a foundational technology which is already transforming the services that our children use. So we should be applying that same principle that was so hard fought 10 years ago for non-AI digital to this foundational technology. We know that, however well meaning, technology companies’ development stacks are always contended. They always have more good things that they think they can do to improve their products for their consumers, that will make them money, than they have the resources to do. However much money they have, they just are contended. That is the nature of technology businesses. This means that they never get to the safety-by-design issues unless they are required to. It was no different 150 or 200 years ago as electricity was rolling through the factories of the mill towns in the north of England. It required health and safety legislation. AI requires health and safety legislation. You start with codes of conduct and then you move forward, and I really do not think that we can wait.
My Lords, Amendment 41 aims to establish a code of practice for the use of children’s data in the development of AI technologies. In the face of rapidly advancing AI, it is, of course, crucial that we ensure children’s data is handled with the utmost care, prioritising their best interests and fundamental rights. We agree that AI systems that are likely to impact children should be designed to be safe and ethical by default. This code of practice will be instrumental in guiding data controllers to ensure that AI development and deployment reflect the specific needs and vulnerabilities of children.
However, although we support the intent behind the amendment, we have concerns—echoing those raised on amendments in a previous group—about the explicit reference to the UN Convention on the Rights of the Child and general comment 25. I will not rehearse my comments from earlier groups, except to say that it is important that we do not write explicit links to international frameworks, important as they are, into UK legislation.
In the light of this, although we firmly support the overall aim of safeguarding children’s data in AI, we believe this can be achieved more effectively by focusing on UK legal principles and ensuring that the code of practice is rooted in our domestic context.
I thank the noble Lord, Lord Clement-Jones, for Amendment 33, and the noble Baroness, Lady Kidron, for Amendment 41, and for their thoughtful comments on AI and automated decision-making throughout this Bill’s passage.
The Government have carefully considered these issues and agree that there is a need for greater guidance. I am pleased to say that we are committing to use our powers under the Data Protection Act to require the ICO, through secondary legislation, to produce a code of practice on AI and solely automated decision-making. This code will support controllers in complying with their data protection obligations through practical guidance. I reiterate that the Government are committed to this work as an early priority following Royal Assent. The secondary legislation will have to be approved by both Houses of Parliament, which means it will be scrutinised by parliamentarians in both Houses.
I can also reassure the noble Baroness that the code of practice will include guidance about protecting data subjects, including children. The new ICO duties set out in the Bill will ensure that where children’s interests are relevant to any activity the ICO is carrying out, it should consider the specific protection of children. This includes when preparing codes of practice, such as the one the Government are committing to in this area.
I understand that noble Lords will be keen to discuss the specific contents of the code. The ICO, as the independent data protection regulator, will have views as to the scope of the code and the topics it should cover. We should allow it time to develop those thoughts. The Government are also committed to engaging with noble Lords and other stakeholders after Royal Assent to make sure that we get this right. I hope noble Lords will agree that working closely together to prepare the secondary legislation to request this code is the right approach instead of pre-empting the exact scope.
The noble Lord, Lord Clement-Jones, mentioned edtech. I should add—I am getting into a habit now—that it is discussed in a future group.
Before the Minister sits down, I welcome his words, which are absolutely what we want to hear. I understand that the ICO is an independent regulator, but it is often the case that the scope and some of Parliament’s concerns are delivered to it from this House—or, indeed, from the other place. I wonder whether we could find an opportunity to make sure that the ICO hears Parliament’s wish on the scope of the children’s code, at least. I am sure the noble Lord, Lord Clement-Jones, will say similar on his own behalf.
It will be clear to the ICO from the amendments that have been tabled and my comments that there is an expectation that it should take into account the discussion we have had on this Bill.
My Lords, I thank the Minister for his very considered response. In the same way as the noble Baroness, Lady Kidron, I take it that, in effect, the Minister is pledging to engage directly with us and others on the nature and contents of the code, and that the ICO will also engage on that. As the Minister knows, the definition of terms such as “meaningful human involvement” is something that we will wish to discuss and consider in the course of that engagement. I hope that the AI edtech code will also be part of that.
I thank the Minister. I know he has had to think about this quite carefully during the Bill’s passage. Currently, Clause 80 is probably the weakest link in the Bill, and this amendment would go some considerable way towards repairing it. My final question is not to the Minister, but to the Opposition: what on earth have they got against the UN? In the meantime, I beg leave to withdraw my amendment.
My Lords, Amendment 37 is on the subject of data adequacy, which has been a consistent issue throughout the passage of the Bill. The mechanism put forward in the amendment would provide for a review of the question of data adequacy.
Safeguarding data adequacy is crucial for the UK’s economy and international partnerships. Losing data adequacy status would impose significant costs and administrative burdens on businesses and public sector organisations that share data between the UK and the EU. It would also hinder international trade and economic co-operation, and undermine trust in the UK’s digital economy, contradicting the Government’s objective of economic growth. I hope very much that the Government are proactively engaging with the European Commission to ensure a smooth adequacy renewal process this year.
Early engagement and reassurance about the retention of adequacy status are of crucial importance, given the looming deadline of June this year. This includes explaining and providing reassurance regarding any planned data protection reforms, particularly concerning the independence of the Information Commissioner’s Office, ministerial powers to add new grounds for data processing—for instance, recognised legitimate interests—and the new provisions relating to automated decision-making.
Despite assurances from the noble Baroness, Lady Jones, that the proposed changes will not dilute data subjects’ rights or threaten EU adequacy, proactive engagement with the EU and robust safeguards are necessary to ensure the continued free flow of data while maintaining high data protection standards. The emphasis on proportionality as a safeguard against the dilution of data subjects’ rights, as echoed by the noble Baroness, Lady Jones, and the ICO, is insufficient. The lack of a clear definition of proportionality within the context of data subjects’ rights could provide loopholes for controllers and undermine the essential equivalence required for data adequacy. The Bill’s reliance on the ICO’s interpretation of proportionality without explicit legislative clarity could be perceived as inadequate by the European Commission, particularly in areas such as web scraping for AI training.
The reassurance that the Government are taking data adequacy seriously and are committing to engaging with the EU needs to be substantiated by concrete actions. The Government do not, it appears, disclose assessments and reports relating to the compatibility of the UK’s domestic data protection framework with the Council of Europe’s Convention 108+, and that raises further concerns about transparency and accountability. Access to this information would enable scrutiny and informed debate, ultimately contributing to building trust and ensuring compatibility with international data protection standards.
In conclusion, while the Government maintain that this Bill would not jeopardise data adequacy, the concerns raised by myself and others during its passage mean that I continue to believe that a comprehensive review of EU data adequacy, as proposed in Amendment 37, is essential to ensure the continued free flow of data, while upholding high data protection standards and maintaining the UK’s position as a trusted partner in international data exchange. I beg to move.
I have added my name to this amendment, about which the noble Lord, Lord Clement-Jones, has spoken so eloquently, because of the importance to our economic growth of maintaining data adequacy with the EU. I have two points to add to what he said.
First, as I observed on several occasions in Committee, this is legislation of unbelievable complexity: it is a bad read, except as a cure for insomnia; it proceeds by amending and re-amending earlier legislation; and this is not the time to go into the detail of the legal problems that arise, some of which we canvassed in Committee, as to whether the legislation has holes in it. I do not think I would be doing any favours either to the position of the United Kingdom or to those who have been patient enough to stay and listen to this part of the debate by going into any of those in detail, particularly those involving the European Convention on Human Rights and the fundamental charter. That is my first point, on the inherent nature of the legislative structure that we have created. As I said earlier, I very much hope we will never have such legislation again.
Secondly, in my experience, there is often a tendency among lawyers steeped in an area or a department to feel, “Well, we know it’s all right; we built it. The legislation’s fine”. There is therefore an additional and important safeguard that I think we should adopt: a fresh pair of eyes—someone outside the department, and outside those who created the legislation—should look at it again to see whether there are any holes in it. We cannot afford to go into this most important assessment of data adequacy without ensuring that our tackle is in order. I appreciate what the Minister said on the last occasion in Committee—that it is for the EU to pick holes in it—but the only prudent course when dealing with anything of this complexity in a legal dispute, or potential dispute, is to ensure that your own tackle is in order and not to enter a debate without being sure of that, allowing the other side to make all the running. We should be on top of this, and that is why I very much support this amendment.
My Lords, I thank the noble Lord, Lord Clement-Jones—as ever—and the noble and learned Lord, Lord Thomas, for tabling Amendment 37 in their names. It would introduce a new clause that would require the Secretary of State to carry out an impact assessment of this Act and other changes to the UK’s domestic and international frameworks relating to data adequacy before the European Union’s reassessment of data adequacy in June this year.
I completely understand the concerns behind tabling this amendment. In the very worst-case scenario, of a complete loss of data adequacy in the assessment by the EU, the effect on many businesses and industries in this country would be knocking at the door of catastrophic. It cannot be allowed to happen.
However, introducing a requirement to assess the impact of the Bill on the European Union data adequacy decision would require us to speculate on EU intentions in a public document, which runs the risk of prompting changes on its part or revealing our hand in ways that we would rather avoid. It is important that we do two things: understand our risk, without necessarily publishing it; and continue to engage at ministerial and official level, as I know we are doing intensively. The approach set out in this amendment therefore risks being counterproductive.
I thank the noble Lord, Lord Clement-Jones, for his amendment, and the noble and learned Lord, Lord Thomas, for his contribution. I agree with them on the value and importance placed on maintaining our data adequacy decisions from the EU this year. That is a priority for the Government, and I reassure those here that we carefully considered all measures in the light of the EU’s review of our adequacy status when designing the Bill.
The Secretary of State wrote to the House of Lords European Affairs Committee on 20 November 2024 on this very point and I would be happy to share this letter with noble Lords if that would be helpful. The letter sets out the importance this Government place on renewal of our EU adequacy decisions and the action we are taking to support this process.
It is important to recognise that the EU undertakes its review of its decisions for the UK in a unilateral, objective and independent way. As the DSIT Secretary of State referenced in his appearance before the Select Committee on 3 December, it is important that we acknowledge the technical nature of the assessments. For that reason, we respect the EU’s discretion about how it manages its adequacy processes. I echo some of the points made by the noble Viscount, Lord Camrose.
That being said, I reassure noble Lords that the UK Government are doing all they can to support a swift renewal of our adequacy status in both technical preparations and active engagement. The Secretary of State met the previous EU Commissioner twice last year to discuss the importance of personal data sharing between the UK and EU. He has also written to the new Commissioner for Justice responsible for the EU’s review and looks forward to meeting Commissioner McGrath soon.
I also reassure noble Lords that DSIT and the Home Office have dedicated teams that have been undertaking preparations ahead of this review, working across government as needed. Those teams are supporting European Commission officials with the technical assessment as required. UK officials have met with the European Commission four times since the introduction of the Bill, with future meetings already in the pipeline.
My Lords, the noble and learned Lord, Lord Thomas, whose intervention I very much appreciated, particularly at this time of the evening, talked about a fresh pair of eyes. What kind of reassurance can the Minister give on that?
It is worth remembering that the ultimate decision rests with the EU Commission, and we are quite keen to have its eyes on this now, which is why we are engaging with it very carefully. The Commission is looking at the legislation as we go through it: we are talking to it, and we have dedicated teams of people brought together specifically to do this. There are several people from outside the direct construct of the Bill who are looking at this to make sure that we have adequacy, and we are having very direct conversations with the EU to ensure that the process is proceeding as we would wish it to.
I thank the Minister for his response. It would be very reassuring if it were our own fresh pair of eyes rather than a pair from across the North Sea. That is all I can say as far as that is concerned. I appreciate what he said—that the Government are taking this seriously. It is a continuing concern, precisely because the chair of the European Affairs Committee wrote to the Government, and a continuing issue for those of us observing the passage of the Bill; we will continue to keep our eyes on it as we go forward. I very much hope that June 2025 passes without incident and that the Minister’s predictions are correct. In the meantime, I beg leave to withdraw the amendment.