Data Protection and Digital Information Bill Debate
Grand Committee

My Lords, once more into the trenches we go before Easter. In moving Amendment 53, I will also speak to Amendments 54, 55, 57, 69, 70, 71 and 72 and the Clause 14 stand part notice.
The Bill contains a number of wide delegated powers giving the Secretary of State the power to amend the UK GDPR via statutory instrument. The Government have said that the UK GDPR’s key elements remain sound and that they want to continue to offer a high level of protection for the public’s data, but that is no guarantee against significant reforms being brought in through a process that eludes the full parliamentary scrutiny afforded by primary legislation. Proposed changes to the UK GDPR should be contained in the Bill, where they can be debated and scrutinised properly through the primary legislation process. As it stands, key provisions of the UK GDPR can subsequently be amended via statutory instrument: an inappropriate legislative process in this case, as it affords much less scrutiny and debate, if debates are held at all.
The UK GDPR treats a solely automated decision as one taken without “meaningful human involvement”. The public are protected from being subject to solely automated decision-making where the decision has a legal or “similarly significant effect”. Clause 14(1) inserts new Article 22D(1) into the UK GDPR, which allows the Secretary of State to make regulations deeming a decision to have involved “meaningful human involvement” even where there was no active review by a human decision-maker. New Article 22D(2) similarly allows the Secretary of State to make regulations determining whether a decision had a “similarly significant effect” to a legal effect. Take, for example, the A-level algorithmic grading scandal of summer 2020. If something like that were to recur, under this new power a Minister could lay regulations stating that the decision to use an algorithm in grading A-levels was not a decision with a “similarly significant effect”.
New Article 22D(4) also allows the Secretary of State to add or remove, via regulations, any of the listed safeguards for automated decision-making. If the Government wish to amend or remove safeguards on automated decision-making, that should also be specified in the Bill and not left to delegated legislation. Amendments 53 to 55 and 69 to 72 would limit the Secretary of State’s power so that new safeguards may be added but those in new Article 22D, as it stands when the legislation comes into force, cannot be varied or removed.
If the clause is to be retained, we support Amendment 59A in the name of the noble Lord, Lord Holmes, which requires the Information Commissioner’s Office to develop guidance on the interpretation of the safeguards in new Article 22C and on important terms such as “similarly significant effect” and “meaningful human involvement”. It is within the ICO’s duties to issue guidance and to harmonise the interpretation of the law. As the dedicated regulator, the ICO is best placed and equipped to publish such guidance and to ensure consistency of application.
As a way to increase protections and incorporate more participation from those affected, Amendment 59A would add a new paragraph (7) to new Article 22D, specifying that the Secretary of State must consult the Information Commissioner’s Office when developing regulations. It also includes an obligation on the Secretary of State to consult data subjects or their representatives, such as trade unions or civil society organisations, at least every two years from the commencement of the Bill.
Our preference is for Clause 14 not to stand part of the Bill. The deployment of automated decision-making (ADM) under Clause 14 risks automating harm, including discrimination, without adequate safeguards. Clause 14 creates a new starting point for all ADM using personal, but not special category, data: such ADM is allowed, including for profiling, provided that certain safeguards are in place. The Minister said those safeguards are “appropriate” and “robust” and provide “certainty”, but I preferred what the noble Lord, Lord Bassam, said about the clause:
“We need more safeguards. We have moved from one clear position to another, which can be described as watering down or shifting the goalposts”.—[Official Report, 25/3/24; col. GC 150.]
That is very much my feeling about the clause as well.
I refer back to the impact assessment, which we touched on during our debate on Clause 9. It is very interesting that table 15 of the impact assessment puts the savings on compliance costs as regards AI and machine learning at something like £7.3 million, which does not seem a very big number compared with the total savings on compliance costs, which the Government have put, rather optimistically, at £295 million.
In passing, I should say that, when I look at the savings regarding subject access requests, I see that the figure is £153 million: half of those so-called savings on compliance costs. I cannot square that at all with what the Minister says about the total savings on compliance costs for subject access requests being 1%. I do not know quite where those figures come from, but it is a far more significant percentage: it is 50% of what the Government believe the savings on compliance costs will be. I know that it is not part of this group, but I would be very grateful if the Minister could write to clarify that issue in due course.
Although the Minister has called these safeguards adequate, we believe that they are inadequate for three reasons. First, they shift the burden to the individual. Secondly, there is no obligation to provide any safeguards before the decision is made: neither the Bill nor any of the material associated with it indicates what the content of this information is expected to be or the timescale within which it is to be given, and there is nothing to say when representations or contest may be heard, when human intervention may be sought or what the level of that intervention should be. Thirdly, the Secretary of State has delegated powers to vary the safeguards by regulations.
Article 22 is currently one of the strongest prohibitions in the GDPR. As we know, the current starting point is that using solely automated decision-making is prohibited unless certain exemptions apply. The exemptions are limited. Now, as a result of the Government’s changes, you can use solely automated decision-making in an employment context in the UK, which you cannot do in the EU. That is a clear watering down of the restriction. The Minister keeps returning to the safeguards, but I have referred to those. We know that they are not being applied in practice even now and that hiring and firing is taking place without any kind of human review.
There is therefore an entirely inadequate basis on which we can be satisfied that the Bill will safeguard individuals from harmful automated decision-making before it is too late. In fact, the effect of the Bill will be the opposite: to permit unfair and unsafe ADM to occur, including discriminatory profiling ADM, which causes harm to individuals. It then places the burden on the individual to complain, without providing any adequate safeguards to guarantee their ability to do so before the harm has already been incurred. While I beg to move Amendment 53, our preference would be for Clause 14 to be deleted from the Bill entirely.
My Lords, I will speak to Amendment 57 in my name, Amendment 59 in the name of the noble Baroness, Lady Jones, and the Clause 14 stand part notice from the noble Lord, Lord Clement-Jones. In doing so, I register my support for Amendment 59A in the name of the noble Lord, Lord Holmes.
The Government assert that there is no diminution of rights in the Bill, yet Clause 14 removes the right not to be subject to an automated decision and replaces that right with inadequate safeguards, as the noble Lord, Lord Clement-Jones, said. On the previous day in Committee, the Minister made the argument that:
“These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles”,—[Official Report, 25/3/24; col. GC 146.]
but I hope he will at least accept that safeguards do not constitute a right. The fact that the Secretary of State has delegated powers to change the safeguards at will undermines his argument that UK citizens have lost nothing at all; they have lost the right not to be subject to an automated decision.
The fact that the Government have left some guard-rails for special category data is in itself an indication that they know they are downgrading UK data rights, because the safeguards in place are not adequate. If they were adequate, it would be unnecessary to separate out special category data in this way. I hammer the point home by asking the Minister to explain how the protections will work in practice in an era of AI, when risks can come from inference and data analytics that do not use special category data but will still have a profound impact on the working lives, health, finances and opportunities of data subjects. If data about your neighbourhood, shopping habits, search results, steps or entertainment choices is used to infer an important decision, how would a data subject activate their rights in that case?
As an illustration of this point, the daughter of a colleague of mine (a colleague who, as it happens, has deep expertise in data law) this year undertook a video-based interview for a Russell group university with no human contact. It was not yet an ADM system, but we are inching ever closer to one. Removing the right, as the Government propose, would place the onus on students to complain or intervene—in a non-vexatious manner, of course. Will the Minister set out how UK citizens will be protected from life-changing decisions after the Government’s changes to Article 22, particularly as, in conjunction with other changes such as those to subject access requests and data protection impact assessments, UK citizens are about to have fewer routes to justice and less transparency over what is happening to their data?
I would also be grateful if the Minister could speak to whether he believes that the granularity and precision of current profiling deployed by AI and machine learning are sufficiently guaranteed to justify taking this fundamental right away. Similarly, I hope that the known concerns about bias and fairness in ADM will be resolved over time, but we are not there yet, so why do the Government have a wait-and-see policy on regulation but not offer the same “wait and see” in relation to data rights?
On Amendment 59 in the name of the noble Baroness, Lady Jones, the number of workers anticipated to be impacted by AI is simply eye-watering. In last Friday’s debate on AI, it was said to be 300 million worldwide, and one in four across Europe. But how workers work with AI is not simply a scary vision of the near future; it is here now.
I have a family member who last year left an otherwise well-paid and socially useful job when his employer introduced surveillance on to his computer while he was working from home. At the time, he said that the way in which it impacted on both his self-esteem and his autonomy was so devastating that he felt like
“a cog in a machine or an Amazon worker with no agency or creativity”.
He was an exemplary employee: top of the bonus list and in all measurable ways the right person in the right job. Efficiency in work has a vital role but it is not the whole picture. We know that, if able and skilled workers lose their will to work, it comes at a considerable cost to the well-being of the nation and the public purse. Most jobs in future will involve working with or even collaborating with technology; ensuring that work is dignified and fair to the human components of this arrangement is not a drag on productivity but a necessity if society is to benefit from changes to technology.
I am processing what the Minister has just said. He said it complements the AI regulation framework, and then he went on to talk about the central risk function, the AI risk register and what the ICO is up to in terms of guidance, but I did not hear that the loosening of safeguards or rights under Clause 14 and Article 22 of the GDPR was heralded in the White Paper or the consultation. Where does that fit with the Government’s AI regulation strategy? There is a disjunct somewhere.
I reject the characterisation of Clause 14, or of any part of the Bill, as loosening the safeguards. The clause focuses on outcomes and, by being less prescriptive and more adaptive, its goal is to heighten the level of safety of AI, whether through privacy or anything else. That is the purpose.
On Secretary of State powers in relation to ADM, the reforms will enable the Government to further describe what is and is not to be taken as a significant effect on a data subject and what is and is not to be taken as meaningful human—
My Lords, I think we should try to let the Minister make a little progress and see whether some of these questions are answered.
I am sorry, but I just do not accept that intervention. This is one of the most important clauses in the whole Bill and we have to spend quite a bit of time teasing it out. The Minister has just electrified us all in what he said about the nature of this clause, what the Government are trying to achieve and how it fits within their strategy, which is even more concerning than previously. I am very sorry, but I really do not believe that this is the right point for the Whip to intervene. I have been in this House for 25 years and have never seen an intervention of that kind.
Let me make the broad point that there is no single list of outcomes for the whole Bill but, as we go through clause by clause, I hope the philosophy behind it, of being less prescriptive about process and more prescriptive about the results of the process that we desire, should emerge—not just on Clause 14 but as the overall philosophy underlying the Bill. Regulation-making powers can also be used to vary the existing safeguards, add additional safeguards and remove additional safeguards added at a later date.
On the point about having regard, it is important that the law is drafted in a way that allows it to adapt as technology advances. Including prescriptive requirements in the legislation reduces this flexibility and undermines the purpose of this clause and these powers, which is to provide additional legal clarity when deemed necessary and appropriate in the light of fast-moving advances in, and adoption of, technologies relevant to automated decision-making. I would like to reassure noble Lords that the powers can be used only to vary the existing safeguards, add additional ones and remove safeguards that have been added later by regulations. They cannot be used to remove any of the safeguards written into the legislation itself.
Amendments 53 to 55 and 69 to 71 concern the Secretary of State’s powers relating to the terms “significant decisions” and “meaningful human involvement”. These powers enable the Secretary of State to provide a description of decisions that do or do not have a significant effect on data subjects, and to describe cases that can be taken to have, or not to have, meaningful human involvement. As technology adoption grows and new technologies emerge, these powers will enable the Government to provide legal clarity, if and when deemed necessary, to ensure that people are protected and have access to safeguards when they matter most. In respect of Amendment 59A, Clause 50 already provides an overarching requirement for the Secretary of State to consult the ICO, and other persons the Secretary of State considers appropriate, before making regulations under the UK GDPR, including for the measures within Article 22.
Also, as has been observed—I take the point about the limitations of this, but I would like to make the point anyway—any changes to the regulations are subject to the affirmative procedure and so must be approved by both Houses. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, I would ask the noble Lord, Lord Clement-Jones, and my noble friend Lord Holmes—were he here—not to press their amendments.
Amendment 57 in the name of the noble Baroness, Lady Kidron, seeks to ensure that, when exercising regulation-making powers in relation to the safeguards in Article 22 of the UK GDPR, the Secretary of State should uphold the level of protection that children are entitled to in the Data Protection Act 2018. As I have said before, Clause 50 requires the Secretary of State to consult the ICO and other persons he or she considers appropriate. The digital landscape and its technologies evolve rapidly, presenting new challenges in safeguarding children. Regular consultations with the ICO and stakeholders ensure that regulations remain relevant and responsive to emerging risks associated with solely automated decision-making. The ICO has a robust position on the protection of children, as evidenced through its guidance and, in particular, the age-appropriate design code. As such, I ask the noble Baroness not to press her amendment.
Amendments 58, 72 and 73 seek to prevent the Secretary of State varying any of the safeguards mentioned in the reformed clauses. As I assured noble Lords earlier, the powers in this provision can be used only to vary the existing safeguards, add additional safeguards and remove additional safeguards added by regulation in future; there is not a power to remove any of the safeguards.
My Lords, I feel less reassured after this debate than I did even at the end of our two groups on Monday. I thank all those who spoke in this debate. There is quite a large number of amendments in this group, but a lot of them go in the same direction. I was very taken by what the noble Baroness, Lady Kidron, said: if the Government are offering safeguards and not rights, that is extremely worrying. I also very much take on board what the noble Baroness, Lady Harding, had to say. Yes, of course we are in favour of automated decision-making, as it will make a big difference to our public services and to quite a lot of private businesses, but we have to create the right ground rules around it. That is what we are talking about. We all very much share the concern that children should have a higher bar of protection. The noble Baroness, Lady Jones, outlined exactly why the Secretary of State’s powers either should not be there or should not be expressed in the way that they are. I very much hope that the Minister will write on that subject.
More broadly, there are huge issues here. I think that it was the noble Baroness, Lady Kidron, who first raised the fact that the Government seem to be regulating in a specific area relating to AI in a way that reduces rights. The Minister talks about now regulating outcomes, not process. As the noble Baroness, Lady Jones, said, we do not have any criteria—what KPIs are involved? The process is important: the ethics by which decisions are made and the transparency involved. I cannot see that it is simply about whether the outcome is such and such; it is about the way in which people make decisions. I know that people like talking about outcome-based regulation, but outcomes are certainly not the only important aspect of regulation.
On the issue of removing prescriptiveness, I am in favour of ethical prescriptiveness, so I cannot see that the Minister has made a particularly good case for the changes made by Clause 14. He talked about having access to safeguards when they matter most. It would be far preferable to have rights that can be exercised in the face of automated decision-making, in particular workplace protections. At various points during the debates on the Bill we have touched on things such as algorithmic impact assessments in the workplace, and no doubt we will touch on them further. That is of great and growing importance, but again there is no recognition of it.
I am afraid that the Minister has not made a fantastic case for keeping Clause 14 and I think that most of us will want to kick the tyres and carry on interrogating whether it should be part of the Bill. In the meantime, I beg leave to withdraw Amendment 53.
My Lords, the Central Digital and Data Office, or CDDO, and the Centre for Data Ethics and Innovation, as it was then called—it now has a new name as a unit of DSIT—launched the Algorithmic Transparency Recording Standard, or ATRS, in November 2021. The idea for the ATRS arose from a recommendation by the CDEI that the UK Government should place a mandatory transparency obligation on public sector organisations using algorithms to support “significant decisions affecting individuals”. It is intended to help public sector organisations provide clear information about the algorithmic tools that they use, how they operate and why they are using them.
The ATRS is a promising initiative that could go some way to addressing the current transparency deficit around the use of algorithmic and AI tools by public authorities. Organisations are encouraged to submit reports about each algorithmic tool that they are using that falls within the scope of the standard.
We welcome the recent commitment made in the Government’s response to the AI regulation White Paper consultation to make the ATRS a requirement for all government departments. However, we believe that this is an opportunity to deliver on that commitment through the DPDI Bill by placing the ATRS on a statutory footing rather than leaving it as a requirement in guidance. That is what Amendment 74 is designed to do.
We also propose another new clause to reflect the Government’s commitment to algorithmic transparency. It would require the Secretary of State to introduce a compulsory transparency reporting requirement, but only when she or he considers it appropriate to do so. It is a slight watering-down of Amendment 74, but it is designed to tempt the Minister into further indiscretions. In support of transparency, the new clause would also require the Secretary of State, for as long as she or he considers it inappropriate to make the ATRS compulsory, regularly to explain why and to keep that decision under continual review.
Amendment 76, on safe and responsible automated decision systems, proposes a new clause that seeks to shift the burden back onto public sector actors. It puts the onus on them to ensure safety and prevent harm, rather than waiting for harm to occur and placing the burden on individuals to challenge it. It imposes a proactive statutory duty, similar to the public sector equality duty under Section 149 of the Equality Act 2010, to have “due regard” to ensuring that
“automated decision systems … are responsible and minimise harm to individuals and society at large”.
The duty incorporates the key principles in the Government’s AI White Paper and therefore is consistent with its substantive approach. It also includes duties to be proportionate, to give effect to individuals’ human rights and freedoms and to safeguard democracy and the rule of law. It applies to all “automated decision systems”. These are
“any tool, model, software, system, process, function, program, method and/or formula designed with or using computation to automate, analyse, aid, augment, and/or replace human decisions that impact the welfare, rights and freedoms of individuals”.
This therefore applies to partly automated decisions, as well as those that are entirely automated, and systems in which multiple automated decision processes take place.
It applies to traditional public sector actors: public authorities, or those exercising public functions, including private actors outsourced by the Government to do so; those that may exercise control over automated decision systems, including regulators; as well as those using data collected or held by a public authority, which may be public or private actors. It then provides one mandatory mechanism through which compliance with the duty must be achieved—impact assessments. We had a small debate about the ATRS and whether a compliance system was in place. It would be useful to see whether the Minister has any further comment on that, but I think that he disagreed with my characterisation that there is no compliance system currently.
This provision proposes impact assessments. The term used, “algorithmic impact assessment”, is adopted from Canada’s analogous directive on automated decision-making, which mandates the use of AIAs for all public sector automated decision systems. The obligation is on the Secretary of State, via regulations, to set out a framework for AIAs, which would help actors to uphold their duty to ensure that automated decision systems are responsible and safe; to understand and to reduce the risks in a proactive and ongoing way; to introduce the appropriate governance, oversight, reporting and auditing requirements; and to communicate in a transparent and accessible way to affected individuals and the wider public.
Amendment 252 would require a list of UK addresses to be made freely available for reuse. Addresses have been identified as a fundamental geospatial dataset by the UN and a high-value dataset by the EU. Address data is used by tens of thousands of UK businesses, including for delivery services and navigation software. Crucially, address data can join together different property-related data, such as energy performance certificates or Land Registry records, without using personal information. This increases the value of other high-value public data.
I feel under amazing pressure to get the names right, especially given the number of hours we spend together.
I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for tabling Amendments 74 to 78, 144 and 252 in this group. I also extend my thanks to noble Lords who have signed the amendments and spoken so eloquently in this debate.
Amendments 74 to 78 would place a legislative obligation on public authorities and all persons in the exercise of a public function to publish reports under the Algorithmic Transparency Recording Standard—ATRS—or to publish algorithmic impact assessments. These would provide information on algorithmic tools and algorithm-assisted decisions that process personal data in the exercise of a public function or those that have a direct or indirect public effect or directly interact with the general public. I remind noble Lords that the UK’s data protection laws will continue to apply throughout the processing of personal data.
The Government are already taking action to establish the necessary guard-rails for AI, including to promote transparency. In the AI regulation White Paper response, we announced that the use of the ATRS will now become a requirement for all government departments and the broader public sector. The Government are phasing this in as we speak and will check compliance accordingly, as DSIT has been in contact with every department on this issue.
In making this policy, the Government are taking an approach that provides increasing degrees of mandation of the ATRS, with appropriate exemptions, allowing them to monitor compliance and effectiveness. The announcement in the White Paper response has already led to more engagement from across government, and more records are under way. The existing process focuses on the importance of continuous improvement and development. Enshrining the standard into law prematurely, amid exponential technological change, could hinder its adaptability.
More broadly, our AI White Paper outlined a proportionate and adaptable framework for regulating AI. As part of that, we expect AI development and use to be fair, transparent and secure. We set out five key principles for UK regulators to interpret and apply within their remits. This approach reflects the fact that AI systems are not unregulated and need to be compliant with existing regulatory frameworks, including employment, human rights, health and safety and data protection law.
For instance, the UK’s data protection legislation imposes obligations on data controllers, including providers and users of AI systems, to process personal data fairly, lawfully and transparently. Our reforms in this Bill will ensure that, where solely automated decision-making is undertaken—that is, ADM without any meaningful human involvement that has significant effects on data subjects—data subjects will have a right to the relevant safeguards. These safeguards include being provided with information on the ADM that has been carried out and the right to contest those decisions and seek human review, enabling controllers to take suitable measures to correct those that have produced wrongful outcomes.
My Lords, I wonder whether the Minister can comment on this; he can write if he needs to. Is he saying that, in effect, the ATRS gives the citizen greater rights than are ordinarily available under Article 22? Is that the actual outcome? If, for instance, every government department adopted the ATRS, would that, in practice, give citizens a greater degree of what he might term safeguards but is, in this context, describing as rights?
I am very happy to write to the noble Lord, but I do not believe that the existence of an ATRS-generated report in and of itself confers more rights on anybody. Rather, it makes it easier for citizens to understand what rights they have, how those rights can be exercised and what data about them is being used by the department concerned. The existence of data does not in and of itself confer new rights on anybody.
I understand that, but if he rewinds the reel he will find that he was talking about the citizen’s right of access, or something of that sort, at that point. Once you know what data is being used, the citizen has certain rights. I do not know whether that follows from the ATRS or whether he was just describing the position at large.
As I said, I will write. I do not believe that follows axiomatically from the ATRS’s existence.
On Amendment 144, the Government are sympathetic to the idea that the ICO should respond to new and emerging technologies, including the use of children’s data in the development of AI. I assure noble Lords that this area will continue to be a focus of the ICO’s work and that it already has extensive powers to provide additional guidance or make updates to the age-appropriate design code, to ensure that it reflects new developments, and a responsibility to keep it up to date. The ICO has a public task under Article 57(1)(b) of the UK GDPR to
“promote public awareness and understanding of the risks, rules, safeguards and rights in relation to processing”.
It is already explicit that:
“Activities addressed specifically to children shall receive specific attention”.
That code already includes a chapter on profiling and provides guidance on fairness and transparency requirements around automated decision-making.
Taking the specific point made by the noble Baroness, Lady Kidron, on the contents of the ICO’s guidance, while I cannot speak to the ICO’s decisions about the drafting of its guidance, I am content to undertake to speak to it about this issue. I note that it is important to be careful to avoid a requirement for the ICO to duplicate work. The creation of an additional children’s code focused on AI could risk fragmenting approaches to children’s protections in the existing AADC—a point made by the noble Baroness and by my noble friend Lady Harding.
We have some numbers that I will come to, but I am very happy to share deeper analysis of that with all noble Lords.
There is also free access to this data for developers to innovate in the market. The Government also make this data available for free at the point of use to more than 6,000 public sector organisations, as well as postcode, unique identifier and location data available under open terms. The Government explored opening address data in 2016. At that time, it became clear that the Government would have to pay to make this data available openly or to recreate it. That was previously attempted, and the resulting dataset had, I am afraid, critical quality issues. As such, it was determined at that time that the changes would result in significant additional cost to taxpayers and represent low value for money, given the current widespread accessibility of the data. For the reasons I have set out, I hope that the noble Lords will withdraw their amendments.
My Lords, I thank the Minister for his response. There are a number of different elements to this group.
The one bright spot in the White Paper consultation is the ATRS. That was what the initial amendments in this group were designed to give a fair wind to. As the noble Lord, Lord Bassam, said, this is designed to assist in the adoption of the ATRS, and I am grateful for his support on that.
My Lords, I will speak to almost all the amendments in this group, other than those proposed by the noble Baroness, Lady Kidron. I am afraid that this is a huge group; we probably should have split it to have a better debate, but that is history.
I very much support what the noble Baroness said about her amendments, particularly Amendment 79. The mandation of ethics by design is absolutely crucial, and there are standards from organisations such as the IEEE for exactly that kind of ethics by design in AI systems. I believe that it is possible to do what she suggested, and we should incorporate it into the Bill. It illustrates that process is as important as outcomes. We are getting to a kind of philosophical question here, one that shows the difference between how some of us and the Government approach these things. How you do something, the way you design it and the fact that it needs to be ethical are absolutely cardinal in any discussion, particularly one about artificial intelligence. I do not think it good enough simply to talk about the results of what AI does without examining how it does it.
Having said that, I turn to Amendment 80 and the Clause 16 stand part notice. Under Clause 16, the Government are proposing to remove Article 27 of the UK GDPR without any replacement. By removing the legal requirement on non-UK companies to retain a UK representative, the Government would deprive individuals of a local, accessible point of contact through which people can make data protection rights requests. That decision threatens people’s capacity to exercise their rights, reducing their ability to remain in control of their personal information.
The Government say that removing Article 27 will boost trade with the UK by reducing the compliance burden on non-UK businesses. But they have produced little evidence to support the notion that this will be the case and have overlooked the benefits in operational efficiency and cost savings that the representative can bring to non-UK companies. Even more worryingly, the Government appear to have made no assessment of the impact of the change on UK individuals, in particular vulnerable groups such as children. It is an ill-considered policy decision that would see the UK take a backward step in regulation at a time when numerous other jurisdictions, such as Switzerland, Turkey, South Korea, China and Thailand, are choosing to safeguard the extraterritorial application of their data protection regimes through the implementation of the legal requirement to appoint a representative.
The UK representative ensures that anyone in the UK wishing to make a privacy-related request has a local, accessible point of contact through which to do so. The representative plays a critical role in helping people to access non-UK companies and hold them accountable for the processing of their data. The representative further provides a direct link between the ICO and non-UK companies to enable the ICO to enforce the UK data protection regime against organisations outside the UK.
On the trade issue, the Government argue that, by eliminating the cost of retaining a UK representative, non-UK companies will be more inclined to offer goods and services to individuals in the UK. Although there is undeniably a cost to non-UK companies of retaining a representative, it is significantly lower than the disproportionately inflated figures cited in the original impact assessment, which in some cases were up to 10 times the average market rate for representative services. The Government have put forward very little evidence to support the notion that removing Article 27 will boost trade with the UK.
There is an alternative approach. Currently, the Article 27 requirement to appoint a UK representative applies to data controllers and processors. An alternative approach to the removal of Article 27 in its entirety would be to retain the requirement but limit its scope so that it applies only to controllers. Along with the existing exemption at Article 27(2), this would reduce the number of non-UK companies required to appoint a representative, while arguably still preserving a local point of contact through which individuals in the UK can exercise their rights, as it is data controllers that are obliged under Articles 15 to 22 of the UK GDPR to respond to data subject access requests. That is a middle way that the Government could adopt.
Moving to Amendment 82, at present, the roles of senior responsible individual in the Bill and data protection officer under the EU GDPR appear to be incompatible. That is because the SRI is part of the organisation’s senior management, whereas a DPO must be independent of an organisation’s senior management. This puts organisations caught by both the EU GDPR and the UK GDPR in an impossible situation. At the very least, the Government must explain how they consider that these organisations can comply with both regimes in respect of the SRI and DPO provisions.
The idea of getting rid of the DPO runs completely contrary to the way in which we need to think about accountability for AI systems. We need senior management who understand the corporate significance of the AI systems they are adopting within the business. The ideal way forward would be for the DPO to be responsible for that when AI regulation comes in, but the Government seem to be completely oblivious to that. Again, it is highly frustrating for those of us who thought we had a pretty decent data protection regime to find this kind of watering down taking place in the face of the risks from artificial intelligence that are becoming more and more apparent as the days go by. I firmly believe that it will inhibit the application and adoption of AI within businesses if we do not have public trust and business certainty.
I now come to oppose the question that Clause 18, on the duty to keep records, stand part of the Bill. This clause masquerades as an attempt to get rid of red tape. In reality, it makes organisations less likely to be compliant with the main obligations in the UK GDPR, as it will be amended by the Bill, and therefore heightens the risk both to the data subjects whose data they hold and to the organisations themselves in terms of non-compliance. It is particularly unfair on small businesses that do not have the resources to take advice on these matters. Records of processing activities are one of the main ways in which organisations can meet the requirements of Article 5(2) of the UK GDPR to demonstrate their compliance. The obligation to demonstrate compliance remains unaltered under the Bill. Therefore, dispensing with the main way of achieving compliance with Article 5(2) is impractical and unhelpful.
At this point, I should say that we support Amendment 81 in the name of the noble Baroness, Lady Jones, which concerns the assessment of high-risk processing.
Our amendments on data protection impact assessments are Amendments 87, 88 and 89. Such assessments are currently required under Article 35 of the UK GDPR and are essential to ensuring that organisations do not deploy, and individuals are not subjected to, systems that may lead to unlawful, rights-violating or discriminatory outcomes. The Government’s data consultation response noted:
“The majority of respondents agreed that data protection impact assessments requirements are helpful in identifying and mitigating risk, and disagreed with the proposal to remove the requirement to undertake data protection impact assessments”.
However, under Clause 20, the requirement to perform an impact assessment would be seriously diluted. That is all I need to say. The Government frequently pray in aid the consultation—they say, “Well, we did that because of the consultation”—so why are they flying in the face of it here? That seems an extraordinary thing to do when impact assessments are regarded as a useful tool and business has clearly adjusted to them in the years since the Data Protection Act 2018.
My Lords, I rise to speak in support of Amendments 79, 83, 85, 86, 93, 96, 97, 105 and 107, to which I have added my name. An awful lot has already been said. Given the hour of the day, I will try to be brief, but I want to speak to the child amendments to which I have put my name, as well as to the non-child ones, and to raise things up a level.
The noble Lord, Lord Clement-Jones, talked about trust. I have spent the best part of the past 15 years running digitally enabled consumer and citizen services. The benefit that technology brings to life is clear to me but—this is a really important “but”—our customers and citizens need to trust what we do with their data, so establishing trust is really important.
One of the bedrocks of that trust is forcing—as a non-technologist, I use that word advisedly—technologists to set out what they are trying to do, what the technology they propose to build will do and what the risks and opportunities of that technology are. My experience as a non-engineer is that, when you put engineers under pressure, they can speak English, but it is not their preferred language. They do not find it easy to articulate the risks and opportunities of the technology they are building, which is why forcing businesses that build these services to set out in advance the data protection impacts of the services they are building is so important. It is also why you have to design with safety in mind up front, because technology is so hard to retrofit. If you do not design it up front with ethics and safety at its core, that opportunity is gone by the time you see the impact in the real world.
I thank the noble Baronesses, Lady Kidron and Lady Jones, and the noble Lord, Lord Clement-Jones, for their amendments, and I look forward to receiving the letter from the noble Baroness, Lady Kidron, which I will respond to as quickly as I can. As everybody observed, this is a huge group, and it has been very difficult for everybody to do justice to all the points. I shall do my best, but these are points that go to the heart of the changes we are making. I am very happy to continue engaging on that basis, because we need plenty of time to review them—but, that said, off we go.
The changes the Government are making to the accountability obligations are intended to make the law clearer and less prescriptive. They will enable organisations to focus on areas that pose high risks to people, resulting, the Government believe, in improved outcomes. The new provisions on assessments of high-risk processing are less prescriptive about the precise circumstances in which a risk assessment would be required, as we think organisations are best placed to judge whether a particular activity poses a high risk to individuals in the context of the situation.
However, the Government are still committed to high standards of data protection, and there are many similarities between our new risk assessment measures and the previous provisions. When an organisation is carrying out processing activities that are likely to pose a high risk to individuals, it will still be expected to document that processing, assess risks and identify mitigations. As before, no such document would be required where organisations are carrying out low-risk processing activities.
One of the main aims of the Bill is to remove some of the UK GDPR’s unnecessary compliance burdens. That is why organisations will be required to designate senior responsible individuals, keep records of processing and carry out the risk assessments above only when their activities pose high risks to individuals.
The noble Viscount is very interestingly unpacking a risk-based approach to data protection under the Bill. Why are the Government not taking a risk-based approach to their AI regulation? After all, the AI Act approaches it in exactly that way.
I will briefly address it now. Based on that letter, the Government’s preference is to avoid prescription, and I believe that the ICO’s view—I cannot speak for it—is generally the same, except for a few examples where prescription needs to be specified in the Bill. I will continue to engage with the ICO on where exactly to draw that line.
My Lords, I can see that there is a difference of opinion, but it is unusual for a regulator to go into print with it. Not only that, but he has set it all out in an annexe. What discussion is taking place directly between the Minister and his team and the ICO? There seems to be quite a gulf between them. This is number 1 among his “areas of ongoing concern”.
I do not know whether it is usual or unusual for the regulator to engage in this way, but the Bill team engages with the Information Commissioner frequently and regularly, and, needless to say, it will continue to do so on this and other matters.
Children need particular protection when organisations are collecting and processing their personal data, because they may be less aware of the risks involved. If organisations process children’s personal data, they should think about the need to protect them from the outset and design their systems and processes with this in mind.
Before I turn to the substance of what the Bill does with the provisions on high-risk processing, I will deal with the first amendment in this group: Amendment 79. It would require data processors to consider data protection-by-design requirements in the same way that data controllers do, because there is a concern that controllers may not always be able to foresee what processors do with people’s data for services such as AI and cloud computing.
However, under the current legislation, it should not be for the processor to determine the nature or purposes of the processing activity, as it will enter a binding controller-processor agreement or contract to deliver a specific task. Processors also have specific duties under the UK GDPR to keep personal data safe and secure, which should mean that this amendment is not necessary.
I turn to the Clause 16 stand part notice, which seeks to remove Clause 16 from the Bill and reinstate Article 27, and Amendment 80, which seeks to do the same but just in respect of overseas data controllers, not processors. I assure the noble Lord, Lord Clement-Jones, that, even without the Article 27 representative requirement, controllers and processors will still have to maintain contact and co-operation with UK data subjects and the ICO to comply with the UK GDPR provisions. These include Articles 12 to 14, which, taken together, require controllers to provide their contact details in a concise, transparent, intelligible and easily accessible form, using clear and plain language, particularly for any information addressed specifically to a child.
By offering firms a choice on whether to appoint a representative in the UK to help them with UK GDPR compliance, and by no longer mandating organisations to appoint one, we are allowing organisations to decide for themselves the best way to comply with the existing requirements for effective communication and co-operation. Removing the representative requirement will also reduce unnecessary burdens on non-UK controllers and processors while maintaining data subjects’ safeguards and rights. Any costs associated with appointing a representative are a burden on, and a barrier to, trade. Although the packages made available by representative provider organisations vary, our assessments show that the cost of appointing a representative increases with the size of a firm. Furthermore, several jurisdictions do not have a mandatory or equivalent representative requirement in their data protection law, including other countries in receipt of EU data adequacy decisions.
Nevertheless, does the Minister accept that quite a lot of countries have now begun the process of requiring representatives to be appointed? How does he account for that? Does he accept that what the Government are doing is placing the interests of business over those of data subjects in this context?
No, I do not accept that at all. I would suggest that we are saying to businesses, “You must provide access to the ICO and data subjects in a way that is usable by all parties, but you must do so in the manner that makes the most sense to you”. That is a good example of going after outcomes but not insisting on any particular process or methodology in a one-size-fits-all way.
Yes—if the person they were supposed to communicate with did not speak English or was not available during reasonable hours, that would be in violation of the requirement.
I apologise if we briefly revisit some of our earlier discussion here, but Amendment 81 would reintroduce a list of high-risk processing activities drawn from Article 35 of the UK GDPR, with a view to helping data controllers comply with the new requirements around designating a senior responsible individual.
The Government have consulted closely with the ICO throughout the development of all the provisions in the Bill, and we welcome its feedback as it upholds data subjects’ rights. We recognise and respect that the ICO’s view on this issue is different to the Government’s, but the Government feel that adding a prescriptive list to the legislation would not be appropriate for the reasons we have discussed. However, as I say, we will continue to engage with it over the course of the passage of the Bill.
Some of the language in Article 35 of the UK GDPR is unclear and confusing, which is partly why we removed it in the first place. We believe organisations should have the ability to make a judgment of risk based on the specific nature, scale and context of their own processing activities. We do not need to provide prescriptive examples of high-risk processing on the face of legislation because any list could quickly become out of date. Instead, to help data controllers, Clause 20 requires the ICO to produce a document with examples of what the commissioner considers to be high-risk processing activities.
I turn to Clause 17 and Amendment 82. The changes we are making in the Bill will reduce prescription by removing the requirement to appoint a data protection officer in certain circumstances. Instead, public bodies and other organisations carrying out high-risk processing activities will have to designate a senior responsible individual to ensure that data protection risks are managed effectively within their organisations. That person will have flexibility about how they manage data protection risks. They might decide to delegate tasks to independent data protection experts or upskill existing staff members, but they will not be forced to appoint data protection officers if suitable alternatives are available.
The primary rationale for moving to a senior responsible individual model is to embed data protection at the heart of an organisation by ensuring that someone in senior management takes responsibility and accountability for it if the organisation is a public body or is carrying out high-risk processing. If organisations have already appointed data protection officers and want to keep an independent expert to advise them, they will be free to do so, providing that they also designate a senior manager to take overall accountability and provide sufficient support, including resources.
Amendment 83, tabled by the noble Baroness, Lady Kidron, would require the senior responsible individual to specifically consider the risks to children when advising the controller on its responsibilities. As drafted, Clause 17 of the Bill requires the senior responsible individual to perform a number of tasks or, if they cannot do so themselves, to make sure that they are performed by another person. They include monitoring the controller’s compliance with the legislation, advising the controller of its obligations and organising relevant training for employees who carry out the processing of personal data. Where the organisation is processing children’s data, all these requirements will be relevant. The senior responsible individual will need to make sure that any guidance and training reflects the type of data being processed and any specific obligations the controller has in respect of that data. I hope that this goes some way to convincing the noble Baroness not to press her amendment.
The Minister has not really explained the reason for the switch from the DPO to the new system. Is it another one of his “We don’t want a one-size-fits-all approach” arguments? What is the underlying rationale for it? Looking at compliance costs, which the Government seem to be very keen on, we will potentially have a whole new cadre of people who will need to be trained in compliance requirements.
The data protection officer—I speak as a recovering data protection officer—is tasked with certain specific outcomes but does not necessarily have to be a senior person within the organisation. Indeed, in many cases, they can be an external adviser to the organisation. On the other hand, the senior responsible individual is a senior or board-level representative within the organisation and can take overall accountability for data privacy and data protection for that organisation. Once that accountable person is appointed, he or she can of course appoint a DPO or equivalent role or separate the role among other people as they see fit. That gives everybody the flexibility to meet the needs of privacy as they see fit, but not necessarily in a one-size-fits-all way. That is the philosophical approach.
Does the Minister accept that the SRI will have to cope with having at least a glimmering of an understanding of what will be a rather large Act?
Yes, the SRI will absolutely have to understand all the organisation’s obligations under this Act and indeed other Acts. As with any senior person in any organisation responsible for compliance, they will need to understand the laws that they are complying with.
Amendment 84, tabled by the noble Lord, Lord Clement-Jones, is about the advice given to senior responsible individuals by the ICO. We believe that the commissioner should have full discretion to enforce data protection in an independent, flexible, risk-based and proportionate manner. The amendment would tie the hands of the regulator and force them to give binding advice and proactive assurance without full knowledge of the facts, undermining their regulatory enforcement role.
As long as that applies to us on occasion as well.
My Lords, just in passing, I will say that the decision made by the Privileges Committee, and now by the House, is beginning to creak in the very first Grand Committee to encounter it. So, in terms of time limits, I think that flexibility in Grand Committee in particular is absolutely crucial. I am afraid that the current procedures will not necessarily stand the test of time—but we shall see.
This is a relatively short debate on whether Clause 19 should stand part, but it is a really significant clause, and it is another non-trust-engendering provision. This basically takes away the duty of the police to provide justification for why they are consulting or sharing personal data. Prompted by the National AIDS Trust, we believe that the Bill must retain the duty on police forces to justify why they have accessed an individual’s personal data.
This clause removes an important check on police processing of an individual’s personal data. The NAT has been involved in cases of people living with HIV whose HIV status was shared without their consent by police officers, both internally within their police station and within the wider communities that they serve. The requirement that police officers justify why they have accessed an individual’s personal data therefore provides vital evidence in cases of police misconduct, such as when a person’s HIV status is shared inappropriately by the police or when it is not relevant to an investigation of criminal activity.
The noble Baroness, Lady Kidron, was extremely eloquent in her winding up of the last group. The Minister really needs to come back and tell us what on earth the motivation is behind this particular Clause 19. I beg to move that this clause should not stand part of the Bill.
This is a mercifully short group on this occasion. I thank the noble Lord, Lord Clement-Jones, for the amendment, which seeks to remove Clause 19 from the Bill. Section 62 of the Data Protection Act 2018 requires law enforcement agencies to record when personal data has been accessed and why. Clause 19 does not remove the need for police to justify their processing; it simply removes the ineffective administrative requirement to record that justification in a log.
The justification entry was intended to help to monitor and detect unlawful access. However, the reality is that anyone accessing data unlawfully is very unlikely to record an honest justification, making this in practice an unreliable means of monitoring misconduct or unlawful processing. Records of when data was accessed and by whom can be automatically captured and will remain, thereby continuing to ensure accountability.
In addition, the National Police Chiefs’ Council’s view is that this change will not hamper any investigations to identify the unlawful processing of data. That is because it is unlikely that an individual accessing data unlawfully would enter an honest justification, so capturing this information is unlikely to be useful in any investigation into misconduct. The requirements to record the time, date and, as far as possible, the identity of the person accessing the data will remain, as will the obligation that there is lawful reason for the access, ensuring that accountability and protection for data subjects is maintained.
Police officers inform us that the current requirement places an unnecessary burden on them as they have to update the log manually. The Government estimate that the clause could save approximately 1.5 million policing hours, representing a saving in the region of £46.5 million per year.
I understand that the amendment relates to representations made by the National AIDS Trust concerning the level of protection for people’s HIV status. As I believe I said on Monday, the Government agree that the protection of people’s HIV status is vital. We have met the National AIDS Trust to discuss the best solutions to the problems it has raised. For these reasons, I hope the noble Lord will not oppose Clause 19 standing part.
I thank the Minister for his response, but he has left us tantalised about the outcome of his meeting. What is the solution that he has suggested? We are none the wiser as a result of his response.
This pudding has been well over-egged by the National Police Chiefs’ Council. Already, only certain senior officers and the data protection leads in police forces have access to this functionality. There will continue to be a legal requirement to record the time and date of access. They are required to follow a College of Policing code of practice. Is the Minister really saying that recording a justification for accessing personal data is such an onerous requirement that £46.5 million in police time will be saved as a result of this? Over what period? That sounds completely disproportionate.
The fact is that the recording of the justification, even where it is false and cannot be relied on at face value, is rather useful, because it is itself evidence of police misconduct in relation to inappropriately accessing personal data: officers are in effect recorded as saying, “We did it for this purpose”, when it clearly was not for that purpose. I am not at all surprised that the National AIDS Trust is worried about this. The College of Policing code of practice does not mention logging requirements in detail; it references them just once, in relation to automated systems that process data.
I am extremely grateful to the noble Lord, Lord Bassam, for what he had to say. It seems to me that we do not have any confidence on this side of the House that removing this requirement provides enough security that officers will be held to account if they share an individual’s special category data inappropriately. I do not think the Minister has really answered the concerns, but I beg leave to withdraw my objection to the clause standing part.
My Lords, given the hour, I will be brief. That was an absolute tour de force by the noble Baroness; as I do with the Minister’s speeches, I will read hers over Easter.
I was very interested to be reminded of the history of Napster, because that was when many of us realised that, in the creative industries and beyond, we were entering the digital age. The amendments that the noble Baroness put forward are examples of where the Bill could make a positive impact, unlike so much of the rest of it, which waters down rights. She described cogently how large language models are ingesting or scraping data from the internet, social media and journalism, how close this whole agenda comes to the ingestion of copyright material, and how it is being done in particular by anonymous bots. It fits very well with the debate in which the Minister was involved last Friday on the Private Member’s Bill of the noble Lord, Lord Holmes, which includes a clause requiring transparency about the ingestion or scraping of data and copyright material by large language models. It is very interesting.
The opportunity in the data area is currently much greater than in the intellectual property area: at least we have the ICO, which is a regulator, unlike the IPO, which is not really a regulator with teeth. I am very interested in the fact that the ICO is conducting a consultation on generative AI and data protection, which it launched in January. Concurrently with this Bill, perhaps the ICO might come to some conclusions that we can use. That would of course include the whole area of biometrics which, in the light of things such as deepfakes, is increasingly an issue of great concern. The watchword is “transparency”: we must impose a duty on generative AI models to be transparent about the material that they use to train their models and then use in operation. I fully support Amendments 103 and 104 in the name of the noble Baroness, even though, as she describes them, they are a small step.
My Lords, I, too, will be relatively brief. I thank the noble Baroness, Lady Kidron, for her amendments, to which I was very pleased to add my name. She raised an important point about the practice of web scrapers, who take data from a variety of sources to construct large language models without the knowledge or permission of web owners and data subjects. This is a huge issue that should have been a much more central focus of the Bill. Like the noble Baroness, I am sorry that the Government did not see fit to use the Bill to bring in some controls on this increasingly prevalent practice, because that would have been a more constructive use of our time than debating the many unnecessary changes that we have been debating so far.
As the noble Baroness said, large language models are built on capturing text, data and images from innumerable sources without the permission of the original creator of the material. As she also said, this practice makes a mockery of our existing data rights. It raises issues around copyright and intellectual property, and around personal information that is provided for one purpose and commandeered by web scrapers for another. That process often happens in the shadows, and the owner of the information finds out only much later that their content has been repurposed.
What is worse is that the application of AI means that material provided in good faith can be distorted or corrupted by the bots scraping the internet. The current generation of LLMs is notorious for hallucinations, in which good-quality research or journalistic copy is misrepresented or misquoted in its new incarnation. There are also numerous examples of bias creeping into LLM output, which can include personal data. As the noble Baroness rightly said, the casual scraping of children’s images and data is undermining the very essence of our existing data protection legislation.
It is welcome that the Information Commissioner has intervened on this. He argued that LLMs should be compliant with the Data Protection Act and should evidence how they are complying with their legal obligations, including by enabling individuals to exercise their information rights. Currently, we are a long way from that being a reality in practice. This is about enforcement as much as it is about guidance.
I am pleased that the noble Baroness tabled these amendments. They raise important issues about individuals giving prior permission for their data to be used unless there is an easily accessible opt-out mechanism. I would like to know what the Minister thinks about all this. Does he think that the current legislation is sufficient to regulate the rise of LLMs? If it is not, what are the Government doing to address the increasingly widespread concerns about the legitimacy of web scraping? Have the Government considered using the Bill to introduce additional powers to protect against the misuse of personal and creative output?
In the meantime, does the Minister accept the amendments in the name of the noble Baroness, Lady Kidron? As we have said, they are only a small part of a much bigger problem, but they are a helpful initiative to build in some basic protections in the use of personal data. This is a real challenge to the Government to step up to the mark and be seen to address these important issues. I hope the Minister will say that he is happy to work with the noble Baroness and others to take these issues forward. We would be doing a good service to data citizens around the country if we did so.
My Lords, UK law enforcement authorities processing personal data for law enforcement purposes currently use internationally based companies for data processing services, including cloud storage. The use of international processors is critical for modern organisations and law enforcement is no exception. The use of these international processors enhances law enforcement capabilities and underpins day-to-day functions.
Transfers from a UK law enforcement authority to an international processor are currently permissible under the Data Protection Act 2018. However, there is currently no bespoke mechanism for these transfers in Part 3, which has led to confusion and ambiguity as to how law enforcement authorities should approach the use of such processors. The aim of this amendment is to provide legal certainty to law enforcement authorities in the UK, as well as transparency to the public, so that they can use internationally based processors with confidence.
I have therefore tabled Amendments 110, 117 to 120, 122 to 129 and 131 to provide a clear, bespoke mechanism in Part 3 of the Data Protection Act 2018 for UK law enforcement authorities to use when transferring data to their contracted processors based outside the UK. This will bring Part 3 into line with the UK GDPR while clarifying the current law, and give UK law enforcement authorities greater confidence when making such transfers to their contracted processors for law enforcement purposes.
We have amended Section 73—the general principles for transfer—to include a specific reference to processors, ensuring that international processors can be a recipient of data transfers. In doing so, we have ensured that the safeguards within Chapter 5 that UK law enforcement authorities routinely apply to transfers of data to their international operational equivalents are equally applicable to transfers to processors. We are keeping open all the transfer mechanisms so that data can be transferred on the basis of an applicable adequacy regulation, the appropriate safeguards or potentially the special circumstances.
We have further amended Section 75—the appropriate safeguards provision—to include a power for the ICO to create, specifically for Part 3, an international data transfer agreement, or IDTA, to complement the IDTA which it has already produced to facilitate transfers using Article 46(2)(d) of the UK GDPR.
In respect of transfers to processors, we have disapplied the duty to inform the Information Commissioner about international transfers made subject to appropriate safeguards. Such a requirement would be out of line with equivalent provisions in the UK GDPR. There is no strong rationale for retaining the provision, given that processors are limited in what they can do with data by the nature of their contracts, and that the requirement would be unlikely to contribute to the effective functioning of the ICO.
Likewise, we have also disapplied the duty to document such transfers and to provide the documentation to the commissioner on request. This is because extending these provisions would duplicate requirements that already exist elsewhere in legislation, including in Section 61, which has extensive recording requirements that enable full accountability to the ICO.
We have also disapplied the majority of Section 78. While it provides a useful function in the context of UK law enforcement authorities transferring to their international operational equivalents, it is not appropriate in the law enforcement to international processor context, because processors cannot decide to transfer data onwards of their own volition. They can do so only under instruction from the UK law enforcement authority controller.
Instead, we have retained the general prohibition on any further transfers to processors based in a separate third country by requiring UK law enforcement authority controllers to make it a condition of a transfer to their processor that data is only to be further transferred in line with the terms of the contract with, or authorisation given by, the controller, and where the further transfer is permitted under Section 73. We have also taken the opportunity to tidy up Section 77, which governs transfers to non-relevant authorities, relevant international organisations or international processors.
In respect of Amendment 121, tabled by the noble Lord, Lord Clement-Jones, on consultation with the Information Commissioner, I reassure the noble Lord that there is a memorandum of understanding between the Home Office and the Information Commissioner regarding international transfers approved by regulations, which sets out the role and responsibilities of the ICO. As part of this, the Home Office consults the Information Commissioner at various stages in the process. The commissioner, in turn, provides independent assurance and advice on the process followed and on the factors taken into consideration.
I understand that this amendment also relates to representations made by the National AIDS Trust. Perhaps the simplest thing is merely to reference my earlier remarks and my commitment to continue engaging with the National AIDS Trust. I beg to move that the government amendments which lead this group stand part of the Bill.
My Lords, very briefly, I thank the Minister for unpacking his amendments with some care, and for giving me the answer to my amendment before I spoke to it—that saves time.
Obviously, we all understand the importance of transfers of personal data between law enforcement authorities, but perhaps the crux of this, and the one question in our minds, is: what is the process—perhaps the Minister could remind us—for making sure that the country to which we are sending data is data adequate? Amendment 121 was tabled as a way of probing that. It would be extremely useful if the Minister could answer it. This should apply to transfers between law enforcement authorities just as much as it does to other, more general transfers under Schedule 5. If the Minister can give me the answer, that would be useful, but if he does not have it to hand, I am very happy to suspend my curiosity until after Easter.
My Lords, I too can be brief, having heard the Minister’s response. I thought he half-shot the Clement-Jones fox, with very good aim on the Minister’s part.
I was simply going to say that it is one in a sea of amendments from the Government, but the noble Lord, Lord Clement-Jones, made an important point about making sure that the countries and organisations that the commissioner looks at meet the test of data adequacy—I also had that in my speaking note. The noble Lord was making a good point about ensuring that appropriate data protections are in place internationally for us to be able to work with our international partners.
The Minister explained the government amendments with some care, but I wonder whether he could explain how data transfers are made to an overseas processor using the powers under new Section 73(4)(aa) of the 2018 Act. That power is used as a condition and justification for several of the noble Lord’s amendments, and I wonder whether he has had to table these amendments because of the original drafting. That would seem to me to be the most likely reason.