Lord Clement-Jones (LD)

My Lords, once more into the trenches we go before Easter. In moving Amendment 53, I will also speak to Amendments 54, 55, 57, 69, 70, 71 and 72 and the Clause 14 stand part notice.

The Bill contains a number of wide delegated powers, giving the Secretary of State the power to amend the UK GDPR via statutory instrument. The Government have said that the UK GDPR’s key elements remain sound and that they want to continue to offer a high level of protection for the public’s data, but that is no guarantee against significant reforms being brought in through a process that eludes full parliamentary scrutiny through primary legislation. Proposed changes to the UK GDPR should be contained in the Bill, where they can be debated and scrutinised properly via the primary legislation process. As it stands, key provisions of the UK GDPR can subsequently be amended via statutory instrument, which, in this case, is an inappropriate legislative process that affords much less scrutiny and debate, if debates are held at all.

The UK GDPR treats a solely automated decision as one without “meaningful human involvement”. The public are protected from being subject to solely automated decision-making where the decision has a legal or “similarly significant effect”. Clause 14(1) inserts new Article 22D(1) into the UK GDPR, which allows the Secretary of State to make regulations that deem a decision to have involved “meaningful human involvement”, even if there was no active review by a human decision-maker. New Article 22D(2) similarly allows the Secretary of State to make regulations to determine whether a decision had a “similarly significant effect” to a legal effect. For example, in summer 2020 there was the A-level algorithm grading scandal. If something like that were to recur, under this new power a Minister could lay regulations stating that the decision to use an algorithm in grading A-levels was not a decision with a “similarly significant effect”.

New Article 22D(4) also allows the Secretary of State to add or remove, via regulations, any of the listed safeguards for automated decision-making. If the Government wish to amend or remove safeguards on automated decision-making, that should also be specified in the Bill and not left to delegated legislation. Amendments 53 to 55 and 69 to 72 would limit the Secretary of State’s power, so that safeguards may be added but those in new Article 22D, as they stand when the legislation comes into force, cannot be varied or removed.

If the clause is to be retained, we support Amendment 59A in the name of the noble Lord, Lord Holmes, which requires the Information Commissioner’s Office to develop guidance on the interpretation of the safeguards in new Article 22C and on important terms such as “similarly significant effect” and “meaningful human involvement”. It is within the Information Commissioner’s Office’s duties to issue guidance and to harmonise the interpretation of the law. As the dedicated regulator, the ICO is best placed and equipped to publish guidance and ensure consistency of application.

As a way to increase protections and incorporate more participation from those affected, Amendment 59A would add a new paragraph (7) to new Article 22D, which specifies that the Secretary of State must consult the Information Commissioner’s Office when developing regulations. It also includes an obligation for the Secretary of State to consult data subjects or their representatives, such as trade unions or civil society organisations, at least every two years from the commencement of the Bill.

Our preference is for Clause 14 not to stand part of the Bill. The deployment of automated decision-making under Clause 14 risks automating harm, including discrimination, without adequate safeguards. Clause 14 creates a new starting point for all automated decision-making, or ADM, using personal, but not special category, data. Such decision-making is allowed, including for profiling, provided that certain safeguards are in place. The Minister said those safeguards are “appropriate” and “robust” and provide “certainty”, but I preferred what the noble Lord, Lord Bassam, said about the clause:

“We need more safeguards. We have moved from one clear position to another, which can be described as watering down or shifting the goalposts”.—[Official Report, 25/3/24; col. GC 150.]


That is very much my feeling about the clause as well.

I refer back to the impact assessment, which we touched on during our debate on Clause 9. It is very interesting that, in table 15 of the impact assessment, the savings on compliance costs are something like £7.3 million as regards AI and machine learning, which does not seem a very big number compared with the total savings on compliance costs, which the Government have put, rather optimistically, at £295 million.

In passing, I should say that, when I look at the savings regarding subject access requests, I see that the figure is £153 million, which is half of those so-called savings on compliance costs. I cannot square that at all with what the Minister says about the total savings on compliance costs for subject access requests being 1%. I do not know quite where those figures come from, but it is a far more significant percentage: it is 50% of what the Government believe the savings on compliance costs will be. I know that it is not part of this group, but I would be very grateful if the Minister could write to clarify that issue in due course.

Although the Minister has called these safeguards adequate, we believe that they are inadequate for three reasons. First, they shift the burden to the individual. Secondly, there is no obligation to provide any safeguards before the decision is made. Neither the Bill nor any of the material associated with it indicates what the content of this information is expected to be, nor the timescales in which that information is to be given. There is nothing to say when representations or contests may be heard, when human intervention may be sought or what the level of that intervention will be. Thirdly, the Secretary of State has delegated powers to vary the safeguards by regulations.

Article 22 is currently one of the strongest prohibitions in the GDPR. As we know, the current starting point is that using solely automated decision-making is prohibited unless certain exemptions apply. The exemptions are limited. Now, as a result of the Government’s changes, you can use solely automated decision-making in an employment context in the UK, which you cannot do in the EU. That is a clear watering down of the restriction. The Minister keeps returning to the safeguards, but I have referred to those. We know that they are not being applied in practice even now and that hiring and firing is taking place without any kind of human review.

There is therefore an entirely inadequate basis on which we can be satisfied that the Bill will safeguard individuals from harmful automated decision-making before it is too late. In fact, the effect of the Bill will be to do the opposite: to permit unfair and unsafe ADM to occur, including discriminatory profiling ADM, which causes harm to individuals. It then places the burden on the individual to complain, without providing for any adequate safeguards to guarantee their ability to do so before the harm is already incurred. While I beg to move Amendment 53, our preference would be that Clause 14 is deleted from the Bill entirely.

Baroness Kidron (CB)

My Lords, I will speak to Amendment 57 in my name, Amendment 59 in the name of the noble Baroness, Lady Jones, and the Clause 14 stand part notice from the noble Lord, Lord Clement-Jones. In doing so, I register my support for Amendment 59A in the name of the noble Lord, Lord Holmes.

The Government assert that there is no diminution of rights in the Bill, yet Clause 14 removes the right not to be subject to an automated decision and replaces that right with inadequate safeguards, as the noble Lord, Lord Clement-Jones, said. On the previous day in Committee, the Minister made the argument that:

“These reforms clarify and simplify the rules related to solely automated decision-making without watering down any of the protections for data subjects or the fundamental data protection principles”,—[Official Report, 25/3/24; col. GC 146.]


but I hope he will at least accept that safeguards do not constitute a right. The fact that the Secretary of State has delegated powers to change the safeguards at will undermines his argument that UK citizens have lost nothing at all; they have lost the right not to be subject to an automated decision.

The fact that the Government have left some guard-rails for special category data is in itself an indication that they know they are downgrading UK data rights, because the safeguards in place are not adequate. If they were adequate, it would be unnecessary to separate out special category data in this way. I hammer the point home by asking the Minister to explain how the protections will work in practice in an era of AI, when risks can come from inference and data analytics that do not use special category data but will still have a profound impact on the working lives, health, finances and opportunities of data subjects. If data about your neighbourhood, shopping habits, search results, steps or entertainment choices is used to infer an important decision, how would a data subject activate their rights in that case?

As an illustration of this point, the daughter of a colleague of mine, who, as it happens, has a deep expertise in data law, this year undertook a video-based interview for a Russell group university with no human contact. It was not yet an ADM system, but we are inching ever closer to it. Removing the right, as the Government propose, would place the onus on students to complain or intervene—in a non-vexatious manner, of course. Will the Minister set out how UK citizens will be protected from life-changing decisions after government changes to Article 22, particularly as, in conjunction with other changes such as subject access requests and data impact assessments, UK citizens are about to have fewer routes to justice and less transparency of what is happening to their data?

I would also be grateful if the Minister could speak to whether he believes that the granularity and precision of current profiling deployed by AI and machine learning is sufficiently guaranteed to take this fundamental right away. Similarly, I hope that the known concerns about bias and fairness in ADM will be resolved over time, but we are not there yet, so why is it that the Government have a wait-and-see policy on regulation but are not offering the same “wait and see” in relation to data rights?

On Amendment 59 in the name of the noble Baroness, Lady Jones, the number of workers anticipated to be impacted by AI is simply eye-watering. In last Friday’s debate on AI, it was said to be 300 million worldwide, and one in four across Europe. But how workers work with AI is not simply a scary vision of the near future; it is here now.

I have a family member who last year left an otherwise well-paid and socially useful job when his employer introduced surveillance on to his computer while he was working from home. At the time, he said that the way in which it impacted on both his self-esteem and his autonomy was so devastating that he felt like

“a cog in a machine or an Amazon worker with no agency or creativity”.

He was an exemplary employee: top of the bonus list and in all measurable ways the right person in the right job. Efficiency in work has a vital role but it is not the whole picture. We know that, if able and skilled workers lose their will to work, it comes at a considerable cost to the well-being of the nation and the public purse. Most jobs in future will involve working with or even collaborating with technology; ensuring that work is dignified and fair to the human components of this arrangement is not a drag on productivity but a necessity if society is to benefit from changes to technology.

--- Later in debate ---
Viscount Camrose (Con)

Certainly. Being prescriptive and applying one-size-fits-all measures for all processes covered by the Bill encourages organisations merely to follow a process, whereas focusing on outcomes encourages organisations to take better ownership of those outcomes and pursue the optimal privacy and safety mechanisms for their circumstances. That message came out very strongly in the Data: A New Direction consultation. Indeed, in the debate on a later group we will discuss the use of senior responsible individuals rather than data protection officers, which is a good example of removing prescriptiveness to enhance adherence to the overall framework and enhance safety.

Baroness Kidron (CB)

This seems like a very good moment to ask whether, if the variation is based on outcome and necessity, the Minister agrees that the higher bar of safety for children should be specifically required as an outcome.

Viscount Camrose (Con)

I absolutely agree about the outcome of higher safety for children. We will come to debate whether the mechanism for determining or specifying that outcome is writing that down specifically, as suggested.

Baroness Kidron (CB)

I am sure the Minister knew I was going to stand up to say that, if it is not part of the regulatory instruction, it will not be part of the outcome. The point of regulation is to determine a floor—never a ceiling—below which people cannot go. Therefore, if we wish to safeguard children, we must have that floor as part of the regulatory instruction.

Viscount Camrose (Con)

Indeed. That may well be the case, but how that regulatory instruction is expressed can be done in multiple ways. Let me continue; otherwise, I will run out of time.

--- Later in debate ---
Viscount Camrose (Con)

Let me make the broad point that there is no single list of outcomes for the whole Bill but, as we go through it clause by clause, I hope that the philosophy behind it, of being less prescriptive about process and more prescriptive about the results of the process that we desire, will emerge—not just on Clause 14 but as the overall philosophy underlying the Bill. Regulation-making powers can also be used to vary the existing safeguards, add additional safeguards and remove additional safeguards added at a later date.

On the point about having regard, it is important that the law is drafted in a way that allows it to adapt as technology advances. Including prescriptive requirements in the legislation reduces this flexibility and undermines the purpose of this clause and these powers, which is to provide additional legal clarity when deemed necessary and appropriate in the light of fast-moving advances in, and adoption of, technologies relevant to automated decision-making. I would like to reassure noble Lords that the powers can be used only to vary the existing safeguards, add additional ones and remove additional safeguards added later by regulation. They cannot remove any of the safeguards written into the legislation.

Amendments 53 to 55 and 69 to 71 concern the Secretary of State powers relating to the terms “significant decisions” and “meaningful human involvement”. These powers enable the Secretary of State to provide a description of decisions that do or do not have a significant effect on data subjects, and describe cases that can be taken to have, or not to have, meaningful human involvement. As technology adoption grows and new technologies emerge, these powers will enable the Government to provide legal clarity, if and when deemed necessary, to ensure that people are protected and have access to safeguards when they matter most. In respect of Amendment 59A, Clause 50 already provides for an overarching requirement for the Secretary of State to consult the ICO and other persons the Secretary of State considers appropriate before making regulations under the UK GDPR, including for the measures within Article 22.

Also, as has been observed—I take the point about the limitations of this, but I would like to make the point anyway—any changes to the regulations are subject to the affirmative procedure and so must be approved by both Houses. As with other provisions of the Bill, the ICO will seek to provide organisations with timely guidance and support to assist them in interpreting and applying the legislation. As such, I ask the noble Lord, Lord Clement-Jones, and my noble friend Lord Holmes—were he here—not to press their amendments.

Amendment 57 in the name of the noble Baroness, Lady Kidron, seeks to ensure that, when exercising regulation-making powers in relation to the safeguards in Article 22 of the UK GDPR, the Secretary of State should uphold the level of protection that children are entitled to in the Data Protection Act 2018. As I have said before, Clause 50 requires the Secretary of State to consult the ICO and other persons he or she considers appropriate. The digital landscape and its technologies evolve rapidly, presenting new challenges in safeguarding children. Regular consultations with the ICO and stakeholders ensure that regulations remain relevant and responsive to emerging risks associated with solely automated decision-making. The ICO has a robust position on the protection of children, as evidenced through its guidance and, in particular, the age-appropriate design code. As such, I ask the noble Baroness not to press her amendment.

Amendments 58, 72 and 73 seek to prevent the Secretary of State varying any of the safeguards mentioned in the reformed clauses. As I assured noble Lords earlier, the powers in this provision can be used only to vary the existing safeguards, add additional safeguards and remove additional safeguards added by regulation in future; there is not a power to remove any of the safeguards.

Baroness Kidron (CB)

I apologise for breaking the Minister’s flow, especially as he had moved on a little, but I have a number of questions. Given the time, perhaps he can write to me to answer them specifically. They are all designed to show the difference between what children now have and what they will have under the Bill.

I have to put on the record that I do not accept what the Minister just said—that, without instruction, the ICO can use its old instruction to uphold the current safety for children—if the Government are taking the instruction out of the Bill and leaving it with the old regulator. I ask the Minister to tell the Committee whether it is envisaged that the ICO will have to rewrite the age-appropriate design code to marry it with the new Bill, rather than it being the reason why it is upheld. I do not think the Government can have it both ways where, on the one hand, the ICO is the keeper of the children, and, on the other, they take out things that allow the ICO to be the keeper of the children in this Bill.

Viscount Camrose (Con)

I absolutely recognise the seriousness and importance of the points made by the noble Baroness. Of course, I would be happy to write to her and to meet her, as I would for any Member of the Committee, to give—I hope—more satisfactory answers on these important points.

As an initial clarification before I write, it is perhaps worth me saying that the ICO has a responsibility to keep guidance up to date but, because it is an independent regulator, it is not for the Government to prescribe this, but rather to allow it the flexibility to do so. As I say, I will write and set out that important point in more detail.

Amendment 59 relates to workplace rights. I reiterate that the existing data protection legislation and our proposed reforms—

--- Later in debate ---
I am grateful for the support of the noble Baroness, Lady Bennett, for this particular amendment, alongside another noble Lord who no doubt will reveal themself when I finally find my way through this list of amendments. In the meantime, I beg to move.
Baroness Kidron (CB)

My Lords, I speak to Amendment 144 in my name, which is supported by the noble Baronesses, Lady Harding and Lady Jones, and the noble Lord, Lord Clement-Jones. The amendment would introduce a code of practice on children and AI. Before I speak to it, I declare an interest: I am working with academic NGO colleagues in the UK, EU and US on such a code, and I am part of the UN Secretary-General’s AI advisory body’s expert group, which is currently working on sections on both AI and children and AI and education.

AI drives the recommender systems that determine all aspects of a child’s digital experience, including the videos they watch, their learning opportunities, people they follow and products they buy. But it no longer concerns simply the elective parts of life where, arguably, a child—or a parent on their behalf—can choose to avoid certain products and services. AI is invisibly and ubiquitously present in all areas of their lives, and its advances and impact are particularly evident in the education and health sectors—the first of which is compulsory and the second of which is necessary.

The proposed code has three parts. The first requires the ICO to create the code and sets out expectations of its scope. The second considers who and what should be consulted and considered, including experts, children and the frameworks that codify children’s existing rights. The third defines elements of the process, including risk assessment, defines language and puts the principles to which the code must adhere in the Bill.

I am going to get my defence in early. I anticipate that the Minister will say that the ICO has published guidance, that we do not want to exclude children from the benefits of AI and that we are in a time of “wait and see”. He might even ask why children need something different or why the AADC, which I mention so frequently, is not sufficient. Let me take each of those in turn.

On the sufficiency of the current guidance, the ICO’s non-binding Guidance on AI and Data Protection, which was last updated on 15 March 2023, has a single mention of a child in its 140 pages, in a case study about child benefits. The accompanying AI and data protection toolkit makes no mention of children, nor does the ICO’s advice to developers on generative AI, issued on 3 April 2023. There are hundreds of pages of guidance but it fails entirely to consider the specific needs of children, their rights, their development vulnerabilities or that their lives will be entirely dominated by AI systems in a way that is still unimaginable to those in this Room. Similarly, there is little mention of children in the Government’s own White Paper on AI. The only such references are limited to AI-generated child sexual abuse material; we will come to that later when we discuss Amendment 291. Even the AI summit had no main-stage event relating to children.

Of course we do not want to exclude children from the benefits of AI. A code on the use of children’s data in the development and deployment of AI technology increases their prospects of enjoying the benefits of AI while ensuring that they are protected from the pitfalls. Last week’s debate in the name of the noble Lord, Lord Holmes, showed the broad welcome of the benefits while urgently speaking to the need for certain principles and fundamental protections to be mandatory.

As for saying, “We are in a time of ‘wait and see’”, that is not good enough. In the course of this Committee, we will explore edtech that has only advertising and no learning content, children being left out of classrooms because their parents will not accept the data leaks of Google Classroom, social media being scraped to create AI-generated CSAM and how rapid advances in generative AI capabilities mark a new stage in its evolution. Some of the consequences of that include ready access to models that create illegal and abusive material at scale and chatbots that offer illegal or dangerous advice. Long before we get on to the existential threat, we have “here and now” issues. Childhood is a very short period of life. The impacts of AI are here and now in our homes, our classrooms, our universities and our hospitals. We cannot afford to wait and see.

Children are different for three reasons. First, as has been established over decades, there are ages and stages at which children are developmentally able to do certain things, such as walk, talk, understand risk and irony, and learn different social skills. This means that, equally, there are ages and stages at which they cannot do that. The long-established consensus is that family, social groups and society more broadly—including government—step in to support that journey.

Secondly, children have less voice and less choice about how and where they spend their time, so the places and spaces that they inhabit have to be fit for childhood.

Thirdly, we have a responsibility towards children that extends even beyond our responsibilities to each other; this means that it is not okay for us to legitimise profit at their expense, whether it is allowing an unregulated edtech market that exploits their data and teaches them nothing or the untrammelled use of their pictures to create child sexual abuse material.

Finally, what about the AADC? I hope that, in the course of our deliberations, we will put that on a more secure footing. The AADC addresses recommender systems in standard 12. However, the code published in August 2020 does not address generative AI which, as we have repeatedly heard, is a game-changer. Moreover, the AADC is currently restricted to information society services, which leaves a gaping hole. This amendment would address this gap.

There is an argument that the proposed code could be combined with the AADC as an update to its provisions. However, unless and until we sort out the status of the AADC in relation to the Bill, an AI kids code would be better formed as a stand-alone code. A UK code of practice on children and AI would ensure that data processors consider the fundamental rights and freedoms of children, including their safety, as they develop their products and perhaps even give innovators the appetite to innovate with children in mind.

As I pointed out at the beginning, there are many people globally working on this agenda. I hope that, as we are the birthplace of the AADC and the Online Safety Act, the Government will adopt this suggestion and again be a forerunner in child privacy and safety. If, however, the Minister once again says that protections for children are not necessary, let me assure him that they will be put in place by others, and we will be a rule taker, not a rule maker.

Baroness Bennett of Manor Castle (GP)

My Lords, I rise with the advantage over the noble Lord, Lord Clement-Jones, in that I will speak to only one amendment in this group; I therefore have the right page in front of me and can note that I will speak to Amendment 252, tabled by the noble Lord, Lord Clement-Jones, and signed by me and the noble Lords, Lord Watson of Wyre Forest and Lord Maude of Horsham.

I apologise that I was not with the Committee earlier today, but I was chairing a meeting about the microbiome, which was curiously related to this Committee. One issue that came up in that meeting was data and data management and the great uncertainties that remain. For example, if a part of your microbiome is sampled and the data is put into a database, who owns that data about your microbiome? In fact, there is no legal framework at the moment to cover this. There is a legal framework covering your genome, but not your microbiome. That is a useful illustration of how fast this whole area is moving and how fast technology, science and society are changing. I should say that I do not blame the Government for this gaping hole, as it is an international one. It is a demonstration of how we as legislators and regulators need to race to catch up to deal with the problem.

This relates to Amendment 252 in the sense that perhaps this is an issue that has arisen over time, kind of accidentally. However, I want to credit a number of campaigners, among them James O’Malley, who drew my attention to this issue, as well as Peter Wells, Anna Powell-Smith and Hadley Beeman. They are people who have seen a really simple and basic problem in the way that regulation is working and are reaching out, including, I am sure, to many noble Lords in this Committee. This is a great demonstration of how campaigning has at least gone part of the way to working. I very much hope that, if not today, then some time soon, we can see this working.

What we are talking about here, as the noble Lord, Lord Clement-Jones, said, is the postal address file. It is held as a piece of private property by Royal Mail. It is important to stress that this is not people’s private information or who lives at what address; it is about where the address is. As the noble Lord, Lord Clement-Jones, set out, all kinds of companies have to pay Royal Mail to have access to this basic information about society, basic information that is assembled by society, for society.

The noble Lord mentioned Amazon having to pay for the file. I must admit that I feel absolutely no sympathy there. I am no fan of the great parasite. It is an interesting contrast to think of Amazon paying, but also to think of an innovative new start-up company, which wants to be able to access and reach people to deliver things to their homes. For this company, the cost of acquiring this file could be prohibitive. It could stop it getting started and competing against Amazon.

--- Later in debate ---
Viscount Camrose (Con)

I believe that the AADC already has statutory standing.

Baroness Kidron (CB)

On that point, I think that the Minister said—forgive me if I am misquoting him—risk, rules and rights, or some list to that effect. While the intention of what he said was that we have to be careful where children are using these services, and that the ICO has to make them aware of the risks, the purpose of a code—whether it is part of the AADC or stand-alone—is to put those responsibilities on the designers of services, products and so on by default. It is upstream where we need the action, not downstream, where the children are.

Viscount Camrose (Con)

Yes, I entirely agree with that, but I add that we need it upstream and downstream.

For the reasons I have set out, the Government do not believe that it would be appropriate to add these provisions to the Bill at this time without further detailed consultation with the ICO and the other organisations involved in regulating AI in the United Kingdom. Clause 33—

Baroness Kidron (CB)

Can we agree that there will be some discussions with the ICO between now and Report? If those take place, I will not bring this point back on Report unnecessarily.

Viscount Camrose (Con)

Yes, I am happy to commit to that. As I said, we look forward to talking with the noble Baroness and others who take an interest in this important area.

Clause 33 already includes a measure that would allow the Secretary of State to request the ICO to publish a code on any matter that she sees fit, so this is an issue that we could return to in the future, if the evidence supports it, but, as I said, we consider the amendments unnecessary at this time.

Finally, Amendment 252 would place a legislative obligation on the Secretary of State regularly to publish address data maintained by local authorities under open terms—that is, accessible by anyone for any purpose and for free. High-quality, authoritative address data for the UK is currently used by more than 50,000 public and private sector organisations, which demonstrates that current licensing arrangements are not prohibitive. This data is already accessible for a reasonable fee from local authorities and Royal Mail, with prices starting at 1.68p per address or £95 for national coverage.

--- Later in debate ---
Moved by
79: Clause 15, page 30, line 37, at end insert—
“(ba) in paragraph 3(c) for “Article 32” substitute “Articles 25 and 32””

Member’s explanatory statement
This amendment would add data protection by design as an additional measure for processors, to ensure that they are accountable for the design of their systems and services, noting the challenge that controllers often face when engaging processors for services such as AI and cloud computing and what influence they can have on the design.
Baroness Kidron (CB)

My Lords, I will speak to a number of amendments in this group—Amendments 79, 83, 85, 86, 96, 97, 105 and 107.

Amendment 79 proposes an addition to the amendments to Article 28 of the UK GDPR in Clause 15(4). Article 28 sets out the obligations on processors when processing personal data on behalf of controllers. Currently, paragraph 3(c) requires processors to comply with Article 32 of the UK GDPR, which relates to data security. Amendment 79 adds the requirement for processors also to comply with the privacy-by-design provision in Article 25. Article 25 requires controllers to

“at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects”.

I am not proposing an abdication of responsibility by the controller when it instructs a processor to act on its behalf but, in practice, it is hard for a controller to meet this responsibility at the time of processing if it has delegated the processing to a third party that is not bound by the same requirement. I am not normally associated with the edtech sector, but the amendment is of particular importance to it, since schools are the controllers while children’s data is processed by edtech providers.

The amendment ensures that processors would be contractually committed to complying with Article 25. It is particularly relevant to situations where controllers procure AI systems, including facial recognition technology and edtech products. It would be helpful in both the public and private sectors and would address the power asymmetry between controller and processor when the processor is a multinational and solutions are often presented on a take-it-or-leave-it basis.

I hope noble Lords will forgive me if I take Amendment 97 out of turn, as all the others in my name relate to children’s data, whereas Amendment 97, like Amendment 79, applies to all data subjects. Amendment 97 would require public bodies to publish risk assessments to create transparency and accountability. This would also place in statute a provision that is already contained in the ICO’s freedom of information publication scheme guidance. The amendment would also require the Cabinet Office to create and maintain an accessible register of public sector risk assessments to improve accountability.

In the last group, we heard that the way in which public bodies collect and process personal data has far-reaching consequences for all of us. I was moved to lay this amendment after witnessing some egregious examples from the education system. The public have a right to know how bodies such as health authorities, schools, universities, police forces, local authorities and government departments comply with their obligations under UK data law. This amendment is simply about creating trust.

The child-related amendments in this group are in my name and those of the noble Lord, Lord Clement-Jones, and the noble Baronesses, Lady Harding and Lady Jones. Clause 17 sets out the obligations for the newly created role of “senior responsible individual”, which replaces the GDPR requirement to appoint a data protection officer. The two roles are not equivalent: a DPO is an independent adviser to senior management, while a senior responsible individual would be a member of senior management. Amendment 83 would ensure that those appointed senior responsible individuals have an understanding of the heightened risks and the protections to which children are entitled.

Over the years, I have had many conversations with senior executives at major tech companies and, beyond the lines prepared by their public affairs teams, their understanding of children’s protection is often superficial and their grasp of key issues very limited. In fact, if I had a dollar for every time a tech leader, government affairs person or engineer has said, “I never thought of it that way before”, I would be sitting on quite a fortune.

Amendment 83 would simply ensure that a senior leader who is tasked with overseeing compliance with UK data law knows what he or she is talking about when it comes to children’s privacy, and that it informs the decisions they make. It is a modest proposal, and I hope the Minister will find a way to accept it.

Amendments 85 and 86 would require a controller to consider children’s right to higher standards of privacy than adults for their personal data when carrying out its record-keeping duties. Specifically, Amendment 85 sets out what is appropriate when maintaining records of high-risk processing and Amendment 86 relates to processing that is not high risk. Creating an express requirement to include consideration of these rights in a data controller’s record-keeping obligation is a simple but effective way of ensuring that systems and processes are designed with the needs and rights of children front of mind.

Clause 20 is one of the many fault lines where the gap is clear between the assurances given that children will be just as safe and the words on the page. I make clear that the amendments to Clause 18 that I put forward are, as the noble Lord, Lord Clement-Jones, said on Monday, belt and braces. They do not reach the standard of protection that children currently enjoy under the risk-assessment provisions in Article 35 of the UK GDPR and the age-appropriate design code.

A comparison of what controllers must include in a data protection impact assessment under Article 35(7) and what they would need to cover in an assessment of high-risk processing under Clause 20(3)(d) shows the inadequacies of the latter. Instead of a controller having to include

“a systematic description of the envisaged processing operations and the purposes of the processing, including, where applicable, the legitimate interest pursued by the controller”,

under the Bill, the controller needs to include only

“a summary of the purposes of the processing”.

They need to include no systematic description—just a summary. There is no obligation to include information about the processing operations or to explain when and how the controller has determined they are entitled to rely on legitimate interest purpose. Instead of

“an assessment of the necessity and proportionality of the processing operations in relation to the purposes”,

under the Bill, a controller needs to assess only necessity, not proportionality. Instead of

“an assessment of the risks to the rights and freedoms of data subjects”,

under the Bill, a controller does not need to consider rights and freedoms.

As an aside, I note that this conflicts with the proposed amendments to Section 64 of the Data Protection Act 2018 in Clause 20(7)(d), which retains the “rights and freedoms” wording but otherwise mirrors the new downgraded requirements in Clause 20(3)(d). I would be grateful for clarification from the Minister on this point.

Instead of requiring the controller to include information about

“the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with this Regulation taking into account the rights and legitimate interests of data subjects and other persons concerned”,

as currently prescribed in Article 35, under the Bill, the controller needs to provide only

“a description of how the controller proposes to mitigate those risks”.

The granularity of what is currently required is replaced by a generalised reference to “a description”. These are not the same bar. My argument throughout Committee is that we need to maintain the bar for processing children’s data.

--- Later in debate ---
Lord Harlech (Con)

My Lords, just for clarification, because a number of questions were raised, if the Committee feels that it would like to hear more from the Minister, it can. It is for the mood of the Committee to decide.

Baroness Kidron (CB)

I would like to hear from the Minister.

Lord Clement-Jones (LD)

Yes. We will not stand on ceremony.

Viscount Camrose (Con)

I apologise for going over. I will try to be as quick as possible.

I turn now to the amendments on the new provisions on assessments of high-risk processing in Clause 20. Amendments 87, 88, 89, 91, 92, 93, 94, 95, 97, 98 and 101 seek to reinstate requirements in new Article 35 of the UK GDPR on data protection impact assessments, and, in some areas, make them even more onerous for public authorities. Amendment 90 seeks to reintroduce a list of high-risk processing activities drawn from new Article 35, with a view to help data controllers comply with the new requirements on carrying out assessments of high-risk processing.

Amendment 96, tabled by the noble Baroness, Lady Kidron, seeks to amend Clause 20, so that, where an internet service is likely to be accessed by children, the processing is automatically classed as high risk and the controller must do a children’s data protection impact assessment. Of course, I fully understand why the noble Baroness would like those measures to apply automatically to organisations processing children’s data, and particularly to internet services likely to be accessed by children. It is highly likely that many of the internet services that she is most concerned about will be undertaking high-risk activities, and they would therefore need to undertake a risk assessment.

Under the current provisions in Clause 20, organisations will still have to undertake risk assessments where their processing activities are likely to pose high risks to individuals, but they should have the ability to assess the level of risk based on the specific nature, scale and context of their own processing activities. Data controllers do not need to be directed by government or Parliament about every processing activity that will likely require a risk assessment, but the amendments would reintroduce a level of prescriptiveness that we were seeking to remove.

Clause 20 requires the ICO to publish a list of examples of the types of processing activities that it considers would pose high risks for the purposes of these provisions, which will help controllers to determine whether a risk assessment is needed. This will provide organisations with more contemporary and practical help than a fixed list of examples in primary legislation could. The ICO will be required to publish a document with a list of examples that it considers to be high-risk processing activities, and we fully expect the vulnerability and age of data subjects to be a feature of that. The commissioner’s current guidance on data protection impact assessments already describes the use of the personal data of children or other vulnerable individuals for marketing purposes, profiling or offering internet services directly to children as examples of high-risk processing, although the Government cannot of course tell the ICO what to include in its new guidance.

Similarly, in relation to Amendments 99, 100 and 102 from the noble Baroness, Lady Jones, it should not be necessary for this clause to specifically require organisations to consider risks associated with automated decision-making or obligations under equalities legislation. That is because the existing clause already requires controllers to consider any risks to individuals and to describe

“how the controller proposes to mitigate those risks”.

I am being asked to wrap up and so, in the interests of time, I shall write with my remaining comments. I have no doubt that noble Lords are sick of the sound of my voice by now.

Baroness Kidron (CB)

My Lords, I hope that no noble Lord expects me to pull all that together. However, I will mention a couple of things.

With this group, the Minister has finally given all the reasons why everything will be different and less. Those responsible for writing the Minister’s speeches should be more transparent about the Government’s intention, because “organisations are best placed to determine what is high-risk”—not the ICO, not Parliament, not existing data law. Organisations also act in their own interests. They are “best placed to decide on their representation”, whether it is here or there and whether it speaks English or not, and they “get to decide whether they have a DPO or a senior responsible individual”. Those are three quotes from the Minister’s speech. If organisations are in charge of the bar of data protection and the definition of data protection, I do believe that this is a weakening of the data protection regime. He also said that organisations are responsible for the quality of their risk assessments. That makes four places in this group alone.

At the beginning, the noble Baroness, Lady Harding, talked about the trust of consumers and citizens. I do not think that this engenders trust. The architecture is so keen to get rid of ways of accessing rights that some organisations may have to have both a DPO and a DPIA—a doubling rather than a reduction of the burden. Very early on—it feels a long time ago—a number of noble Lords talked about the granular detail. I tried in my own contribution to show how very different it is in detail. So I ask the Minister to reflect on the assertion that you can take out the detail and have the same outcome. All the burden being removed is on one side of the equation, just as we enter a world in which AI, which is built on people’s data, is coming in the other direction.

I will of course withdraw my amendment, but I believe that Clauses 20, 18 and the other clauses we just discussed are deregulation measures. That should be made clear from the Dispatch Box, and that is a choice that the House will have to make.

Before I sit down, I do want to recognise one thing, which is that the Minister said that he would work alongside us between now and Report; I thank him for that, and I accept that. I also noted that he said that it was a responsibility to take care of children by default. I agree with him; I would like to see that in the Bill. I beg leave to withdraw my amendment.

Amendment 79 withdrawn.
--- Later in debate ---
Moved by
103: Clause 20, page 41, line 34, at end insert—
“(e) a description of how the controller will enforce purpose limitation, and
(f) evidence of how individual information rights are enabled at the point of collection and after processing (if subsection (3A) is not routinely applied)”

Member’s explanatory statement

Large language models are accessing data that includes personal data. There are existing web protocols that can prevent this, but they are little known, difficult to navigate and require an opt-out. This amendment and another in my name to Clause 20 would require either proof of legitimate interest or prior permission from data subjects, unless the company routinely gives an easily accessible, machine-readable opt-in.
Baroness Kidron (CB)

My Lords, I am somewhat disappointed to be talking to these amendments in the dying hours of our Committee before we take a break because many noble Lords—indeed, many people outside the House—have contacted me about them. I particularly want to record the regret of the noble Lord, Lord Black, who is a signatory to these amendments, that he is unable to be with us today.

The battle between rights-holders and the tech sector is nothing new. Many noble Lords will remember the arrival and demise of the file-sharing platform Napster and the subsequent settlement between the sector and the giant creative industries. Napster argued that it was merely providing a platform for users to share files and was not responsible for the actions of its users; the courts sided with the music industry, and Napster was ordered to shut down its operations in 2001. The “mere conduit” argument was debunked two decades ago. To the frustration of many of us, the lawsuits led to a perverse outcome whereby violent bullying or sexually explicit content would be left up for days, weeks or forever, while a birthday video with the temerity to have music in the background would be deleted almost immediately.

The emergence of the large language models—LLMs—and the desire on the part of LLM developers to scrape the open web to capture as much text, data and images as possible raise some of the same issues. The scale of scraping is, by their own admission, unprecedented, and their hunger for data at any cost in an arms race for AI dominance is publicly acknowledged, setting up a tension between the companies that want the data and data subjects and creative rights holders. A data controller who publishes personal data as part of a news story, for example, may do so on the basis of an exemption under data protection law for journalism, only for that data to be scraped and commingled with other data scraped from the open web to train an LLM.

This raises issues of copyright infringement. More importantly, whether for individuals, creative communities or businesses that depend on the value of what they produce, these scraping activities happen invisibly. Anonymous bots acting on behalf of AI developers, or conducting a scrape as a potential supplier to AI developers, are scraping websites without notifying data controllers or data subjects. In doing so, they are also silent on whether processes are in place to minimise risks or balance competing interests, as required by current data law.

Amendment 103 would address those risks by requiring documentation and transparency. Proposed new paragraph (e) would require an AI developer to document how the data controller will enforce purpose limitation. This is essential, given that invisible data processing enabled through web scraping can pick up material that is published for a legitimate purpose, such as journalism, but the combination of such information with other data accessed through invisible data processing could change the purpose and application of that data in ways that the individual may wish to object to using their existing data rights. Proposed new paragraph (f) would require a data processor seeking to use legitimate interest as the basis for web scraping and invisible processing to build LLMs to document evidence of how they have ensured that individual information rights have been enabled at the point of collection and after processing.

Together, those proposed new paragraphs would mean that anyone who scrapes web data must be able to show that the data subjects have meaningful control and can access their information rights ahead of processing. These would be mandatory, unless they have incorporated an easily accessible machine-readable protocol on an opt-in basis, which is then the subject of Amendment 104.

Amendment 104 would require web scrapers to establish an easily accessible machine-readable protocol that works on an opt-in basis rather than the current opt-out. Undoubtedly, the words “easily”, “accessible”, “machine readable” and “web protocols” would all benefit from guidance from the ICO but, for the absence of doubt, the intention of the amendment is that a web scraper would proactively notify individuals and website owners that scraping of their data will take place, including stating the identity of the data processor and the purpose for which that data is to be scraped. In addition, the data processor will provide information on how data subjects and data controllers can exercise their information rights to opt out of their data being scraped before any such scraping takes place, with an option to object after the event if taken without permission.

We are in a situation in which not only is IP being taken at scale, potentially impoverishing our very valuable creative industries, journalism and academic work that is then regurgitated inaccurately, but which is making a mockery of individual data rights. In its recent consultation into the lawful basis for web scraping, the ICO determined that use of web-scraped data

“can be feasible if generative AI developers take their legal obligations seriously and can evidence and demonstrate this in practice”.

These amendments would operationalise that demonstration. As it stands, there is routine failure, particularly regarding new models. For example, the ICO issued a preliminary enforcement notice against Snap on the basis that the risk assessment for its AI tool was inadequate.

Noble Lords will appreciate the significance of the connection that the ICO draws between innovative technology and children’s personal data, given the heightened data rights and protections that children are afforded under the age-appropriate design code. While I welcome the ICO’s action, copyright holders have been left to fend for themselves, since government talks have failed, and individual data subjects are left exposed. Whether it is the scraping of social media or of work and school websites, such cases will not be pursued by the ICO, because regulating action in such small increments is disproportionate, yet this lack of compliance is happening at scale.

--- Later in debate ---
For these reasons, I am not able to accept these amendments. I am of course willing to continue to engage with all Members of the Committee, but I hope that the noble Baroness will withdraw her amendment.
Baroness Kidron (CB)

My Lords, I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Jones, for their support. I will read the Minister’s speech, because this is a somewhat technical matter. I am not entirely sure that I agree with what he said, but I am also not sure that I could disagree with it adequately in the moment.

I will make two general points, however. First, I hear the Minister loud and clear on the question of the Government’s announcement on AI and IP but, at the beginning of my speech, I referenced Napster and how we ended up where we are with personal data. The big guys won the battle for copyright, so we will see the likes of the New York Times, EMI and so on winning this battle, but small creatives and individuals will not be protected. I hope that, when that announcement comes, it includes the personal data issue as well.

Secondly, I say to the Minister that, if the regime is working now in the way that he outlined from the ICO, I do not think anybody believes it is working very well. Either the ICO needs to do something, or we need to do something in this Bill. If not, we are letting all our data be taken for free to build the new world, with no permission.

I know that the noble Viscount is interested in this area. It is one in which we could be creative. I suggest that we try to solve the conundrum about whether the ICO is not doing its work or we are not doing ours. I beg leave to withdraw my amendment.

Amendment 103 withdrawn.