Committee (3rd Day)
15:45
Relevant documents: 3rd Report from the Constitution Committee and 9th Report from the Delegated Powers Committee. Scottish, Welsh and Northern Ireland Legislative Consent sought.
The Deputy Chairman of Committees (Lord Haskel) (Lab)

My Lords, if there is a Division in the Chamber while we are sitting, the Committee will adjourn as soon as the Division Bells are rung and resume after 10 minutes.

Debate on Amendment 87 resumed.
Lord Clement-Jones (LD)

My Lords, in continuing on this group, I will speak to the question of whether Clause 78 should stand part and to Amendments 107, 109, 125, 154, 155 and 156, but I start by supporting Amendment 87 in the name of the noble and learned Lord, Lord Thomas of Cwmgiedd. We had a masterclass from him last Tuesday and he made an extremely good case for that amendment, which is very elegant.

The previous Government deleted the EU Charter of Fundamental Rights from the statute book through the Retained EU Law (Revocation and Reform) Act 2023, and this Bill does nothing to restore it. Although references in the UK GDPR to fundamental rights and freedoms are now to be read as references to the ECHR as implemented through the Human Rights Act 1998, the Government’s ECHR memorandum states:

“Where processing is conducted by a private body, that processing will not usually engage convention rights”.


As the noble and learned Lord mentioned, this could leave a significant gap in protection for individuals whose data is processed by private organisations and will mean lower data protection rights in the UK compared with the EU, so these Benches strongly support his Amendment 87, which would apply the convention to private bodies where personal data is concerned. I am afraid we do not support Amendments 91 and 97 from the noble Viscount, Lord Camrose, which seem to hanker after the mercifully defunct DPDI.

We strongly support Amendments 139 and 140 from the noble Baroness, Lady Kidron. Data communities are one of the important omissions from the Bill. Where are the provisions that should be there to support data-sharing communities and initiatives such as Solid? We have been talking about data trusts and data communities since as long ago as the Hall-Pesenti review. Indeed, it is interesting that the Minister herself only this April said in Grand Committee:

“This seems to be an area in which the ICO could take a lead in clarifying rights and set standards”.


Indeed, she put forward an amendment, saying:

“Our Amendment 154 would therefore set a deadline for the ICO to do that work and for those rights to be enacted. The noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, made a good case for broadening these rights in the Bill and, on that basis, I hope the Minister will agree to follow this up, and follow up his letter so that we can make further progress on this issue”.—[Official Report, 17/4/24; col. GC 322.]


I very much hope that, now the tables are turned, so to speak, the Minister will take that forward herself in government.

Amendments 154, 155 and 156 deal with the removal of the principle of the supremacy of EU law. They are designed to undo the lowering of the standard of data protection rights in the UK brought about by the REUL Act 2023. The amendments would apply the protections required in Article 23.2 of the UK GDPR to all the relevant exceptions in Schedules 2 to 4 to the Data Protection Act 2018. This is important because data adequacy will be lost if the standard of protection of personal data in the UK is no longer essentially equivalent to that in the EU.

The EU’s adequacy decision stated that it did not apply in the area of immigration and referred to the case of Open Rights Group v the Secretary of State for the Home Department in the Court of Appeal. This case was brought after the UK left the EU, but before the REULA came into effect. The case is an example of how the preservation of the principle of the supremacy of EU law continued to guarantee high data protection standards in the UK, before this principle was deleted from the statute book by the REULA. In broad terms, the Court of Appeal found that the immigration exception in Schedule 2 to the Data Protection Act 2018 conflicted with the safeguards in Article 23 of the UK GDPR. This was because the immigration exemption was drafted too broadly and failed to incorporate the safeguards prescribed for exemptions under Article 23.2 of the UK GDPR. It was therefore held to be unlawful and was disapplied.

The Home Office redrafted the exemption to make it more protective, but it took several attempts to bring forward legislation which provided sufficient safeguards for data subjects. The extent of the safeguards now set out in the immigration exemption underscores both what is needed for compatibility with Article 23.2 of the UK GDPR and the deficiencies in the rest of the Schedule 2 exemptions. It is clear when reading the judgment in the Open Rights case that the majority of the exemptions from data subject rights under Schedule 2 to the Data Protection Act fail to meet the standards set out in Article 23.2 of the UK GDPR. The deletion of the principle of the supremacy of EU law has removed the possibility of another Open Rights-style challenge to the other exemptions in Schedule 2 to the Data Protection Act 2018. I hope that, ahead of the data adequacy discussions with the Commission, the Government’s lawyers have had a good look at the amendments that I have tabled, drafted by a former MoJ lawyer.

The new clause after Clause 107 in Amendment 154 applies the new protections added to the immigration exemption to the whole of Schedule 2 to the DPA 2018, with the exception of the exemptions that apply in the context of journalism or research, statistics and archiving. Unlike the other exemptions, those already contain detailed safeguards.

Amendment 155 is a new clause extending the new protections which apply to the immigration exemption to Schedule 3 to the DPA 2018, and Amendment 156 is another new clause applying those protections to Schedule 4 to the DPA 2018.

As regards Amendment 107, the Government need to clarify how data processing under recognised legitimate interests is compatible with the conditions for data processing under the existing lawful bases, including those for the special categories of personal data under Articles 5 and 9 of the UK GDPR. The Bill lowers the standard of protection of personal data where data controllers have to provide personal data based only on

“a reasonable and proportionate search”.

The lack of clarity on what “reasonable” and “proportionate” mean in the context of data subject requests creates legal uncertainty for data controllers and organisations, specifically regarding whether the data subject’s views on the matter need to be accounted for when responding to requests. This is a probing amendment which requires the Secretary of State to explain why the existing lawful bases for data processing are inadequate for the processing of personal data when additional recognised legitimate interests are introduced. It also requires the Secretary of State to publish guidance within six months of the Act’s passing to clarify what constitutes a reasonable and proportionate search for personal data.

Amendment 109 would insert a new clause, to ensure that data controllers assess the risk of collective and societal harms,

“including to equality and the environment”,

when carrying out data protection impact assessments. It requires them to consult affected people and communities while carrying out these assessments to improve their quality, and requires data controllers to publish their assessments to facilitate informed decision-making by data subjects and to enable data controllers to be held accountable.

Turning to whether Clause 78 should stand part: on top of Clause 77, Clause 78 would reduce the scope of transparency obligations and rights. Many AI systems are designed in a way that makes it difficult to retrieve personal data once it has been ingested, or to understand how that data is being used. This is principally due not to technical limitations but to the decisions of AI developers, who do not prioritise transparency and explainability.

As regards Amendment 125, major changes are still proposed to the GDPR on police duties, automated decision-making and recognised legitimate interests, which makes the retention of data adequacy for the purposes of digital trade with the EU of the utmost priority in considering those changes. During the passage of the Data Protection and Digital Information Bill, I tabled an amendment to require the Government to publish an assessment of the impact of that Bill on EU/UK data adequacy within six months of its passing; I have tabled a similar amendment, with one change, to this Bill. As the next reassessment of data adequacy is set for June 2025, a six-month timescale may prove inconsequential to the overall adequacy decision. This amendment therefore stipulates that the assessment takes place before that reassessment.

The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Innovation and Technology (Baroness Jones of Whitchurch) (Lab)

My Lords, I thank all noble Lords for their consideration of these clauses. First, I will address Amendment 87 tabled by the noble and learned Lord, Lord Thomas, and the noble and learned Lord—sorry, the noble Lord—Lord Clement-Jones.

Lord Clement-Jones (LD)

I will take any compliment.

Baroness Jones of Whitchurch (Lab)

We should take them while we can. Like the noble Lord, Lord Clement-Jones, I agree that the noble and learned Lord, Lord Thomas, made an excellent contribution. I appreciate this is a particularly technical area of legislation, but I hope I can reassure both noble Lords that the UK’s data protection law gives effect to convention rights and is designed to protect them. The Human Rights Act requires legislation to be interpreted compatibly with convention rights, whether processing is carried out by public or private bodies. ECHR rights are therefore a pervasive aspect of the rules that apply to public and private controllers alike. The noble and learned Lord is right that individuals generally cannot bring claims against private bodies for breaches of convention rights, but I reassure him that they can bring a claim for breaching the data protection laws giving effect to those rights.

I turn to Amendment 91, tabled by the noble Viscount, Lord Camrose, Amendment 107, tabled by the noble Lord, Lord Clement-Jones, and the question of whether Clause 78 should stand part, which all relate to data subject requests. The Government believe that transparency and the right of access are crucial. That is why they will not support a change to the language around the threshold for data subject requests, as this would undermine data subjects’ rights. Neither will the Bill change the current expectations placed on controllers. The Bill reflects the EU principle of proportionality, which has always underpinned this legislation, as well as existing domestic case law and current ICO guidance. I hope that reassures noble Lords.

Amendments 97 and 99, tabled by the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, relate to the notification exemption in Article 14 of the UK GDPR. I reassure noble Lords that the proportionality test provides an important safeguard for the existing exemption when data is collected from sources other than the data subject. The controller must always consider the impact on data subjects’ rights of not notifying. They cannot rely on the disproportionate effort exemption just because of how much data they are processing—even when there are many data subjects involved, such as there would be with web scraping. Moreover, a lawful basis is required to reuse personal data: a web scraper would still need to pass the balancing test to use the legitimate interest ground, as is usually the case.

The ICO’s recent outcomes report, published on 12 December, specifically referenced the process of web scraping. The report outlined:

“Web scraping for generative AI training is a high-risk, invisible processing activity. Where insufficient transparency measures contribute to people being unable to exercise their rights, generative AI developers are likely to struggle to pass the balancing test”.

16:00
Amendment 109 from the noble Lord, Lord Clement-Jones, would amend requirements for data protection impact assessments. The noble Lord will know that I and the Government share his concerns about the measures in the previous Government’s Data Protection and Digital Information Bill. I am therefore glad that this Bill does not include them. The existing provisions in the UK GDPR already require data controllers to carry out a data protection impact assessment when the processing is likely to result in high risks to the rights and freedoms of individuals. This would include, for example, a risk that a processing activity may give rise to discrimination. The assessment must contain, among other things, a description of safeguards to ensure protection of personal data. However, the Government would prefer to avoid requiring organisations to comply with even more rigorous requirements, such as the need to consider environmental impacts.
On EU data adequacy, I turn to Amendment 125, tabled by the noble Lord, Lord Clement-Jones. I agree with noble Lords on the need to maintain data adequacy, which is a priority for this Government. The free flow of personal data with our EU partners is vital in underpinning research and innovation and keeping people safe. For that reason, the Government are doing all that we can to support its swift renewal. I reassure noble Lords that the Bill has been designed with EU adequacy in mind. The Government have incorporated robust safeguards and changed proposals that did not serve our priorities and were of concern to the EU. It is, though, for the EU to undertake its review of the UK, which we are entering into now. On that basis, I suggest to noble Lords that we should respect that process, exercise discretion and not interfere while it is under way.
I thank the noble Baroness, Lady Kidron, and the noble Lords, Lord Stevenson, Lord Clement-Jones and Lord Knight, for Amendments 109A, 139 and 140, concerning data communities. The Government firmly believe that giving data subjects greater agency over their personal data is important for strengthening data subject rights and for innovation and economic growth. Smart data schemes and digital verification services are good examples of such action arising from this Bill.
I reassure noble Lords that we continue to believe that this area should be further explored. The Government are in dialogue with businesses and innovators to develop collaborative, evidence-based interventions in this area. The UK GDPR does not prevent data subjects authorising third parties to exercise certain rights on their behalf. I am happy to update noble Lords on this in due course and invite the noble Baroness to meet to discuss this area further, if she would like to do so.
I turn to Amendments 154, 155 and 156, tabled by the noble Lord, Lord Clement-Jones, to the exemptions in Schedules 2 to 4 to the Data Protection Act 2018. Most of those exemptions have been in use since the Data Protection Act 1998. The noble Lord refers to the immigration exemption, which was amended following a court ruling specifically about that exemption. I reassure him that there is a power in the Data Protection Act to amend the other exemptions if necessary.
Given the above reassurances, I hope noble Lords will agree not to press their amendments in this group.
Lord Clement-Jones (LD)

The Minister said there is a power to amend, but she has not said whether she thinks that would be desirable. Is the power to be used only if we are found not to be data-adequate because the immigration exemption does not apply across the board? That is, will the power be used only if we are forced to use it?

Baroness Jones of Whitchurch (Lab)

I reassure the noble Lord that, as he knows, we are very hopeful that we will have data adequacy so that issue will not arise. I will write to him to set out in more detail when those powers would be used.

Baroness Kidron (CB)

I thank the Minister for her offer of a meeting. I could tell from the nods of my co-signatories that that would indeed be very welcome and we would all like to come. I was interested in the quote from the ICO about scraping. I doubt the Minister has it to hand, but perhaps she could write to say what volume of enforcement action has been taken by the ICO on behalf of data rights holders against scraping on that basis.

Baroness Jones of Whitchurch (Lab)

Yes, it would be helpful if we could write and set that out in more detail. Obviously the ICO’s report is fairly recent, but I am sure the commissioner has considered how enforcement would follow on from it, and we can write and give more details.

Lord Thomas of Cwmgiedd (CB)

My Lords, I thank the Minister for her response. I wish to make three points. First, the critical question is: are our laws adequate to pass the adequacy test? Normally, when you go in for a legal test, you check that your own house is in order. I am therefore slightly disappointed by the response to Amendment 125. Normally one has the full-scale medical first, rather than waiting to be found ill afterwards.

Secondly, I listened to what the Minister said about my Amendment 87 and the difference between the rights protected by the charter and the much greater limitation of the ECHR, which normally has simply to do with the extent to which rights apply horizontally as between private individuals. I will look at her answer, but at first sight it does not seem right to me that, where you have fundamental rights, you move to a second stage of rights—namely, the rights under the Data Protection Act.

Thirdly, I want to comment on the whole concept of data communities and data trusts. This is an important area, and it takes me back to what I said last time: this legislation really needs reducing to principles. I am going to throw out a challenge to the very learned people behind the Minister, particularly the lawyers: can they come up with something intelligible to the people who are going to have to apply it?

This legislation is ghastly; I am sorry to say that, but it is. It imposes huge costs on SMEs—not to say on others, but they can probably afford it—and if you are going to get trust from people, you have to explain things in simple principles. My challenge to those behind the Minister is: can they draft a Clause 1 of the Bill to say, “The principles that underpin the Bill are as follows, and the courts are to interpret it in accordance with those principles”? That is my challenge—a challenge, as the noble Baroness, Lady Kidron, points out, to be ambitious and not to sit in a tepid bath. I beg leave to withdraw the amendment.

Amendment 87 withdrawn.
Amendments 88 and 89 not moved.
Clause 73 agreed.
Clause 74: Processing of special categories of personal data
Amendment 90 not moved.
Clause 74 agreed.
Clause 75: Fees and reasons for responses to data subjects’ requests about law enforcement processing
Amendment 91 not moved.
Clause 75 agreed.
Clause 76 agreed.
Clause 77: Information to be provided to data subjects
Amendment 92
Moved by
92: Clause 77, page 91, line 5, leave out “the number of data subjects,”
Member’s explanatory statement
This amendment reduces the likelihood of misuse of Clause 77 by AI model developers, who may otherwise seek to claim they do not need to notify data subjects of reuse for scientific purposes under Clause 77 because of the way that personal data is typically collected and processed for AI development, for example by scraping large amounts of personal data from the internet.
Viscount Colville of Culross (CB)

My Lords, I have tabled Amendments 92, 93, 101 and 105, and I thank the noble Lord, Lord Clement-Jones, for adding his name to them. I also support Amendment 137 in the name of my noble friend Lady Kidron.

Clause 77 grants an exemption to the Article 13 and 14 rights of data subjects to be told within a set timeframe that their data will be reused for scientific research, if it would be impossible or involve disproportionate effort to do so. These amendments complement those I proposed to Clause 67. They aim to ensure that “scientific research” is limited in its definition and that large language AI developers cannot claim both that they are doing scientific research and that the GDPR requirements make it too much effort to contact data subjects in order to reuse their data.

It costs AI developers time and money to identify data subjects, so this exemption is obviously very valuable to them and they will use it if possible. They will claim that processing and notifying data subjects from such a huge collection of data is a disproportionate effort, as it is hard to extract the identity of data subjects from the original AI model.

Reusing data to train a large language model could involve up to 5 million data subjects. However, the ICO requires data controllers to inform subjects that their data could be reused, even if that involves contacting all 5 million of them. The criteria set out in proposed new subsection (6) in Clause 77 play straight into the hands of ruthless AI companies that want to take advantage of this exemption.

Amendments 92 and 101 would ensure that the disproportionate effort excuse cannot be deployed on the basis of the number of data subjects involved. Amendments 93 and 105 would clarify the practices and facts that would not qualify for the disproportionate effort exemption—namely,

“the fact the personal data was not collected from the data subject, or any processing undertaken by the controller that makes the effort involved greater”.

Without this wording, the Bill will mean that a data controller wanting to reuse data for training another large language model could process the personal data on the original model and then reuse it without asking permission from the original subjects. The AI developer could say, “I don’t have the original details of the data subject, as they were deleted when the original model was trained. There was no identification of the original data subjects; only the data weights”. I fear that many companies will use this excuse to get around GDPR notification expectations.

Noble Lords should recognise that these provisions affect only AI developers seeking to reuse data under the scientific research provisions. These will mainly be the very large AI developers, which tend to use scraped data to train their general-purpose models. Controllers will still be able to use personal data to train AI systems when they have lawful grounds to do so—either they have the consent of the data subject or there is a legitimate interest—but I want to make it clear that these provisions will not inhibit the legitimate training of AI models.

These amendments would ensure that organisations, especially large language AI developers, are not able to reuse data at scale, in contradiction to the expectations and intentions of data subjects. Failure to get this right will risk setting off a public backlash against the use of personal data for AI use, which would impede this Government’s aims of making this country an AI superpower. I beg to move.

16:15
Baroness Kidron (CB)

My Lords, in speaking to Amendment 137 in my name I thank the noble Baroness, Lady Harding, the noble Lord, Lord Stevenson, and my noble friend Lord Russell for their support. I also add my enthusiastic support to the amendments in the name of my noble friend Lord Colville.

This is the same amendment that I laid to the DPDI Bill, which at the time had the support of the Labour Party. I will not labour that point, but it is consistently disappointing that these things have gone into the “too difficult” box.

Amendment 137 would introduce a code of practice on children and AI. AI drives the recommender systems that determine all aspects of a child’s digital experience, including the videos they watch, their learning opportunities, the people they follow and the products they buy—and, as reported last weekend, AI is even helping farmers pick the ripest tomatoes for baked beans. But it no longer concerns simply the elective parts of life where, arguably, a child or a parent on their behalf can choose to avoid certain products and services. AI is invisibly and ubiquitously present in all areas of their lives, and its advances and impact are particularly evident in the education and health sectors, the first of which is compulsory for children and the second of which is necessary for all of us.

The amendment has three parts. The first requires the ICO to create a code and sets out the expectations of its scope; the second considers who and what should be consulted and considered, including experts, children, and the frameworks that codify children’s existing rights; and the third part defines elements of the process, including risk assessment definitions, and sets out the principles to which the code must adhere.

When we debated this before, I anticipated that the Minister would say that the ICO had already published guidance, that we do not want to exclude children from the benefits of AI, and that we must not get in the way of innovation. Given that the new Government have taken so many cues from the previous one, I am afraid I anticipate a similar response.

I first point out, therefore, that the ICO’s non-binding guidance on AI and data protection is insufficient. It has only a single mention of a child in its 140 pages, which is a case study about child benefits. In the hundreds of pages of guidance, toolkits and sector information, nowhere are the specific needs and rights, or development vulnerabilities, of children comprehensively addressed in relation to AI. This absence of children is also mirrored in government publications on AI. Of course, we all want children to enjoy the benefits of AI, but consideration of their needs would increase the likelihood of those benefits. Moreover, it seems reckless and unprincipled not to protect them from known harms. Surely the last three decades of tech development have shown us that the experiment of a “build first, worry about the kids later—or never” approach has cost our children dearly.

Innovation is welcome but not all innovation is equal. We have bots offering 13-year-olds advice on how to seduce grown men, or encouraging them to take their own lives; edtech products that profile children towards unfair and biased outcomes that limit their education and life chances; and gen AI that perpetuates negative, racist, misogynist and homophobic stereotypes. Earlier this month, the Guardian reported a deep bias in the AI used by the Department for Work and Pensions. This “hurt first, fix later” approach creates a lack of trust, increases unfairness and has real-world consequences. Is it too much to insist that we ask better questions of systems that may result in children going hungry?

Why children? I am saddened that I must explain this, but from our deeply upsetting debate last week on the child protection amendments, in which the Government asserted that children are already catered for while deliberately downgrading their protections, it seems that the Government or their advisers have forgotten.

Children are different for three reasons. First, as has been established over decades, children are on a development journey. There are ages and stages at which children are developmentally able to do certain things, such as walk, talk, understand risk and irony and learn different social skills. There are equally ages and stages at which they cannot do those things. The long-established consensus is that families, social groups and society more broadly, including government, step in to support them on this journey. Secondly, children have less voice and less choice about how and where they spend their time, so the places and spaces they inhabit have to be designed to be fit for childhood. Thirdly, we have a responsibility towards children that extends even beyond our responsibility to each other. This means that we cannot legitimise profit at their expense. Allowing systems to play in the wild in the name of growth and innovation, leaving kids to pay the price, is a low bar.

It is worth noting that since we debated it, a proposal for this AI code for children that follows the full life cycle of development, deployment, use and retirement of AI systems has been drafted and has the support of multiple expert organisations and individuals around the globe. I am sure that all nations and intergovernmental organisations will have additional inputs and requirements, but it is worth saying that the proposed code, which was written with input from academics, computer scientists, lawyers, engineers and children’s rights activists, is mindful of and compatible with the EU AI Act, the White House Blueprint for an AI Bill of Rights, the Executive Order on the Safe, Secure and Trustworthy Development and Use of Artificial Intelligence, the Council of Europe’s Framework Convention on Artificial Intelligence and, of course, the UNCRC general comment no. 25.

This proposal will be launched early next year as an indication of what could and should be done. Unless the Government find their compass vis-à-vis children and tech, I suspect that another jurisdiction will adopt it ahead of the UK, making that the go-to destination for trusted tech development for child-safe products. It is perhaps worth reminding the Committee that one in three connected people is under 18, which is roughly 1 billion children. As the demographics change, the proportion and number of children will rise. It is a huge financial market.

Before I sit down, I shall briefly talk about the AADC, because sometimes Ministers say that we already have a children’s code. The age-appropriate design code covers only information society services, which automatically limits it, and even the ICO by now agrees that its enforcement record is neither extensive nor impressive. It does not clearly cover the urgent area of edtech, which is the subject of another amendment, and, most pertinently to this amendment, it addresses only AI profiling, which means that it is limited in how it can look at the new and emerging challenges of generative AI. A revamp of the AADC to tackle the barriers to enforcement, account for technological advances, cover all products and services likely to be accessed by children and make our data regime AI-sensitive would be welcome, but rather than calling for a strengthening of the AADC, the ICO agreed to the downgrading of children’s data protection in the DPDI Bill and, again, has agreed to the downgrading of protections in the current Bill on ADM, scientific research, onward processing and so on. A stand-alone code for AI development is required because in this way we could be sure that children are in the minds of developers at the outset.

It is disappointing that the UK is failing to claim its place as the centre of regulated and trusted innovation. Although we are promised an AI Bill, the Government repeatedly talk of large frontier companies. AI is in every part of a child’s life from the news they read to the prices they pay for travel and goods. It is clear from previous groups that many colleagues feel that a data Bill with no AI provisions is dangerous commercially and for the communities of the UK. An AI Bill with no consideration of the daily impact on children may be a very poor next choice. Will the Minister say why a Labour Government are willing to abandon children to technology rather than building technology that anticipates children’s rights and needs?

Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to follow my friend the noble Baroness, Lady Kidron, and to give full-throated support to my friend the noble Viscount, Lord Colville, on all his amendments. Given that the noble Baroness mentioned it and that another week has passed since we asked the Minister the question, will we see an AI Bill or a consultation before Santa comes or at some stage in the new year? I support all the amendments in this group and in doing so, as it is the first time I have spoken today in Committee, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business.

I will speak particularly to my Amendment 211A. I have put down “image, likeness and personality” not because I believe they are the most important rights being transgressed, or the most important rights we should consider; I have put them down to give them specific focus because, right now, they are being largely cut across and ignored, so that all of our creatives find their works, and also their image, likeness and personality, disappearing into these large foundation AI models with no potential for redress.

Once parts of you such as your name, face or voice have been ingested, as the noble Lord, Lord Clement-Jones, said in the previous group, it is difficult then to have them extracted from the model. There is no sense, for example, of seeking an equitable remedy to put one back in the position one would have been in had the breach not occurred. It is almost “once in, forever in”; works then start to be created based on those factors, features and likenesses, which compete directly with the creatives. This is already particularly prevalent in the music industry.

What plans do the Government have in terms of personality rights, image and likeness? Are they content with the current situation where there is no protection for our great creatives, not least in the music industry? What does the Bill do for our creatives? I go back to the point made by the noble Baroness, Lady Kidron. How can we have all these debates on a data Bill which is silent when it comes to AI, and a product regulation Bill where AI is specifically excluded, and yet have no AI Bill on the near horizon—unless the Minister can give us some up-to-date information this afternoon? I look forward to hearing from her.

Baroness Harding of Winscombe (Con)

My Lords, I should first apologise for not being able to attend Second Reading or, arguably more importantly, to be in Committee last week to support the many amendments of the noble Baroness, Lady Kidron, on child protection. I read Hansard carefully and was deeply depressed to see that we were once again needing to rehearse, as she has done again today, the importance of protecting children in the digital era. It seems to be our lot that there is a group of us who keep coming back. We play the merry-go-round and sit in different places; it is a privilege to sit next to the noble Baroness, Lady Kidron, for the first time in the decade that I have been in the House. I support her Amendment 137. She has given a good exposé as to why we should think really carefully about how we protect children in this AI world. I would just like to add one point about AI itself.

We keep being told—in a good way—that AI is an underlying and general-purpose technology. That means we need to properly establish the principles with which we should protect children there. We know that technology is morally neutral; it is the human beings who do the damage. With every other underlying breakthrough technology, we have learned that we needed to protect the most vulnerable, whether it was electricity when it first went into factories, toys when they were first distributed on the mass market, or social media, with the age-appropriate design code. I feel that it would be a huge mistake, on the third Bill on which many of us have debated this subject matter, for us not to address the fact that, as of today, this is the biggest breakthrough technology of our lifetime. We should recognise that children will need protecting, as well as having the opportunity to benefit from it.

16:30
Lord Kirkhope of Harrogate (Con)

My Lords, I was not going to rise at all for the moment because there are other amendments coming later that are of interest. I declare my rather unusual interest: I was one of the architects of the GDPR in Brussels.

I rise to support Amendment 211A in the name of my noble friend Lord Holmes because here we are referring to AI. I know that other remarks have now been passed on this matter, which we will come to later, but it seems to me—this has come straight into my mind—that, when the preparation of the data legislation and the GDPR was being undertaken, we really did fail at that stage to accommodate the vast and important areas that AI brings to the party, as it were. We will fail again, I suspect, if we are not careful, in this piece of legislation. AI is with us now and moving at an enormous pace—faster than any legislator can ever manage to keep up with in order to control it and to make sure that there are sufficient protections in place against both the misuse of this technology and the way it may develop. So I support this amendment, particularly in relation to the trading or use of likenesses and the algorithmic effects that come about.

We will deal with that matter later, but I hope that the Minister will touch on this, particularly having heard the remarks of my noble friend Lord Holmes—and, indeed, the remarks of my noble friend Lady Harding a moment ago—because AI is missing. It was missing in the GDPR to a large extent. It is in the European Union’s new approach and its regulations on AI, but the EU has already shown that it has enormous difficulties in trying to offer, at one stage, control as well as redress and the proper involvement of human beings and individual citizens.

Lord Russell of Liverpool (CB)

My Lords, I rise briefly to support my noble friend Lady Kidron on Amendment 137. The final comments from the noble and learned Lord, Lord Thomas, in our debate on the previous group were very apposite. We are dealing with a rapidly evolving and complex landscape, which AI is driving at warp speed. Given the panoply of different responsibilities and the level of detail that the different regulators are being asked to cover, it seems absolutely fundamental that there is, on the face of what they have to do with children, absolute clarity: a code of practice, a code of conduct, a description of the types of outcomes that will be acceptable and a description of the types of outcomes that will be not only unacceptable but illegal. The clearer that is in the Bill, the more it will do to future-proof the direction in which regulators will have to travel. If we are clear about what the outcomes need to be in terms of the welfare, well-being and mental health of children, that will give us some guidelines to work within as the world evolves so quickly.

Lord Stevenson of Balmacara (Lab)

My Lords, I have co-signed Amendment 137. I do not need to repeat the arguments that have already been made by those who have spoken before me on it; they were well made, as usual. Again, it seems to expose a gap in where the Government are coming from in this area of activity, which should be at the forefront of all that they do but does not appear to be so.

As has just been said, this may be as simple as putting in an initial clause right up at the front of the Bill. Of course, that reminds me of the battle royal we had with the then Online Safety Bill in trying to get up front anything that made more sense of the Bill. It was another beast that was difficult to ingest, let alone understand, when we came to make amendments and bring forward discussions about it.

My frustration is that we are again talking about stuff that should have been well inside the thinking of those responsible for drafting the Bill. I do not understand why a lot of what has been said today has not already appeared in the planning for the Bill, and I do not think we will get very far by sending amendments back and forth that say the same thing again and again: we will only get the response that this is all dealt with and we should not be so trivial about it. Could we please have a meeting where we get around the table and try to hammer out exactly what it is that we see as deficient in the Bill, to set out very clearly for Ministers where we have red lines—that will make it very easy for them to understand whether they are going to meet them or not—and do it quickly?

Lord Clement-Jones (LD)

My Lords, the debate on this group emphasises how far behind the curve we are, whether it is by including new provisions in this Bill or by bringing forward an AI Bill—which, after all, was promised in the Government’s manifesto. It emphasises that we are not moving nearly fast enough in thinking about the implications of AI. While we are doing so, I need to declare an interest as co-chair of the All-Party Parliamentary Group on AI and a consultant to DLA Piper on AI policy and regulation.

I have followed the progress of AI since 2016 in the capacity of co-chair of the all-party group and chair of the AI Select Committee. We need to move much faster on a whole range of different issues. I very much hope that the noble Lord, Lord Vallance, will be here on Wednesday, when we discuss our crawler amendments, because although the noble Lord, Lord Holmes, has tabled Amendment 211A, which deals with personality rights, there is also extreme concern about the whole area of copyright. I was tipped off by the noble Lord, Lord Stevenson, so I was slightly surprised that he did not draw our attention to it: we are clearly due the consultation on intellectual property at any moment, but there seems to be some proposal within it for personality rights themselves. Whether that is a quid pro quo for a much-weakened situation on text and data mining, I do not know, but something appears to be moving out there which may become clear later this week. It seems a strange time to issue a consultation, but I recognise that it has been somewhat delayed.

In the meantime, we are forced to put forward amendments to this Bill trying to anticipate some of the issues that artificial intelligence is increasingly giving rise to. I strongly support Amendments 92, 93, 101 and 105 put forward by the noble Viscount, Lord Colville, to prevent misuse of Clause 77 by generative AI developers; I very much support the noble Lord, Lord Holmes, in wanting to see protection for image, likeness and personality; and I very much hope that we will get a positive response from the Minister in that respect.

We have heard from the noble Baronesses, Lady Kidron and Lady Harding, and the noble Lords, Lord Russell and Lord Stevenson, all of whom have made powerful speeches on previous Bills—the then Online Safety Bill and the Data Protection and Digital Information Bill—to say that children should have special protection in data protection law. As the noble Baroness, Lady Kidron, says, we need to move on from the AADC. That was a triumph she gained during the passage of the Data Protection Act 2018, but six years later the world looks very different and young people need protection from AI models of the kind she has set out in Amendment 137. I agree with the noble Lord, Lord Stevenson, that we need to talk these things through. If it produces an amendment to this Bill that is agreed, all well and good, but it could mean an amendment or part of a new AI Bill when that comes forward. Either way, we need to think constructively in this area because protection of children in the face of generative AI models, in particular, is extremely important.

This group, which looks ahead to further harms that could be caused by AI and at how we can mitigate them in a number of different ways, is extremely important, despite the fact that these amendments appear to deal with quite a disparate set of issues.

Viscount Camrose (Con)

My Lords, I too thank all noble Lords for their insightful contributions to this important group of amendments, even if some of them bemoaned the fact that they have had to repeat themselves over the course of several Bills. I am also very heartened to see how many people have joined us for Committee today. I have been involved in only two of these sittings, but this is certainly a record, and on present trends it is going to be standing room only, which is all to the good.

I have two observations before I start. First, we have to acknowledge that this area is perhaps among the most important we are going to discuss. The rights and protections of data subjects, particularly children, are in many ways the crux of all this and we have to get it right. Secondly, I absolutely take on board that there is a real appetite to get ahead with something on AI legislation. I have an amendment that I am very excited about later, when we come to ADM in particular, and there will be others as well, but I absolutely take on board that we need to get going on that.

Amendment 92 in the names of the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, seeks to reduce the likelihood of the misuse of Clause 77 by AI model developers who may seek to claim that they do not need to notify data subjects of reuse for scientific purposes under that clause. This relates to the way that personal data is typically collected and processed for AI development. Amendment 93 similarly seeks to reduce the possibility of misuse of Clause 77 by model developers who could claim they do not need to notify data subjects of reuse for scientific purposes. Amendments 101 and 105 also aim to address the potential misuse of Clause 77 by those developers. I strongly support the intent of the amendments from the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, in seeking to maintain and make provision for the rights and protections of data subjects, and look forward very much to hearing the views of the Minister.

I turn to Amendment 137 in the names of the noble Lords, Lord Russell and Lord Stevenson, and the noble Baronesses, Lady Kidron and Lady Harding. This amendment would require the commissioner to prepare and produce a code of practice which ensures that data processors prioritise the interests, rights and freedoms of children. It goes without saying that the rights and protection of children are of utmost importance. Certainly, this amendment looks to me not only practical but proportionate, and I support it.

Finally, Amendment 211A in the name of my noble friend Lord Holmes ensures the prohibition of

“the development, deployment, marketing and sale of data related to an individual’s image, likeness or personality for AI training”

without that person’s consent. Like the other amendments in this group, this makes provision to strengthen the rights and protections of data subjects against the potential misuse or sale of data and seems entirely sensible. I am sure the Minister has listened carefully to all the concerns powerfully raised from all sides of the Committee today. It is so important that we do not lose sight of the importance of the rights and protection of data subjects.

16:45
Baroness Jones of Whitchurch (Lab)

My Lords, I thank the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, for their amendments and consideration of this policy area. I hope noble Lords will bear with me if I save some of the points I shall make on web crawling and intellectual property for the later group, which is specifically on that topic.

Amendments 92 and 93 from the noble Viscount are about the new disproportionate effort exemption in Article 13. I can reassure noble Lords that this exemption applies only when data is collected directly from the data subject, so it cannot be used for web crawling, which is, if you like, a secondary activity. I think that answers that concern.

Amendments 101 and 105, also from the noble Viscount, are about the changes to the existing exemption in Article 14, where data is collected from other sources. Noble Lords debated this issue in the previous group, where Amendments 97 and 99 sought to remove this exemption. The reassurances I provided to noble Lords in that debate about the proportionality test being a case-by-case exercise also apply here. Disproportionate effort cannot be used as an excuse; developers must consider the rights of the data subject on each occasion.

I also draw noble Lords’ attention to another quote from the ICO itself, made when publishing its recent outcomes report. I know I have already said that I will share more information on this. It says:

“Generative AI developers, it’s time to tell people how you’re using their information”.


The ICO is on the case on this issue, and is pursuing it.

On Amendment 137 from the noble Baronesses, Lady Kidron and Lady Harding, and other noble Lords, I fully recognise the importance of organisations receiving clear guidance from regulators, especially on complex and technical issues. AI is one such issue. I know that noble Lords are particularly conscious of how it might affect children, and I am hearing the messages about that today.

As the noble Baroness will know, the Secretary of State already has the power to request statutory codes such as this from the regulator. The existing power will allow us to ensure the correct scope of any future codes, working closely with the ICO and stakeholders, including noble Lords here today, whom I am happy to meet to discuss this further. The Government are, naturally, open to evidence about whether new statutory codes should be provided for by regulations in future. Although I appreciate the signal this can send, at the moment I do not believe that a requirement for codes on this issue is needed in this legislation. I hope noble Lords are reassured that the Government are taking this issue seriously.

Amendment 211A from the noble Lord, Lord Holmes, is about prohibiting the processing of people’s names, facial images, voices or any physical characteristics for AI training without their consent. Facial images and other physical characteristics that can be used to identify a person are already protected by the data protection legislation. An AI developer processing such data would have to identify a lawful ground for this. Consent is not the only option available, but I can reassure the noble Lord that there are firm safeguards in place for all the lawful grounds. These include, among many other things, making sure that the processing is fair and transparent. Noble Lords will know that even more stringent conditions apply to the special categories of data, such as data concerning race or sexual orientation and any biometric data that can be used to identify someone.

Noble Lords tried to tempt me once again on the timetable for the AI legislation. I said as much as I could on that when we debated this in the last session, so I cannot add any more at this stage.

I hope that reassures noble Lords that the Bill has strong protections in place to ensure responsible data use and reuse, and, as such, that they feel content not to press their amendments.

Baroness Kidron (CB)

I understand the point that the Secretary of State has the power, but does he have the intention? We are seeking an instruction to the ICO to do exactly this thing. An indication of the Secretary of State’s intention to activate such a code would be an excellent compromise all round, and seeing that in the Bill is the point here.

Baroness Jones of Whitchurch (Lab)

Discussions with the ICO are taking place at the moment about the scope and intention of a number of issues around AI, and this issue would be included in that. However, I cannot say at the moment that that intention is specifically spelled out in the way that the noble Baroness is asking.

Viscount Colville of Culross (CB)

This has been a wide-ranging debate, with important contributions from across the Committee. I take some comfort from the Minister’s declaration that the exemptions will not be used for web crawling, but I want to make sure that they are not used at the expense of the privacy and control of personal data belonging to the people of Britain.

That seems particularly so for Amendment 137 in the name of the noble Baroness, Lady Kidron. I was particularly taken by her pointing out that children’s data privacy had not been taken into account when it came to AI, reinforced by the noble Baroness, Lady Harding, telling us about the importance of the Bill. She said it was paramount to protect children in the digital age and reminded us that this is the biggest breakthrough of our lifetime and that children need protecting from it. I hope very much that there will be some successful meetings, and maybe a government amendment on Report, responding to these passionate and heartfelt demands. I sincerely hope the Minister will meet us all and other noble Lords to discuss these matters of data privacy further. On that basis, I beg leave to withdraw my amendment.

Amendment 92 withdrawn.
Amendments 93 and 94 not moved.
Amendment 95
Moved by
95: Clause 77, page 91, line 16, leave out “to the extent that” and insert “when any one or more of the following is true”
Member’s explanatory statement
This amendment would clarify that only one condition under paragraph 5 must be present for paragraphs 1 to 4 to not apply.
Baroness Harding of Winscombe (Con)

My Lords, I was in such a hurry to apologise just now for missing Second Reading that I forgot to declare my interests and remind the Committee of my technology and, with regard to this group, charitable interests as set out in the register.

I shall speak to Amendments 95, 96, 98, 101, 102 and 104 in my name and those of the noble Lords, Lord Clement-Jones and Lord Stevenson of Balmacara, and my noble friend Lord Black of Brentwood, and Amendments 103 and 106 in my name and those of the noble Lords, Lord Clement-Jones and Lord Stevenson. I also support Amendment 162 in the name of the noble Lord, Lord Clement-Jones. I will speak only on the marketing amendments in my name and leave the noble Lord, Lord Clement-Jones, to do, I am sure, great justice to the charitable soft opt-in.

These amendments are nothing like as philosophical and emotive as the last amendment on children and AI. They aim to address a practical issue that we debated in the late spring on the Data Protection and Digital Information Bill. I will not rehearse the arguments that we made, not least because the Minister was the co-signatory of those amendments, so I know she is well versed in them.

Instead, I shall update the Committee on what has happened since then and draw noble Lords’ attention to a couple of the issues that are very real and present now. It is strange that all Governments seem reluctant to restrict the new technology companies’ use of our data but extremely keen to get into the micro detail of restricting older forms of data use that we have all got quite used to.

That is very much the case for the open electoral register. Some 63% of people opt out of being marketed to, having indicated as much on the electoral register. This is a well-known and well-understood use of personal data. Yet, because of the tribunal ruling, it is increasingly the case that companies cannot use the open electoral register to target the 37% of people who have said that they are quite happy to receive marketing, unless the company lets every single one of those users know that it is about to market to them. The danger is that we create a new cookie problem—a physical cookie problem—where, if you want to use a data source that has been commonplace for 40 years, you have to send some marketing to tell people that you are about to use it. That of course means that you will not do so, which means that you reduce the data available to a lot of small and medium-sized businesses to market their products and hand it straight to the very big tech companies, which are really happy to scrape our data all over the place.

This is a strange one, where I find myself arguing that something that is not broken does not need to be fixed. I appreciate that the Minister will probably tell us that the wording in these amendments is not appropriate. As I said earlier in the year—in April, in the Bill’s previous incarnation—I very much hope that, if the wording is incorrect, we can have a discussion between Committee and Report and agree on wording that achieves what seems just practical common sense.

The tribunal ruling that created this problem recognised as much: it accepted that the loophole it created would give one company, Experian, a sizeable competitive advantage. It is a slightly perverse advantage: Experian has to let only 5 million people know that it might be about to use the open electoral register, while its competitors have to let 22 million people know. That just does not pass the common-sense test for the practical use of data. Given the prior support that the Minister has shown on this issue, I very much hope that we can resolve it between Committee and Report. I beg to move.

Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, I have a couple of amendments in this group, Amendments 158 and 161. Amendment 158 is largely self-evident; it tries to make sure that, where there is a legal requirement to communicate, that communication is not obstructed by the Bill. I would say much the same of Amendment 161: again, it is obvious that there ought to be easy communication where a person’s pension is concerned, and the Bill should not obstruct it. I am not saying that these are the only ways to achieve these things, but they should be achieved.

I declare an interest on Amendment 160, in that I control the website of the Good Schools Guide, which has advertising on it. The function of advertising on the web is to enable people to see things for free; it is why the web does not close down into a subscription-only service. People who put advertisements on the web want to know that those advertisements are effective and have been seen, and to have some information about who has seen them. I moved a similar amendment to the previous Government’s Bill and encountered some difficulty. If the Government are of the same mind—that this requires us to be careful—I would very much welcome the opportunity of a meeting between now and Report, and I imagine others would too, to try to understand how best to make sure that advertising can flourish on the internet.

17:00
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I welcome the amendments spoken to so well by the noble Baroness, Lady Harding, regarding the open electoral register. They are intended to provide legal certainty around the use of the register, without compromising any aspect of the data privacy of UK citizens or risking data adequacy. The amendments specify that companies are exempt from the requirement to provide individuals with information, in cases where their personal data has not been obtained directly from them, if that data was obtained from the open electoral register. They also provide further clarification of what constitutes “disproportionate effort” under new paragraph 5(e) of Article 14 of the UK GDPR.

The noble Baroness covered the ground so effectively that all I need to add is that the precedent established by the tribunal’s current interpretation will affect not only the open electoral register but other public sources of data, including the register of companies, the Registry of Judgments, Orders and Fines, the Land Registry and the Food Standards Agency register. Importantly, it may even prevent the important work being done to create a national data library from achieving its objectives of public sector data sharing. It will have far-reaching implications if we do not change the Bill in the way that the noble Baroness has put forward.

I thank the noble Lord, Lord Lucas, for his support for Amendment 160. I reciprocate in supporting—or, at least, hoping that we get clarification as a result of—his Amendments 158 and 161.

Amendment 159 seeks to ban what are colloquially known as cookie paywalls. As can be seen, it is the diametric opposite of Amendment 159A, tabled by the noble Viscount, Lord Camrose. For some unaccountable reason, cookie paywalls require a person who accesses a website or app to pay a fee in order to refuse consent to cookies being stored on, or accessed from, their device. Some of these sums can be extortionate, so I was rather surprised by the noble Viscount’s counter-amendment.

Earlier this year, the Information Commissioner launched a call for views on its regulatory approach to “consent or pay” models under data protection law. The call for views highlighted that organisations that are looking to adopt, or have already adopted, such a model must consider the data protection implications.

Cookie paywalls are a scam and reduce people’s power to control their data. Why must someone pay simply because they do not consent to cookies being stored or accessed? The PEC regulations do not currently prohibit cookie paywalls. The relevant regulation is Regulation 6, which is due to be substituted by Clause 111 and supplemented by new Schedule A1 to the PEC regulations, as inserted by Schedule 12 to the Bill. Nor does the regulation, as substituted by Clause 111 and Schedule 12, prohibit them. This comes down to the detail of the regulations, both as they currently stand and as they will be if the Bill remains as drafted. The regulation is drafted in terms that do not prevent a person signifying a lack of consent to cookies, while a provider may add or set controls—namely, by imposing requirements—on how a person may signify that lack of consent. Cookie paywalls would therefore remain completely legal, and they have certainly proliferated online.

This amendment makes it crystal clear that a provider must not require a person to pay a fee to signify lack of consent to their data being stored or accessed. This would mean that, in effect, cookie paywalls would be banned.

Amendment 160 is sought by the Advertising Association. It seeks to ensure that the technical storage of, or access to, information is considered necessary under paragraph 5 of the new Schedule A1 to the PEC regulations, inserted by Schedule 12, where it would support measurement or verification of the performance of advertising services, allowing website owners to charge for their advertising services more accurately. The Bill makes practical amendments to the PEC regulations by listing the types of cookies that no longer require consent.

This is important, as not all cookies should be treated the same; they do not all carry the same level of risk to personal privacy. Some are integral to the service and the website itself, and are extremely important for the subscription-free content offered by publishers, which is principally funded by advertising. Introducing specific and targeted cookie exemptions has the benefit of, first, simplifying the cookie consent banner and, secondly, further increasing legal and economic certainty for online publishers. As I said when we debated the DPDI Bill, audience measurement is an important function for media owners in determining the consumption of content, so that they can price advertising space for advertisers. Such metrics are crucial to assessing the effectiveness of a media channel. For sites that carry advertising, cookies are used to verify the delivery and performance of a digital advertisement—that is, to confirm that an ad has been served or presented to a user and whether it has been clicked on. This information is essential for invoicing an advertiser accurately for the number of ad impressions in a digital ad campaign.

However, my reading of the Bill suggests that audience measurement cookies would in any event be covered by the list of exemptions from consent under Schedule 12. Can the Government confirm this? Is it the Government’s intention to use secondary legislation in future to exempt ad performance cookies?

Coming to Amendment 162, relating to the soft opt-in, I am grateful to the noble Lord, Lord Black of Brentwood, and the noble Baroness, Lady Harding of Winscombe, for their support. This amendment would enable charities to communicate with donors in the same way that businesses have been able to communicate with customers since 2003. It would help to facilitate greater fundraising and support the important work that charities do for society. I can do no better than quote from the letter sent to the Secretary of State, Peter Kyle, on 25 November, which was co-ordinated by the DMA and involved nearly 20 major charities, seeking support for reinstating the original Clause 115 of the DPDI Bill in this Bill:

“Clause 115 of the previous DPDI Bill extended the ‘soft opt-in’ for email marketing for charities and non-commercial organisations. The DMA estimates that extending the soft opt-in to charities would increase annual donations in the UK by £290 million”,


based on analysis of 13.1 million donors by the Salocin Group. The letter continues:

“At present, the DUA Bill proposals remove this. The omission of the soft opt-in will prevent charities from being able to communicate to donors in the same way as businesses can. As representatives of both corporate entities and charitable organisations, it is unclear to the DMA why charities should be at a disadvantage in this regard”.


I hope that the Government will listen to the DMA and the charities involved.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank noble Lords for their comments and contributions. I shall jump to Amendments 159 and 159A, one of which is in my name and both of which are concerned with cookie paywalls. I am not sure that I have properly understood the objection to cookie paywalls. Do they not simply offer users three choices: pay money and stay private; share personal data and read for free; or walk away? So many times we have all complained that these websites harvest our data; now, for the first time, this approach sets a clear cash value on the data being harvested and offers us the choice. The other day somebody sent me a link from the Sun. I had those choices. I did not want to pay the money or share my data, so I did not read the article. This is a personal decision, supported by clear data, which it is up to the individual to take, not the Government. I do not think we should take away this choice.

Let me turn to some of the other amendments in this group. Amendment 161 in the name of my noble friend Lord Lucas is, if I may say so, a thoughtful amendment. It would allow pension providers to communicate information on their product, which may mean that the person who will benefit from that pension does not miss out on useful information that would help their saving for retirement. Given that pension providers already hold the saver’s personal data, it seems to be merely a question of whether this information is wanted; of course, if it is not, the saver can simply opt out.

Amendment 162 makes an important point: many charities rely on donations from the public. Perhaps we should consider bringing down the barriers to contacting people regarding fundraising activities. At the very least, I am personally not convinced that members of the public have different expectations around what kinds of organisation can and cannot contact them and in what circumstances, so I support any step that simplifies the—to my mind—rather arbitrary differences in the treatment of business and charity communications.

Amendment 104 certainly seems a reasonable addition to the list of what might constitute “disproportionate effort” if the information is already public. However, I have some concerns about Amendments 98 and 100 to 103. On Amendment 98, who would judge the impact on the individual? I suspect that the individual and the data controllers may have different opinions on this. On Amendment 100, the effort and cost of compliance are thorny issues that would surely be dictated by the nature of the data itself and the reason for providing it to data subjects. In short, I am concerned that the controllers’ view may be more subjective than we would want.

On Amendment 102, again, when it comes to providing information to them,

“the damage and distress to the data subjects”

is a phrase on which the subject and the controller will almost inevitably have differing opinions. How will these be balanced? Additionally, one might presume that information that is damaging or distressing to the data subjects should not automatically be withheld from them, as it is likely to be extremely important to them.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, we have covered a range of issues in our debate on this grouping; nevertheless, I will try to address each of them in turn. I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Harding, for their Amendments 95, 96, 98, 100, 102 to 104 and 106 regarding notification requirements.

First, with regard to the amendments in the name of the noble Baroness, Lady Harding, I say that although the Government support the use of public data sources, transparency is a key data protection principle. We do not agree that such use of personal data should remove or undermine the transparency requirements. The ICO considers that the use and sale of open electoral register data alone is likely not to require notification. However, when the data is combined with data from other sources, in order to build an extensive profile to be sold on for direct marketing, notification may be proportionate since the processing may go beyond the individual’s reasonable expectations. When individuals are not notified about processing, it makes it harder for them to exercise their data subject rights, such as the right to object.

Adding other factors to the list of what constitutes a “disproportionate effort” for notification is unnecessary given that the list is already non-exhaustive. The “disproportionate effort” exemption must be applied according to the safeguards of the wider data protection framework. According to the fairness principle, controllers should already account for whether the processing meets the reasonable expectations of a data subject. The data minimisation and purpose limitation principles also act as an important consideration for data controllers. Controllers should continue to assess on a case-by-case basis whether they meet the threshold for the existing exemptions to notify; if not, they should notify. I hope that this helps clarify our position on that.

17:15
Amendment 158 from the noble Lord, Lord Lucas, seeks to amend the definition of “direct marketing”, to make it clear that it excludes communications necessary to avoid harm or improve consumer outcomes, when complying with law or regulatory standards. I understand the sentiment behind the amendment, but financial services firms can already provide regulatory communication messages to their customers without permission, provided these messages are neutral in tone, factual and do not include promotional content. While such messages can support customers to make informed decisions about their financial investments, they would not be classed as advertising or marketing material. As such, they would not engage the direct marketing rules within the Privacy and Electronic Communications Regulations. I refer the noble Lord to paragraph 803 of the Explanatory Notes to the Bill, where we have taken steps to clarify that position.
Amendment 159A from the noble Viscount, Lord Camrose, is aimed at enabling cookie paywalls; conversely, as we have identified, Amendment 159 from the noble Lord, Lord Clement-Jones, seeks to ban their use. Generally, these paywalls work by giving web users the option to pay for a cookie-free browsing experience. Earlier this year, the Information Commissioner launched a call for views on “consent or pay” models for cookies, with the aim of providing the online advertising industry with clarity on how advertising cookies and paywalls can be used in compliance with data protection and privacy laws. We will consider the Information Commissioner’s findings when he publishes his response to the call for views. It would be premature to make legal changes without considering those findings or consulting interested parties. I hope noble Lords will bear that in mind.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

When does the Minister anticipate that the ICO will produce that report?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I do not have the detail of all that. Obviously, the call for views has only recently gone out and he will need time for consideration of the responses. I hope the noble Lord will accept that the ICO is on the case on this matter. If we can provide more information, we will.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

May I ask the Minister a hypothetical question? If the ICO believes that these are not desirable, what instruments are there for changing the law? Can the ICO, under its own steam, so to speak, ban them; do we need to do it in primary legislation; or can it be done in secondary legislation? If the Minister cannot answer now, perhaps she can write to me.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Of course I will write to the noble Lord. It will be within the ICO’s normal powers to make changes where he finds that they are necessary.

I move to Amendment 160, tabled by the noble Lord, Lord Lucas, which seeks to create a new exemption for advertising performance cookies. There is a balance to be struck between driving growth in the advertising, news and publishing sectors and ensuring that people retain choice and control over how their data is used. To exempt advertising measurement cookies, we would need to assess how intrusive these cookies are, including what they track and where data is sent. We have taken a delegated power so that exemptions to the prohibition can be added in future, once the evidence supports it and we can devise appropriate safeguards to minimise privacy risks. In the meantime, we have been actively engaging with the advertising and publishing sectors on this issue and will continue to work with them to consider the potential use of the regulation-making power. I hope that the noble Lord will accept that this is work in progress.

Amendment 161, also from the noble Lord, Lord Lucas, aims to extend the soft opt-in rule under the privacy and electronic communications regulations to providers of auto-enrolment pension schemes. The soft opt-in rule removes the need for some commercial organisations to seek consent for direct marketing messages where there is an existing relationship between the organisation and the customer, provided the recipient did not object to receiving direct marketing messages when their contact details were collected.

The Government recognise that people auto-enrolled by their employers in workplace pension schemes may not have an existing relationship with their pension provider, so I understand the noble Lord’s motivations for this amendment. However, pension providers have opportunities to ask people to express their direct mail preferences, such as when the customer logs on to their account online. We are taking steps to improve the support available for pension holders through the joint Government and FCA advice guidance boundary review. The FCA will be seeking feedback on any interactions of proposals with direct marketing rules through that consultation process. Again, I hope the noble Lord will accept that this issue is under active consideration.

Amendment 162, tabled by the noble Lord, Lord Clement-Jones, would create an equivalent provision to the soft opt-in but for charities. It would enable a person to send electronic marketing without permission to people who have previously expressed an interest in their charitable objectives. As the noble Lord has recalled, the DPDI Bill included a provision similar to his amendment. The Government removed it from that Bill due to concerns that it would increase direct marketing from political parties. I think we all accepted at the time that we did not want that to happen.

As the noble Lord said, his amendment is narrower because it focuses on communications for charitable purposes, but it could still increase the number of messages received by people who have previously expressed an interest in the work of charities. We are listening carefully to arguments for change in this area and will consider the points he raises, but I ask that he withdraws his amendment while we consider its potential impact further. We are happy to have further discussions on that.

Baroness Stowell of Beeston Portrait Baroness Stowell of Beeston (Con)
- Hansard - - - Excerpts

I apologise to the Minister for intervening on her when I have not spoken earlier in this debate, but I was reassured by what she just said on Amendment 162. Remarks made by other noble Lords in this debate suggest both that members of the public might not object to charities having the same access rights as businesses and that the public do not necessarily draw a distinction between businesses and charities. As a former chairman of the Charity Commission, I can say that that is not what is generally found. People have an expectation of charities that differs from what they would expect by way of marketing from businesses. In considering this amendment, therefore, I urge the Minister to think carefully before deciding what action the Government should take.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I thank the noble Baroness very much for that very helpful intervention. If she has any more information about the view of the Charity Commission, we would obviously like to engage with that because we need to get this right. We want to make sure that individuals welcome and appreciate the information given to them, rather than it being something that could have a negative impact.

I think I have covered all the issues. I hope those explanations have been of some reassurance to noble Lords and that, as such, they are content not to press their amendments.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

May I just follow up by asking one quick question? I may be clutching at straws here but, in responding to the amendments in my name, the Minister stated what the ICO believes rather than what the Government believe. She also said that the ICO may think that further permission is required to ensure transparency. I understand from the Data & Marketing Association that users of this data have four different ways of ensuring transparency. Would the Minister agree to a follow-up meeting to see whether there is a meeting of minds on what the Government, rather than the ICO, think?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I am very happy to talk to the noble Baroness about this issue. She asked what the Government’s view is; we are listening very carefully to the Information Commissioner and the advice that he is putting together on this issue.

Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, I am very grateful for the answers the noble Baroness gave to my amendments. I will study carefully what she said in Hansard, and if I have anything further to ask, I will write to her.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- Hansard - - - Excerpts

My Lords, in response—and very briefly, given the technical nature of all these amendments—I think we should just note that there are a number of different issues in this group, all of which noble Lords in this debate will want to follow up. I thank the many noble Lords who have contributed, both this time round and in previous iterations, and ask that we follow up on each of the different issues, probably separately rather than in one group, as we will get ourselves quite tangled in the web of data if we are not careful. With that, I beg leave to withdraw the amendment.

Amendment 95 withdrawn.
Amendments 96 to 106 not moved.
Clause 77 agreed.
Clause 78 agreed.
Amendment 107 not moved.
Clause 79: Data subjects’ rights to information: legal professional privilege exemption
Amendment 108
Moved by
108: Clause 79, page 93, line 18, leave out “court” and insert “tribunal”
Member’s explanatory statement
This amendment is consequential on the new Clause (Transfer of jurisdiction of courts to tribunals).
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, in moving Amendment 108, I will also speak to all the other amendments in this group. They are all designed to transfer jurisdiction under the existing provisions from the courts to the tribunals and to simplify the enforcement of data rights. Is that not something to be desired? This is not just a procedural change but a necessary reform to ensure that the rights granted on paper translate into enforceable rights in reality.

The motivation for these amendments stems from recurring issues highlighted in cases such as Killock and Veale v the Information Commissioner, and Delo v the Information Commissioner. These cases revealed a troubling scenario where the commissioner presented contradictory positions across different levels of the judiciary, exacerbating the confusion and undermining the credibility of the regulatory framework governing data protection. In these cases, the courts have consistently pointed out the confusing division of jurisdiction between different courts and tribunals, which not only complicates the legal process but wastes considerable public resources. As it stands, individuals often face the daunting task of determining the correct legal venue for their claims, a challenge that has proved insurmountable for many, leading to denied justice and unenforced rights.

By transferring all data protection provisions from the courts to more specialised tribunals, which are better equipped to handle such cases, and clarifying the right-to-appeal decisions made by the commissioner, these amendments seek to eliminate unnecessary legal barriers. Many individuals, often representing themselves and lacking legal expertise, face the daunting challenge of navigating complex legal landscapes, deterred by high legal costs and the intricate determination of appropriate venues for their claims. This shift will not only reduce the financial burden on individuals but enhance the efficiency and effectiveness of the judicial process concerning data protection. By simplifying the legal landscape, we can safeguard individual rights more effectively and foster a more trustworthy digital environment.

17:30
The proposed changes are a crucial step towards aligning our legal framework with the realities of modern data use and ensuring that everyone can genuinely protect their data rights. I previously introduced similar amendments during the debate on the now-defunct DPDI Bill. They addressed the persistent jurisdictional confusion embedded in the Data Protection Act 2018—a confusion that has significantly hindered individuals’ ability to enforce their data protection rights effectively.
Additionally, these amendments clarify the right to appeal decisions made by the commissioner, touching directly on the core issues raised in the Killock case. On any basis, given the insightful postscript by Mrs Justice Farbey in Killock, it is clear that a comprehensive review of the appeal mechanisms for rights under the DPA is long overdue. Such a review would streamline processes, conserve judicial resources and, most importantly, make it easier for individuals to enforce their data protection rights. I beg to move.
Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

My Lords, I rise briefly to support my friend, the noble Lord, Lord Clement-Jones, and his string of amendments. He made the case clearly: it is simply about access, the right to redress and a clear pathway to that redress, a more efficient process and clarity and consistency across this part of our data landscape. There is precious little point in having obscure remedies or rights—or even, in some cases, as we have discussed in our debates on previous groups, no right or obvious pathways to redress. I believe that this suite of amendments addresses that issue. Again, I full-throatedly support them.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, I address the amendments tabled by the noble Lord, Lord Clement-Jones. These proposals aim to transfer jurisdiction from courts to tribunals; to establish a new right of appeal against decisions made by the Information Commissioner; and to grant the Lord Chancellor authority to implement tribunal procedure rules. I understand and recognise the noble Lord’s intent here, of course, but I have reservations about these amendments and urge caution in accepting them.

The suggestion to transfer jurisdiction from courts to tribunals raises substantial concerns. Courts have a long-standing authority and expertise in adjudicating complex legal matters, including data protection cases. By removing these disputes from the purview of the courts, the risk is that we undermine the depth and breadth of legal oversight required in such critical areas. Tribunals, while valuable for specialised and expedited decisions, may not provide the same level of rigorous legal analysis.

Cases such as those cited by the noble Lord, Lord Clement-Jones—Killock and another v the Information Commissioner and Delo v the Information Commissioner—demonstrate to me the intricate interplay between data protection, administrative discretion and broader legal principles. It is questionable whether tribunals, operating under less formal procedures, can consistently handle such complexities without diminishing the quality of justice. Further, I am not sure that the claim that this transfer will streamline the system and reduce burdens on the courts is fully persuasive. Shifting cases to tribunals does not eliminate complexity; it merely reallocates it, potentially at the expense of the detailed scrutiny that these cases demand.

I turn to the right of appeal against the commissioner’s decisions. Although the introduction of a right of appeal against these decisions may seem like a safeguard, it risks creating unnecessary layers of litigation. The ICO already operates within a robust framework of accountability, including judicial review for cases of legal error or improper exercise of discretion. Adding a formal right of appeal risks encouraging vexatious challenges, overwhelming the tribunal system and diverting resources from addressing genuine grievances.

I think we in my party understand the importance of regulatory accountability. However, creating additional mechanisms should not come at the expense of efficiency and proportionality. The existing legal remedies are designed to strike an appropriate balance, and further appeals risk creating a chilling effect on the ICO’s ability to act decisively in protecting data rights.

On tribunal procedure rules and centralised authority, the proposed amendment granting the Lord Chancellor authority to set tribunal procedure rules bypasses the Tribunal Procedure Committee, an independent body designed to ensure that procedural changes are developed with judicial oversight. This move raises concerns about the concentration of power and the erosion of established checks and balances. I am concerned that this is a case of expediency overriding the principles of good governance. While I acknowledge that consultation with the judiciary is included in the amendment, it is not a sufficient substitute for the independent deliberative processes currently in place. The amendment risks undermining the independence of our legal institutions and therefore I have concerns about it.

Overall, while these amendments are presented as technical fixes—and I certainly recognise the problem and the intent behind them—they would have far-reaching consequences for our data protection framework. The vision of my party for governance is one that prioritises stability, legal certainty and the preservation of integrity. We must avoid reforms that, whatever their intent, introduce confusion or inefficiency or undermine public trust in our system. Data protection is, needless to say, a cornerstone of our modern economy and individual rights. As such, any changes to its governance must be approached with the utmost care.

Lord Vallance of Balham Portrait The Minister of State, Department for Science, Innovation and Technology (Lord Vallance of Balham) (Lab)
- Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones, for his Amendments 108, 146 to 153 and 157, and I am grateful for the comments by the noble Lord, Lord Holmes, and the noble Viscount, Lord Camrose.

The effect of this group of amendments would be to make the First-tier Tribunal and the Upper Tribunal responsible for all data protection cases. They would transfer ongoing as well as future cases out of the court system to the relevant tribunals and, as has been alluded to, may cause more confusion in doing so.

As the noble Lord is aware, there is currently a blend of jurisdiction under the data protection legislation, with both tribunals and courts hearing cases according to the nature of the proceedings in question. This is because certain types of case are appropriate for tribunal jurisdiction while others are more appropriate for court settings. For example, claims by individuals against organisations for breaches of legal requirements can result in awards of compensation for the individuals and financial and reputational damage for the organisations. It is appropriate that such cases are handled by a court, in conformity with its strict procedural and evidential rules. Indeed, it was noted in the Killock and Delo cases that there could be additional confusion in moving between those two routes if jurisdiction went solely to one of the tribunals.

On the transfer of responsibility for making tribunal procedural rules from the Tribunal Procedure Committee to the Lord Chancellor, we think that would be inappropriate. The committee comprises legal experts appointed or nominated by senior members of the judiciary or the Lord Chancellor. This committee is best placed to make rules to ensure that tribunals are accessible and fair and that cases are dealt with quickly and efficiently. It keeps the rules under constant review to ensure that they are fit for purpose in line with new appeal rights and the most recent legislative changes.

Amendment 151 would also introduce a statutory appeals procedure for tribunals to determine the merits of decisions made by the Information Commissioner. Data subjects and controllers alike can already challenge the merits of the Information Commissioner’s decisions by way of judicial review in a way that would preserve the discretion and independence of the Information Commissioner’s decision-making, so no statutory procedure is needed. The Government therefore believe that the current jurisdictional framework is well-balanced and equitable, and that it provides effective and practical routes of redress for data subjects and controllers as well as appropriate safeguards to ensure compliance by organisations. For these reasons, I hope the noble Lord will not press his amendments.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I thank the Minister for his response to my amendments and welcome him to the Dispatch Box and a whole world of pain on the Data (Use and Access) Bill, as he has, no doubt, noted already after just two hours’ worth of this Committee.

I found his response disappointing, and I think both he and the noble Viscount, Lord Camrose, have misunderstood the nature of this situation. This is not a blend that is all beautifully logical, depending on the nature of the case. It is an absolute mishmash in which the ordinary litigant is faced with great confusion, quite often not knowing whether to go to the court or a tribunal; in which the judges themselves have criticised the confusion; and in which there appears to be no appetite in government, for some reason, for a review of the jurisdictions.

I felt that the noble Viscount was probably reading from his previous ministerial brief. Perhaps he looked back at Hansard for what he said on the DPDI Bill; it certainly sounded like that. The idea that the courts are peerless in their legal interpretation and that the poor old tribunals really do not know what they are doing is wrong. They are expert tribunals; you can appear before them in person and there are no fees. It is far easier to access a tribunal than a court. As far as appeals are concerned, the idea that the ordinary punter is going to take judicial review proceedings—which seems to be the implication of staying with the current system on appeals if the merits of the ICO’s decisions are to be examined—is quite breathtaking. I know from legal practice that JR is not cheap. Appearing before a tribunal and using that as an appeal mechanism would seem far preferable.

I will keep on pressing this because it seems to me that at the very least the Government need to examine the situation to have a look at what the real objections are to the jurisdictional confusion and the impact on data subjects who wish to challenge decisions. In the meantime, I beg leave to withdraw the amendment.

Amendment 108 withdrawn.
Clause 79 agreed.
Amendments 109 and 109A not moved.
Clause 80: Automated decision-making
Amendment 110
Moved by
110: Clause 80, page 94, line 24, at end insert—
“3. To qualify as meaningful human involvement, the review must be performed by a person with the necessary competence, training, authority to alter the decision and analytical understanding of the data.”
Member’s explanatory statement
This amendment would make clear that in the context of new Article 22A of the UK GDPR, for human involvement to be considered as meaningful, the review must be carried out by a competent person.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I beg to move Amendment 110 and will speak to Amendments 112, 114, 120, 121, 122 and 123, and to the question that Clause 80 stand part. As we have heard, artificial intelligence and algorithmic and automated decision-making tools are increasingly being used across the public sector to make and support many of the highest-impact decisions affecting individuals, families and communities across healthcare, welfare, education, policing, immigration and many other sensitive areas of an individual’s life.

The Committee will be pleased to hear that I will not repeat the contents of my speech on my Private Member’s Bill on this subject last Friday. But the fact remains that the rapid adoption of AI in the public sector presents significant risks and challenges, including: the potential for unfairness, discrimination and misuse, as demonstrated by scandals such as the UK’s Horizon and Australia’s Robodebt cases; automated decisions that are prone to serious error; lack of transparency and accountability in automated decision-making processes; privacy and data protection concerns; algorithmic bias; and the need for human oversight.

17:45
To counter this, on Friday the Government prayed in aid the algorithmic transparency standard and the GDPR, but it appears that they are intent on watering down the GDPR’s Article 22 protections with this Bill. As I said then, as Governments continue to adopt AI technologies, it is crucial to balance the potential benefits with the need for responsible and ethical implementation to ensure fairness, transparency and public trust. By contrast, many of us putting forward amendments today are very much on the same page in wanting to improve safeguards.
I hope that the Minister will recognise Amendments 110 and 112 as amendments that she tabled to the DPDI Bill. I am glad to see that they are now supported by the noble Lord, Lord Knight. Amendment 110 would make it clear that, in the context of new Article 22A of the UK GDPR, for human involvement to be considered as meaningful, the review must be carried out by a competent person. Amendment 112, which would amend new Article 22B of the UK GDPR, aims to make it clear that data processing that contravenes any part of the Equality Act 2010 is prohibited.
Amendment 114 would expand the scope of Article 22 to predominantly automated decision-making in line with, while not as radical as, the recommendation formulated by the Information Commissioner’s Office in its response to the Data: A New Direction consultation in 2021. I support and have signed Amendment 119 in the name of the noble Viscount, Lord Colville, which would mandate the ATRS in government, as would my Amendment 121.
To address these challenges, several measures are also contained in Amendments 120 and 122. Amendment 120 would require public authorities to be responsible for completing an algorithmic impact assessment, the format of which would be prescribed in regulations, prior to the deployment of an algorithmic or automated decision-making system. Amendment 122, which is similar to Amendment 123A in the name of the noble Lord, Lord Holmes, would require public authorities to set up a comprehensive, publicly accessible register of all ADM systems used by public authorities, enabling scrutiny and providing transparency on the rationale and functionality of ADM systems, including information on human oversight. Individuals affected by decisions made by ADM systems would have the right to receive a meaningful and personalised explanation of how a decision was reached, including information about the decision-making process.
We very much support Amendment 123B in the name of the noble Lord, Lord Holmes. It would require employees involved in using ADM systems to have the capabilities to challenge system outputs, understand potential risks and enable oversight in line with OECD principles.
Clause 80 introduces a provision inherited from the previous DPDI Bill for the Secretary of State to use regulations to define what constitutes “meaningful human involvement” for the purposes of paragraph 1(a) of new Article 22A and whether a decision is or is not to be taken to have
“a similarly significant effect for the data subject”.
Both these terms are critical in defining the scope of Article 22 protections. These terms have also been the subject of significant uncertainty and debate, due to limited existing case law. What constitutes meaningful human involvement raises important questions around the impact of the automation bias, opacity, competence and authority of the human involved. What constitutes a similarly significant effect engages important questions, for example, about how the law applies to decision processes with multiple stages.
The only mechanism for clarifying these terms in the Bill is the power vested in the Secretary of State to define them in the context of data protection and automated decision-making. These are not merely technical changes: they represent significant policy decisions that go to the heart of the Bill and therefore require sufficient parliamentary oversight. Amendment 123 would require the Secretary of State, in conjunction with the ICO, to develop guidance on the interpretation of the safeguards in Article 22C and on important terms, such as “similarly significant effect” and “meaningful human involvement”. As the dedicated regulator, the ICO is best placed and equipped to advise and ensure consistency of application. The required timeline for publishing the guidance is six months after Royal Assent.
I very much support Amendment 115 in the name of the noble Lord, Lord Lucas. It is notable that the noble Lords, Lord Knight and Lord Holmes, both highlighted the risk of AI in employment decisions at Second Reading. At the end of the day, however, Clause 80 and the changes to Article 22 will not wash: they remove important protections in relation to automated decision-making and AI. That view is shared by a great number of civil society organisations, such as Big Brother Watch, the Ada Lovelace Institute, Connected by Data, Defend Digital Me, Liberty, the Open Rights Group, Privacy International, the Public Law Project and Worker Info Exchange.
Article 22 of the GDPR enshrines the right not to be subject to a decision based on solely automated processing that has legal or otherwise significant effects on the individual concerned. This has proven to be a highly effective right that protects individuals from harmful decisions and discrimination. However, Clause 80 of this Bill would deprive individuals of this important right in most circumstances and would exacerbate power imbalances by requiring individuals to scrutinise, contest and assert their rights against decisions taken by systems outside their control.
I have not even talked yet about the impact of Clauses 82 and 83. In the context of law enforcement processing, the potential for people’s rights and liberties to be infringed by automated processing is extremely serious, yet under the Bill ADM involving sensitive personal data could be used in UK policing. Further diluted safeguards apply under proposed new Section 50C(3), to be inserted by Clause 80(3), whereby, rather than explicitly requiring the data controller to notify an affected individual—as is currently the case under Section 50(2)(a) of the Data Protection Act 2018—the controller must merely create measures to provide information about the ADM and enable the subject to contest the decision.
There are no provisions for any course of action after such secret ADM decisions are made—not even if, for example, the human review finds that an automated decision was wrong. It is extremely concerning that any ADM about a person can take place without their right to know, but for it to be conducted by police in secret and in a way that detrimentally impacts their life is an affront to justice and likely to interfere with any number of an individual’s rights.
Clause 84 would amend Sections 96 and 97 of the Data Protection Act 2018 to change the definition of ADM in the context of intelligence services processing. I very much hope that the Government will reconsider. I hope that, if they will not listen to me, they will listen to what civil society organisations have to say. In their letter of 6 September to the Secretary of State, co-ordinated by the Open Rights Group, they said:
“We recognise that there are benefits to be gained from Artificial Intelligence … Yet there are concerns. Data can be biased. Models can be wrong. The potential for discrimination and for deepening inequalities is known and significant. Important machine decisions can be wrong and unjust, and frequently Artificial Intelligence providers are unwilling or unable to address shortcomings … We respectfully ask that these clauses be re-examined to ensure that people are not simply subjected to life changing decisions made solely by machines, and forced to prove their innocence when machines get it wrong. The government should extend AI accountability, rather than reduce it, at this critical moment”.
Finally, on Amendment 123C, research by the Institute for the Future of Work suggests that the utility and effectiveness of data protection impact assessments are limited by the absence of basic disclosure provisions and by strict limitations on their application to data subjects and data rights. In particular, significant social and economic impacts on workers, the workplace and labour rights are likely to fall between the protections in data and employment legislation as they stand. Areas of concern include hiring and access to work; pay and work allocation; impacts on the conditions and quality of work; monitoring and surveillance, including neuro and emotional surveillance; and discipline or termination of work. Research shows that AI and other data-driven technologies have already had significant impacts on the nature of work and jobs, on the conditions and quality of people’s work, and on access to and the enforceability of rights.
Amendment 123C adopts the language of the Institute for the Future of Work about automation archetypes. These have been developed as part of the Nuffield Foundation-supported Pissarides Review into the Future of Work and Wellbeing, which will be published in January 2025, and they challenge our understanding and narratives about automation, its potential and the choices that we make now to shape our futures. It is not enough to rely on the enforcement of individual rights in discrete domains after the event. Pre-emptive assessment of significant impacts, and establishing a process for ongoing monitoring and intervention, are necessary. This is in line with the Council of Europe’s framework convention on AI, signed in September and to which the UK is a signatory. The Council of Europe’s committee on AI has just officially adopted the HUDERIA human rights algorithmic impact assessment.
The amendment could lead to the introduction of measures to ensure private sector assessment and monitoring of impacts on work, people and fundamental rights, which would conform to the framework convention. If this does not achieve what the Government intend as regards adoption in the UK of that framework convention, I very much hope that they can give us more information at this time. I beg to move.
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - - - Excerpts

My Lords, Amendment 119 is in my name, and I thank the noble Lord, Lord Knight, for adding his name to it. I am pleased to add my name to Amendment 115A in the name of the noble Viscount, Lord Camrose.

Transparency is key to ensuring that the rollout of ADM brings the public and, most importantly, public trust with it. I give the Committee an example of how a lack of transparency can erode that trust. The DWP is using a machine learning model to analyse all applications for a loan, given as an advance on a benefit to pay bills and other costs while a recipient waits for their first universal credit payment. The DWP’s own analysis of the model found that, for all the protected characteristics analysed—including age, marital status and disability—there were disparities in who was most likely to be incorrectly referred by the model.

It is difficult to assess whether the model is discriminatory, effective or even lawful. When the DWP rolled it out, it was unable to reassure the Comptroller and Auditor General that its anti-fraud models treated all customer groups fairly. The rollout continues despite these concerns. The DWP maintains that the analysis does not present

“any immediate concerns of discrimination, unfair treatment or detrimental impact on customers”.

However, because so little information is available about the model, this claim cannot be independently verified to give the public confidence. Civil rights organisations, including the Public Law Project, are currently working on a potential claim against the DWP, including in relation to this model, on the basis that they consider it may be unlawful.

The Government’s commitment to rolling out ADM has been accompanied by a statement in the other place in November by the AI Minister, Feryal Clark, that the use of the ATRS is now a mandatory requirement—seen as a significant acceleration towards adopting the standard. In response to a Written Question, the Secretary of State confirmed that, as part of phase 1 of the rollout to the 16 largest ministerial departments plus HMRC, there was a deadline for them to publish their first ATRS records by the end of July 2024. Despite the Government’s statement, only eight ATRS reports have been published on the hub. The Public Law Project’s TAG project has discovered at least 74 areas in which ADM is being used, and those are only the ones that it has been able to uncover through freedom of information requests and tip-offs from affected people. There is clearly a shortfall in the implementation and rollout of the ATRS across government departments.

18:00
Amendment 119 does not demand that these standards be put in the Bill but gives the Government the option to introduce regulations should the rollout of the ATRS continue to be so slow. The need for such a backstop is clear from experience in other countries, such as Canada, where the standard was introduced but implementation has lagged. Canada introduced a non-statutory requirement for disclosing ADM models, enforced by an internal government review, but this process has not been effective. As of May 2024, only 21 ADM models had been disclosed, whereas more than 300 models are in use by the Canadian Government, according to the Starling Centre, a non-governmental research organisation.
This amendment would give flexibility in other ways. Subsection (2) recognises that the standard for ATRS creation and publication might well change and that, as its use becomes more common, the Government might want to tweak it. Likewise, subsection (3) allows flexibility in regulations about the manner of publication.
I recognise that transparency cannot apply to all collection of public data. For instance, nobody would want to influence a fraud inquiry by publishing data that might affect the outcome of that inquiry. However, I remind the Minister that when this was discussed on the Data Protection and Digital Information Bill, she tabled Amendment 74 in Committee, calling for the insertion of the mandatory use of the ATRS, almost exactly along the lines of my amendment. In the debate that followed, the noble Lord, Lord Bassam, said that putting the ATRS on a statutory footing would be,
“key to securing trust in what will be something of a revolution in how public services are delivered and procured in the future”.—[Official Report, 27/3/24; col. GC 214.]
Stephanie Peacock, then the Labour spokesman for the Bill in the other place, tabled a similar amendment in Committee in which she said,
“Relying on self-regulation in the early stages of the scheme is understandable, but having conducted successful pilots, from the Cabinet Office to West Midlands police, it is unclear why the Government now choose not to commit to the very standard they created”.—[Official Report, Commons, Data Protection and Digital Information (No. 2) Bill Committee, 23/5/23; col. 284.]
If the Minister’s party had those concerns then, why would they not be relevant now, especially as it is becoming clear that transparency is not accompanying the rollout of ADM across the public sector?
I have also added my name to Amendment 115A, which aims to delete the regulation-making powers in new Article 22D. As the noble Viscount, Lord Camrose, will no doubt explain, there is a danger of mission creep with the rollout of ADM. The key concern is that the Secretary of State could, through secondary legislation, water down what counts as meaningful human involvement. My fear is that this would allow decision-makers to bypass the need to comply with the safeguards in new Articles 22A to 22C by having a nominal human in the loop, even if that human was not in a position to be an effective safeguard. This could be because they were not sufficiently competent; because they were not allowed enough time properly to review a decision; because they were influenced by automation bias, in which human decision-makers are unduly influenced by the recommendations of the machine; or because, owing to the black-box nature of the algorithm, they were not in a position to understand it.
There is a good example of the concern about mission creep in a recent case in the Netherlands. There was a successful challenge by Uber drivers against the firm’s “robo-firing” system, where drivers faced allegations of fraudulent activity determined by a machine and were dismissed without appeal. Although there was some human involvement in the process, the Dutch court found it was
“not … much more than a purely symbolic act”,
noting that Uber had failed to make clear
“what the qualifications and level of knowledge”
of the people involved were. It therefore concluded that there was not sufficient evidence of “meaningful human intervention”, so the system was caught by Article 22. The concern is that, if the Secretary of State were to legislate to declare that this kind of human intervention was in fact sufficient, it would deny British people the protections their European counterparts have under the EU GDPR. As for allowing the definition of “similarly significant” effects to be set out in regulations, this too allows for slippage which would harm individuals.
As useful as ADM is for promoting efficient government, people are afraid of it. They do not necessarily trust the Government, and many are worried by the Government using algorithms to make important decisions that affect their lives. If the Government intend to roll out ADM across the public sector, as they promise, then it is essential to do everything possible along the way to nurture trust with the public. These amendments would go some way to doing that.
Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, my Amendment 115 would similarly act in that way by making automated decision-making processes explain themselves to the people affected by them. This would be a much better way of controlling the quality of what is going on with automated decision-making than restricting that sort of information to professionals—people who are in any case overworked and have a lot of other things to do. There is no one more interested in the decision of an automated process than the person about whom it is being made. If we are to trust these systems, then their ability, far beyond any human's, to take the time to explain why they took the decision they did—which, if the machine is any good, it knows and can easily set out—is surely the way to generate trust: you can see exactly what decision has been made and why, and you can respond to it.

This would, beyond anything else, produce a much better system for our young people when they apply for their first job. My daughter’s friends in that position are receiving hundreds of unexplained rejections. This is not a good way to treat young people. It does not help them to improve or to understand what is going on. I completely understand why firms do not explain: they have so many applications that they just do not have the time or the personnel to sit down and write a response—but that does not apply to an automated decision-making machine. It could produce a much better situation when it comes to hiring.

As I said, my principal concern, to echo that of the noble Viscount, is that it would give us sight of the decisions that have been taken and why. If it becomes evident that they are taken well and for good reasons, we shall learn to trust them. If it becomes evident that they really are not fair or understandable, we shall be in a position to demand changes.

Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

My Lords, it is a pleasure to take part in the debate on this group. I support the spirit of all the amendments debated thus far.

Speaking of spirits, and it being the season, I have more than a degree of sympathy for the Minister. With so many references to her previous work, this Christmas is turning into a bit of the Ghost of Amendments Past for her. That is good, because all the amendments she put down in the past were of excellent quality: well thought through, considered and even-handed.

As has been mentioned many times, we have had three versions of a data Bill in just over three years. One wonders whether all the elements of the current draft have kept up with what has happened in the outside world over those three years, not least when it comes to artificial intelligence. This goes to the heart of the amendments in this group on automated decision-making.

When the first of these data Bills emerged, ADM was present—but relatively discreetly present—in our society and our economy. Now it would be fair to say that it proliferates across many areas of our economy and our society, often in situations where people find themselves at the sharp end of the economy and at the sharp end of these automated decisions, often without even knowing that ADM was involved. More than that, even on discovering that ADM was in the mix, depending on which sector of the economy or society the decision was made in, they may find themselves with no redress, or precious little—employment and recruitment, to name but one sector.

It being the season, it is high time when it comes to ADM that we start to talk turkey. In all the comments thus far, we are talking not just about ADM but about the principles that should underpin all elements of artificial intelligence—that is, they should be human led. These technologies should be in our human hands, with our human values feeding into human oversight: human in the loop and indeed, where appropriate, human over the loop.

That goes to elements in my two amendments in this group, Amendments 123A and 123B. Amendment 123A simply posits, through a number of paragraphs, that if someone is subject to an automated decision, they have the right to a personalised explanation of that decision. That explanation should be accessible: in plain language of their choice, at no cost to them, and in no sense technically or technologically convoluted or opaque. That would be relatively straightforward to achieve, but the positive impact for all those citizens would certainly be more than material.

Amendment 123B goes to the heart of those humans charged with the delivery of these personalised explanations. It is not enough to simply say that there are individuals within an organisation responsible for the provision of personalised explanations for automated decisions; it is critical that those individuals have the training, the capabilities and, perhaps most importantly, the authority within that organisation to make a meaningful impact regarding those personalised explanations. If not, this measure may have a small voice but would have absolutely no teeth when it comes to the citizen.

In short, ADM is proliferating so we need to ensure that we have a symmetrical situation for citizens, for consumers, and for anyone who finds themselves in any domain or sector of our economy and society. We must assert the principles: human-led, human in the loop, “Our decisions, our data”, and “We determine, we decide, we choose”. That is how I believe we can have an effective, positive, enabling and empowering AI future. I look forward to the Minister’s comments.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

My Lords, I shall speak to the series of amendments on automated decision-making to which I have added my name, most of which are in the name of the noble Lord, Lord Clement-Jones. As he said, we had a rehearsal for this debate last Friday when we debated his Private Member’s Bill, so I will not delay the Committee by saying much about the generalities of ADMs in the public sector.

Suffice it to say that human involvement in overseeing AIs must be meaningful—for example, without those humans themselves being managed by algorithms. We must ensure that ADMs comply by design with the Equality Act and safeguard data subjects’ other rights and freedoms. As discussed in earlier groups, we must pay particular attention to children’s rights with regard to ADMs, and we must reinforce the obligation on public bodies to use the algorithmic transparency recording standard. I also counsel my noble friend the Minister that, as we have heard, there are many voices from civil society advising me and others that the new Article 22 of the GDPR takes us backwards in terms of protection.

That said, I want to focus on Amendment 123C, relating to ADMs in the workplace, to which I was too late to add my name but would have done. This amendment follows a series of probing amendments tabled by me to the former DPDI Bill. In this, I am informed by my work as the co-chair of the All-Party Parliamentary Group on the Future of Work, assisted by the Institute for the Future of Work. These amendments were also mirrored during the passage of the Procurement Act and competition Act to signal the importance of the workplace, and in particular good work, as a cross-cutting objective and lens for policy orientation.

18:15
I am pleased to note that in each case progress was made. In particular, the CMA, following my probing amendment in January, has initiated some potentially world-leading new labour market investigations, extending to the secondary, often hidden, impacts of concentration associated with the digital giants. I am not saying that there is necessarily a causal link between my amendments and those investigations, but we need more of this sort of work—and I am hoping to continue my winning streak.
At a time of sluggish economic recovery, the UK will benefit from a more cohesive, future-oriented approach to policy-making aimed at supporting transitions and building the capabilities of people and institutions to support pro-human and pro-innovation automation: that is, making the most of human as well as technological capabilities.
There is a real risk that significant impacts on people or groups will get lost if the employment, AI and data Bills are not triangulated. Good and effective employment protection must cover transitions as well as hiring and firing. I am grateful to techUK for its briefing on these amendments. It broadly supports the Bill and the use of ADMs for things such as faster logging into systems and personalisation, but it advocates a risk-based approach and identifies employment decisions as higher risk and needing special attention.
As this amendment argues, we need the introduction of additional principles, thresholds and requirements for the high-risk environment of work. By giving these basic protections, we free people to innovate, including around the use of ADMs. This is increasingly important to recognise in the world of large language models, when new types of automation, new builds and new risks emerge. Automation is not just about displacement but about different types of work, different skills, and different ways of people interacting with technology and imagining different possibilities for the future. There is increasing evidence on this, and if the Minister is willing to meet with me and other members of the All-Party Group on the Future of Work, we can help the UK potentially to develop a gold-standard, world-leading, evidence-driven model for reflexive, context-sensitive, pre-emptive regulation in the workplace and beyond it.
I would want to see algorithmic impact assessments that cover significant impacts on work and workers, such as any impact on equal opportunities or outcomes at work, access to employment, pay, contractual status, terms and conditions of employment, health, lawful association, rights and training. Assessments should also be conducted on an ongoing rather than a snapshot basis; should involve those affected, including official representatives, in a proportionate way; should disclose metrics and methods; and should be developed by regulators at both a domain and a sector level. I could go on, but I look forward to the Minister’s response.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I speak to Amendment 114, to which I have added my name. It is a very simple amendment that prevents controllers circumventing the duties on automated decision-making by adding trivial human elements to avoid the designation. As such, it is a straightforward—and, I would have thought, uncontroversial—amendment. I really hope that the Government will find something in all our amendments to accept, and perhaps that is one such thing.

I am struck that previous speeches have referred to questions that I raised last week: what is the Bill for, who is it for and why is it not dealing with a host of overlapping issues that cannot really be separated one from another? In general, a bit like the noble Lord, Lord Holmes, I am very much with the spirit of all these amendments. They reflect the view of the Committee and the strength of feeling in civil society—and among many lawyers—that this sort of attack on Article 22 by Clause 80 downgrades UK data rights at a time when we do not understand the Government’s future plans and hear very little about protections. We hear about the excitement of AI, which I feel bound to say we all share, but not at the expense of individuals.

I raise one last point in this group. I had hoped that the Minister would have indicated the Government’s openness to Amendment 88 last week, which proposed an overarching duty on controllers and processors to provide children with heightened protections. That seemed to me the most straightforward mechanism for ensuring that current standards were maintained and then threaded through new situations and technologies as they emerged. I put those two overarching amendments down on the understanding that Labour, when in opposition, was very much for this approach to children. We may need to bring back specific amendments, as we did throughout the Data Protection and Digital Information Bill, including Amendment 46 to that Bill, which sought to ensure

“that significant decisions that impact children cannot be made using automated processes unless they are in a child’s best interest”.

If the Minister does not support an overarching provision, can she indicate whether the Government would be more open to clause-specific carve-outs to protect children and uphold their rights?

Lord Thomas of Cwmgiedd Portrait Lord Thomas of Cwmgiedd (CB)
- Hansard - - - Excerpts

My Lords, I rise briefly, first, to thank everyone who has spoken so eloquently about the importance of automated decision-making, in particular its importance to public trust, and about the importance of human intervention. The retrograde step of watering down Article 22 is to be deplored. I am therefore grateful to the noble Lord, Lord Clement-Jones, for proposing that this part of the Bill should not stand part. Secondly, the specific amendment that I have laid seeks to retain the broader application of human intervention for automated decision-making where it is important. I can see no justification for that watering down, particularly when there is such uncertainty about the scope that AI may bring to what can be done by automated decision-making.

Lord Kamall Portrait Lord Kamall (Con)
- Hansard - - - Excerpts

My Lords, in speaking to this group of amendments I must apologise to the Committee that, when I spoke last week, I forgot to mention my interests in the register, specifically as an unpaid adviser to the Startup Coalition. Noble Lords will realise that, in Committee, I have confined myself to amendments that may be relevant to healthcare and to improving it.

I will speak to Amendments 111 and 116 in the names of my noble friends Lord Camrose and Lord Markham, and to Amendment 115 from my noble friend Lord Lucas and the noble Lords, Lord Clement-Jones and Lord Knight of Weymouth, as well as to other amendments, including those from my noble friend Lord Holmes—I will probably touch on most amendments in this group. To illustrate my concerns, I return to two personal experiences that I shared during debate on the Data Protection and Digital Information Bill. I apologise to noble Lords who have heard these examples previously, but they illustrate the points being made in discussing this group of amendments.

A few years ago, when I was supposed to be travelling to Strasbourg, my train to the airport got delayed. My staff picked me up, booked me a new flight and drove me to the airport. I got to the airport with my new boarding pass and scanned it to get into the gate area, but as I was about to get on the flight, I scanned my pass again and was not allowed on the flight. No one there could explain why, having been allowed through security, I was not allowed on the flight. To cut a long story short, after two hours of being gaslighted by four or five staff, none of whom would even admit that they could not explain things to me, I eventually had to return to the check-in desk—this was supposed to be avoided by all the automation—to ask what had happened. The airline claimed that it had sent me an email that day. The next day, it admitted that it had not sent me an email. Its only explanation was that a flag had gone off in its system.

This illustrates the point about human intervention, but it is also about telling customers and others what happens when something goes wrong. The company clearly had not trained its staff in how to speak to customers or in transparency. Companies such as that airline get away with this sort of disgraceful behaviour all the time, but imagine if such technology were being used in the NHS. Imagine the same scenario: you turn up for an operation, and you scan your barcode to enter the hospital—possibly even the operating theatre—but you are denied access. There must be accountability, transparency and human intervention, and, in these instances, there has to be human intervention immediately. These things are critical.

I know that this Bill makes some sort of differentiation between more critical and less critical ADM, but let me illustrate my point with another example. A few years ago, I paid for an account with one of those whizzy fintech banks. Its slogan was: “We are here to make money work for everyone”. I downloaded the app and filled out the fields, then a message popped up telling me, “We will get back to you within 48 hours”. Two weeks later, I got a message on the app saying that I had been rejected and that, by law, the bank did not have to explain why. Once again, I ask noble Lords to imagine. Imagine Monzo’s technology being used on the NHS app, which many people currently use for repeat prescriptions or booking appointments. What would happen if you tried to book an appointment but you received a message saying, “Your appointment has been denied and, by law, we do not have to explain why”? I hope that we would have enough common sense to ensure that there is human intervention immediately.

I realise that the noble Lord, Lord Clement-Jones, has a Private Member’s Bill on this issue—I am sorry that I have not been able to take part in those debates—but, for this Bill, I hope that the two examples I have just shared illustrate the point that I know many noble Lords are trying to make in our debate on this group of amendments. I look forward to the response from the Minister.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank all noble Lords who have spoken. I must confess that, of all the groups we are looking at today, I have been particularly looking forward to this one. I find this area absolutely fascinating.

Let me begin in that spirit by addressing an amendment in my name and that of my noble friend Lord Markham, and I ask the Government and all noble Lords to give it considerable attention. Amendment 111 seeks to insert the five principles set out in the AI White Paper published by the previous Government and to require all those participating in ADM—indeed, in all forms of AI—to have due regard for them. They are:

“safety, security and robustness, appropriate transparency and explainability, fairness, accountability and governance, and contestability and redress”.

These principles for safe AI are based on those originally developed with the OECD and have been the subject of extensive consultation. They have been refined and very positively received by developers, public sector organisations, private sector organisations and civil society. They offer real safeguards against the risks of AI while continuing to foster innovation.

I will make three brief points to commend their inclusion in the Bill, as I have described. First, the Bill team has argued throughout that these principles are already addressed by the principles of data protection and so are covered in the Bill. There is overlap, of course, but I do not agree that they are equivalent. Data protection is a significant concern in AI but the risks and, indeed, the possibilities of AI go far further than data protection. We simply cannot entrust all our AI risks to data protection principles.

Secondly, I think the Government will point to their coming AI Bill and suggest that we should wait for that before we move significantly on AI. However, in practice all we have to go on about the Bill—I recognise that Ministers cannot describe much of it now—is that it will focus on the largest AI labs and the largest models. I assume it will place existing voluntary agreements on a statutory footing. In other words, we do not know when the Bill is coming, but this approach will allow a great many smaller AI fish to slip through the net. If we want to enshrine principles into law that cover all use of AI here, this may not quite be the only game in town, but it is certainly the only all-encompassing, holistic game in town likely to be positively impactful. I look forward to the Minister’s comments on this point.

18:30
Thirdly, prescriptive regulation for AI is difficult because the technology moves so fast, but what will not move fast at all, if ever, is the principles. That is why it will be so valuable to have them set out in the Bill. This is not a prescriptive approach; it is one that specifies the outcomes we want and gives agency to those best placed to bring them about. I strongly commend this approach to noble Lords and look forward to the Minister’s comments.
I turn to other amendments tabled in my name. Amendments 114A and 115A are both necessary to remove the Secretary of State’s regulation-making powers under Clause 80 and Article 22D, and I thank the noble Viscount, Lord Colville, for co-signing them. As the Bill stands, the Secretary of State can, by regulation, decide whether there has or has not been meaningful human involvement in ADM, whether a decision has had an adverse effect similar to that of an adverse legal effect, and what safeguards should be in place around an ADM. Like the noble Viscount, Lord Colville, I am concerned here about mission creep and micromanagement. Each of these types of decision would, I feel, be best taken by the data controllers, or the courts, in the event of disputes. I suggest it would be better if the Secretary of State were to publish guidance setting out what should be considered meaningful human involvement and what level of adversity would equate to adverse legal consequences, and making suggestions for what would constitute suitable safeguards. This would allow the Government to shape how ADM is deployed while also giving companies using AI-driven ADM flexibility and agency to make it work for their circumstances.
Amendment 116 would require the Secretary of State to provide guidance on how consent should be obtained for ADM. This amendment would provide guidance for data controllers who wish to use ADM, helping them to set clear processes for obtaining consent, thus avoiding complaints and potential litigation. Amendment 117 would prevent children giving consent for their special category data to be used in ADM. Special category data reveals some of the most personal details about people’s lives, details which should not be shared without good reason. Allowing children to disclose their special category data raises safeguarding concerns as this information may be, perhaps unwittingly, made available to people unsuited to receive it. In law, we take the view that children lack the life experience to see all ends and should not be allowed to make decisions that could put them in harm’s way. I do not see why we should depart from this wisdom in the context of ADM.
Finally, Amendment 118 would ensure that human intervention in ADM
“is carried out by a person with sufficient competency and authority and is, therefore, effective”.
My view is that this should remove the grounds for concern behind Amendment 114, which would introduce this concept of “predominantly” automated processing. To me, this weakens and obscures the binary elegance and clarity of the rule: either a decision is solely automated or it is not. In the former case, certain protections kick in. Once we introduce this concept of graduated degrees of automative-ness, we muddy the waters with needless complexity.
All of this depends, though, on a genuinely robust and effective definition of what kind of human input is required. Without Amendment 118, data subjects may find themselves in a situation where they have requested intervention by a human being, only to realise that the person intervening has neither sufficient knowledge to understand the nature of the problem nor the power to rectify any problems should they be identified, rendering the whole process not very far from pointless. That said, I will not press my amendments.
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, we have had a profound and significant debate on these issues; it has been really helpful that they have been aired by a number of noble Lords in a compelling and articulate way. I thank everybody for their contributions.

I have to say at the outset that the Government want data protection rules fit for the age of emerging technologies. The noble Lord, Lord Holmes, asked whether we are addressing issues of the past or issues of the future. We believe that the balance we have in this Bill is exactly about addressing the issues of the future. Our reforms will reduce barriers to the responsible use of automation while clarifying that organisations must provide stringent safeguards for individuals.

I stress again how seriously we take these issues. A number of examples have been quoted as the debate has gone on. I say to those noble Lords that examples were given where there was no human involved. That is precisely what the new provisions in this Bill attempt to address, in order to make sure that there is meaningful human involvement and people’s futures are not being decided by an automated machine.

Amendment 110, tabled by the noble Lords, Lord Clement-Jones and Lord Knight, seeks to clarify that, for human involvement to be meaningful, it must be carried out by a competent person. Our reforms make clear that solely automated decisions are those which lack meaningful human involvement, and that meaningful involvement goes beyond a tick-box exercise. The ICO guidance also clarifies that

“the human involvement has to be active and not just a token gesture”;

that right is absolutely underpinned by the wording of the regulations here.

I turn next to Amendment 111. I can assure—

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I was listening very carefully. Does “underpinned by the regulations” mean that it will be underpinned?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Yes. The provisions in this Bill cover exactly that concern.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

The issue of meaningful human involvement is absolutely crucial. Is the Minister saying that regulations issued by the Secretary of State will define “meaningful human involvement”, or is she saying that it is already in the primary legislation, which is not my impression?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Sorry—it is probably my choice of language. I am saying that it is already in the Bill; it is not intended to be separate. I was talking about whether solely automated decisions lack meaningful human involvement. This provision is already set out in the Bill; that is the whole purpose of it.

On Amendment 111, I assure the noble Viscount, Lord Camrose, that controllers using solely automated processing are required to comply with the data protection principles. I know that he was anticipating this answer, but we believe that it captures the principles he proposes and achieves the same intended effect as his amendment. I agree with the noble Viscount that data protection is not the only lens through which AI should be regulated, and that we cannot address all AI risks through the data protection legislation, but the data protection principles are the right ones for solely automated decision-making, given its place in the data protection framework. I hope that that answers his concerns.

On Amendment 112, which seeks to prohibit solely automated decisions that contravene the Equality Act 2010, I assure the noble Lords, Lord Clement-Jones and Lord Knight, that the data protection framework is clear that controllers must adhere to the Equality Act.

Amendments 113 and 114 would extend solely automated decision-making safeguards to predominantly automated decision-making. I assure the noble and learned Lord, Lord Thomas, the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, that the safeguards in Clause 80 are designed to protect individuals where meaningful human involvement is lacking. Predominantly automated decision-making will already include meaningful human involvement and therefore does not require these additional safeguards.

On Amendments 114A and 115A, tabled by the noble Viscount, Lord Camrose, many noble Lords have spoken in our debates about the importance of future-proofing the legislation. These powers are an example of that: without them, the Government will not have the ability to act quickly to update protections for individuals in the light of rapid technology developments.

I assure noble Lords that the regulation powers are subject to a number of safeguards. The Secretary of State must consult the Information Commissioner and have regard to other relevant factors, which can include the impact on individuals’ rights and freedoms as well as the specific needs and rights of children. As with all regulations, the exercise of these powers must be rational; they cannot be used irrationally or arbitrarily. Furthermore, the regulations will be subject to the affirmative procedure and so must be approved by both Houses of Parliament.

I assure the noble Lord, Lord Clement-Jones, that one of the powers means that his Amendment 123 is not necessary, as it can be used to describe specifically what is or is not meaningful human involvement.

Amendment 115A, tabled by the noble Viscount, Lord Camrose, would remove the reforms to Parts 3 and 4 of the Data Protection Act, thereby putting them out of alignment with the UK GDPR. That would cause confusion and ambiguity for data subjects.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I am sorry to interrupt again as we go along but, a sentence or so ago, the Minister said that the definition in Amendment 123 of meaningful human involvement in automated decision-making was unnecessary. The amendment is designed to change matters. It would not be the Secretary of State who determined the meaning of meaningful human involvement; in essence, it would be initiated by the Information Commissioner, in consultation with the Secretary of State. So I do not quite understand why the Minister used “unnecessary”. It may be an alternative that is undesirable, but I do not understand why she has come to the conclusion that it is unnecessary. I thought it was easier to challenge the points as we go along rather than at the very end.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, we would say that a definition in the Bill is not necessary because it is dealt with case by case and is supplemented by these powers. The Secretary of State does not define meaningful human involvement; it is best done case by case, supported by the ICO guidance. I hope that that addresses the noble Lord’s point.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

That is slightly splitting hairs. The noble Viscount, Lord Camrose, might want to comment because he wanted to delete the wording that says:

“The Secretary of State may by regulations provide that … there is, or is not, to be taken to be meaningful human involvement”.


He certainly will determine—or is able to determine, at least—whether or not there is human involvement. Surely, as part of that, there will need to be consideration of what human involvement is.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

Will the Minister reflect on the issues around a case-by-case basis? If I were running an organisation of any sort and decided I wanted to use ADM, how would I make a judgment about what is meaningful human involvement on a case-by-case basis? It implies that I would have to hope that my judgment was okay, because I have not had clarity from anywhere else and, in retrospect, someone might come after me if I got that judgment wrong. I am not sure that works, so will she reflect on that at some point?

18:45
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

The Secretary of State can help describe specific cases in the future but, on the point made by my noble friend Lord Knight, the ICO guidance will clarify some of that. There will be prior consultation with the ICO before that guidance is finalised, but if noble Lords are in any doubt about this, I am happy to write and confirm that in more detail.

Amendment 115 in the names of the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Knight, and Amendment 123A in the name of the noble Lord, Lord Holmes, seek to ensure that individuals are provided with clear and accessible information about solely automated decision-making. The safeguards set out in Clause 80, alongside the wider data protection framework’s safeguards, such as the transparency principle, already achieve this purpose. The UK GDPR requires organisations to notify individuals about the existence of automated decision-making and provide meaningful information about the logic involved in a clear and accessible format. Individuals who have been subject to solely automated decisions must be provided with information about the decisions.

On Amendment 116 in the names of the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, I reassure noble Lords that Clause 69 already provides a definition of consent that applies to all processing under the law enforcement regime.

On Amendment 117 in the names of the noble Viscount, Lord Camrose, the noble Lord, Lord Markham, and my noble friend Lord Knight, I agree with them on the importance of protecting children’s sensitive personal data processed by law enforcement agencies, and there is extensive guidance on this issue. However, consent is rarely used as the basis for processing law enforcement data. Other law enforcement purposes, such as the prevention, detection and investigation of crime, are generally relied on instead.

I will address Amendment 118 in the name of the noble Viscount, Lord Camrose, and Amendment 123B in the name of the noble Lord, Lord Holmes, together, as they focus on obtaining human intervention for a solely automated decision. I agree that human intervention should be carried out competently and by a person with the authority to correct a wrongful outcome. However, the Government believe that there is currently no need to specify the qualifications of human reviewers as the ICO’s existing guidance explains how requests for human review should be managed.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Does the Minister agree that the crux of this machinery is solely automated decision-making as a binary thing—it is or it is not—and, therefore, that the absolute key to it is making sure that the humans involved are suitably qualified and finding some way to do so, whether by writing a definition or publishing guidelines?

Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

On the question of qualification, the Minister may wish to reflect on the broad discussions we have had in the past around certification and the role it may play. I gently take her back to what she said on Amendment 123A about notification. Does she see notification as the same as a personalised response to an individual?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Noble Lords have asked several questions. First, in response to the noble Viscount, Lord Camrose, I think I am on the same page as him about binary rather than muddying the water by having degrees of meaningful intervention. The ICO already has guidance on how human review should be provided, and this will be updated after the Bill to ensure that it reflects what is meant by “meaningful human involvement”. Those issues will be addressed in the ICO guidance, but if it helps, I can write further on that.

I have forgotten the question that the noble Lord, Lord Holmes, asked me. I do not know whether I have addressed it.

Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

In her response the Minister said “notification”. Does she see notification as the same as “personalised response”?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My understanding is that it would be. Every individual who was affected would receive their own notification rather than it just being on a website, for example.

Let me just make sure I have not missed anyone out. On Amendment 123B on addressing bias in automated decision-making, compliance with the data protection principles, including accuracy, transparency and fairness, will ensure that organisations take the necessary measures to address the risk of bias.

On Amendment 123C from the noble Lord, Lord Clement-Jones, I reassure him that the Government strongly agree that employment rights should be fit for a modern economy. The plan to make work pay will achieve this by addressing the challenges introduced by new trends and technologies. I agree very much with my noble friend Lord Knight that, although we have to get this right, there are opportunities for a different form of work; we should not see this purely as a potential negative impact on people’s lives. However, we want to get the balance right with regard to the impact on individuals, to make sure that we secure the best effects rather than the negative ones.

Employment rights law, rather than data protection law in isolation, is the more suitable vehicle for regulating the specific use of data and technology in the workplace, as data protection law sets out general rules and principles for processing that apply in all contexts. Noble Lords can rest assured that we take the impact on employment and work very seriously, and as part of our plan to make work pay and the Employment Rights Bill, we will return to these issues.

On Amendments 119, 120, 121 and 122, tabled by the noble Lord, Lord Clement-Jones, the noble Viscount, Lord Colville, and my noble friend Lord Knight, the Government share the noble Lords’ belief in the importance of public sector algorithmic transparency, and, as the noble Lord, Lord Clement-Jones, reminded us, we had a very good debate on this last week. The algorithmic transparency recording standard is already mandatory for government departments and arm’s-length bodies. This is a cross-government policy mandate underpinned by digital spend controls, which means that when budget is requested for a relevant tool, the team in question must commit to publishing an ATRS record before receiving the funds.

As I said on Friday, we are implementing this policy accordingly, and I hope to publish further records imminently. I very much hope that when noble Lords see what I hope will be a significant number of new records on this, they will be reassured that the nature of the mandation and the obligation on public sector departments is working.

Policy routes also enable us to provide detailed guidance to the public sector on how to carry out its responsibilities and monitor compliance. Examples include the data ethics framework, the generative AI framework, and the guidelines for AI procurement. Additionally, the data protection framework already achieves some of the intended outcomes of these amendments. It requires organisations, including public authorities, to demonstrate how they have identified and mitigated risks when processing personal data. The ICO provides guidance on how organisations can audit their privacy management and ensure a high level of data protection compliance.

I know I have given a great deal of detail there. If I have not covered all the points that the noble Lords have raised, I will write. In the meantime, given the above assurances, I hope that the noble Lord will withdraw his amendment.

Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, I would be very grateful if the Minister wrote to me about Amendment 115. I have done my best before and after to study Clause 80 to understand how it provides the safeguards she describes, and have failed. If she or her officials could take the example of a job application and the responses expected from it, and take me through the clauses to understand what sort of response would be expected and how that is set out in the legislation, I would be most grateful.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I am happy to write.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I thank the Minister for her very detailed and careful response to all the amendments. Clearly, from the number of speakers in this debate, this is one of the most important areas of the Bill and one that has given one of the greatest degrees of concern, both inside and outside the Committee. I think the general feeling is that there is still concern. The Minister is quite clear that the Government are taking these issues seriously, in terms of ADM itself and the impact in the workplace, but there are missing parts here. If you add all the amendments together—no doubt we will read Hansard and, in a sense, tick off the areas where we have been given an assurance about the interpretation of the Bill—there are still great gaps.

It was very interesting to hear what the noble Lord, Lord Kamall, had to say about how the computer said “no” as he reached the gate. A lot of this is about communications. I would be very interested if any letter to the noble Lord, Lord Lucas, was copied more broadly, because that is clearly one of the key issues. It was reassuring to hear that the ICO will be on top of this in terms of definitions, guidance, audit and so on, and that we are imminently to get the publication of the records of algorithmic systems in use under the terms of the algorithmic transparency recording standard.

We have had some extremely well-made points from the noble Viscounts, Lord Colville and Lord Camrose, the noble Lords, Lord Lucas, Lord Knight and Lord Holmes, and the noble Baroness, Lady Kidron. I am not going to unpack all of them, but we clearly need to take this further and chew it over before we get to Report. I very much hope that the Minister will regard a “will write” letter on stilts as required before we go very much further, because I do not think we will be entirely satisfied by this debate.

The one area where I would disagree is on treating solely automated decision-making as the sole subject of the Clause 80 rights. Looking at it in the converse, it is perfectly proper to regard something that does not have meaningful human involvement as predominantly automated decision-making. I do not think, in the words of the noble Viscount, Lord Camrose, that this muddies the waters. We need to be clearer about what we regard as being automated decision-making for the purpose of this clause.

There is still quite a lot of work to do in chewing over the Minister’s words. In the meantime, I beg leave to withdraw my amendment.

Amendment 110 withdrawn.
Amendments 111 to 118 not moved.
Clause 80 agreed.
19:00
Amendments 119 to 123C not moved.
Schedule 6 agreed.
Clause 81: Logging of law enforcement processing
Debate on whether Clause 81 should stand part of the Bill.
Member’s explanatory statement
This seeks to retain the requirement for police forces to record the reason they are accessing data from a police database.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, a key aspect of data protection rests in how it restricts the use of personal data once it has been collected. The public need confidence that their data will be used for the purposes for which they shared it and not further used in ways that breach their legitimate expectations—otherwise they will become suspicious about providing their data at all. The underlying theme that we heard on the previous group was the danger of losing public trust, which very much applies in the area of law enforcement and national security.

However, Schedules 4 and 5 would remove the requirement to consider the legitimate expectations of the individuals whose data is being processed, or the impact that this would have on their rights, for the purposes of national security, crime detection and prevention, safeguarding or answering to a request by a public authority. Data used for the purposes listed in these schedules would not need to undergo either a balancing test under Article 6.1(f) or a compatibility test under Article 6.4 of the UK GDPR. The combined effect of these provisions would be to authorise almost unconditional data sharing for law enforcement and other public security purposes while, at the same time, reducing accountability and traceability over how the police use the information being shared with them.

As with the previous DPDI Bill, Clauses 87 to 89 of this Bill grant the Home Secretary and the police powers to view and use people’s personal data through national security certificates and designation notices; these are substantially the same as Clauses 28 to 30 of the previous DPDI Bill. This risks further eroding trust in law enforcement authorities. Accountability for access to data for law enforcement purposes should not be lowered, and data sharing should be underpinned by a robust test to ensure that individuals’ rights and expectations are not disproportionately impacted. It is baffling that the Government are so slavishly following their predecessor and believe that these new and unaccountable powers are necessary.

By opposing that Clause 81 stand part, I seek to retain the requirement for police forces to record the reason they are accessing data from a police database. The public need more, not less, transparency and accountability over how, why and when police staff and officers access and use records about them. Just recently, the Met Police admitted that they investigated more than 100 staff over the inappropriate accessing of information in relation to Sarah Everard. This shows that the police can and do access information inappropriately, and there may well be less prominent cases where police abuse their power by accessing information without fear of the consequences.

Regarding Amendments 126, 128 and 129, Rights and Security International has repeatedly argued that the Bill would violate the UK’s obligations under the European Convention on Human Rights. On Amendment 126, the requirements in the EU law enforcement directive for logging are, principally, to capture in all cases the justification for personal data being examined, copied, amended or disclosed when it is processed for a law enforcement process—the objective is clearly to ensure that data is processed only for a legitimate purpose—and, secondarily, to identify when, how and by whom the data has been accessed or disclosed. This ensures that individual accountability is captured and recorded.

Law enforcement systems in use in the UK typically capture some of the latter information in logs, but very rarely do they capture the former. Nor, I am informed, do many commodity IT solutions on the market capture why data was accessed or amended by default. For this reason, a long period of time was allowed under the law enforcement directive to modify legacy systems installed before May 2016, which, in the UK, included services such as the police national computer and the police national database, along with many others at a force level. This transitional relief extended to 6 May 2023, but UK law enforcement did not, in general, make the required changes. Nor, it seems, did it ensure that all IT systems procured after 6 May 2016 included a strict requirement for LED-aligned logging. By adopting and using commodity and hyperscaler cloud services, it has exacerbated this problem.

In early April 2023, the Data Protection Act 2018 (Transitional Provision) Regulations 2023 were laid before Parliament. These regulations had the effect of unilaterally extending the transitional relief period under the law enforcement directive for the UK from May 2023 to May 2026. The Government now wish to strike out completely the requirement to capture the justification for any access to data, on the basis that this would free up as much as 1.5 million hours a year of valuable police time so that officers can focus on tackling crime on our streets, rather than being bogged down by administration, and that it would save approximately £42.8 million a year in taxpayers’ money.

This is a serious legislative issue on two counts: it removes important evidence that may identify whether a person was acting with malicious intent when accessing data, as well as the deterrent effect of having to record a justification; and it directly deviates from a core part of the law enforcement directive, which will clearly have an impact on UK data adequacy. The application of effective control over access to data is very much a live issue in policing, and changing the logging requirement in this way does nothing to improve police data management. Rather, it excuses and perpetuates bad practice. Nor does it increase public confidence.

Clause 87(7) introduces new Section 78A into the Act. This lays down a number of exemptions and exclusions from Part 3 of that Act when the processing is deemed to be in the interests of national security. These exemptions are wide-ranging, and include the ability to suspend or ignore principles 2 to 6 in Part 3, and thus run directly contrary to the provisions and expectations of the EU law enforcement directive. Ignoring those principles in itself negates many of the controls and clauses in Part 3 in its entirety. As a result, these exemptions will almost certainly result in the immediate loss of EU law enforcement adequacy.

I welcome the ministerial letter from the noble Lord, Lord Hanson of Flint, to the noble Lord, Lord Anderson, of 6 November, but was he really saying that all the national security exemption clause does is bring the 2018 Act into conformity with the GDPR? I very much hope that the Minister will set out for the record whether that is really the case and whether it is really necessary to safeguard national security. Although it is, of course, appropriate and necessary for the UK to protect its national security interests, it is imperative that balance remains to protect the rights of a data subject. These proposals do not, as far as we can see, strike that balance.

Clause 88 introduces the ability of law enforcement, competent authorities and intelligence agencies to act as joint controllers in some circumstances. If Clause 88 and associated clauses go forward to become law, they will almost certainly again result in withdrawal of UK law enforcement adequacy and will quite likely impact on the TCA itself.

Amendment 127 is designed to draw attention to the fact that there are systemic issues with UK law enforcement’s new use of hyperscaler cloud service providers to process personal data. These issues stem from the fact that service providers’ standard contracts and terms of service fail to meet the requirements of Part 3 of the UK’s Data Protection Act 2018 and the EU law enforcement directive. UK law enforcement agencies are subject to stringent data protection laws, including Part 3 of the DPA and the GDPR. These laws dictate how personal data, including that of victims, witnesses, suspects and offenders, can be processed. Part 3 specifically addresses data transfers to third countries, with a presumption against such transfers unless strictly necessary. This contrasts with the UK GDPR, which allows routine overseas data transfer with appropriate safeguards.

Cloud service providers routinely process data outside the UK and lack the necessary contractual guarantees and legal undertakings required by Part 3 of the DPA. As a result, their use for law enforcement data processing is, on the face of it, not lawful. This non-compliance creates significant financial exposure for the UK, including potential compensation claims from data subjects for distress or loss. The sheer volume of data processed by law enforcement, particularly body-worn video footage, exacerbates the financial risk. If only a small percentage of cases result in claims, the compensation burden could reach hundreds of millions of pounds annually. The Government’s attempts to change the law highlight the issue and suggest that past processing on cloud service providers has not been in conformity with the UK GDPR and the DPA.

The current effect of Section 73(4)(b) of the Data Protection Act is to prevent competent authorities, which may have a legitimate operating need and should possess the internal capability to assess that need, from making transfers to recipients that are not relevant authorities or international organisations, such as cloud service providers. This amendment is designed to probe what impact the removal of this restriction would have and whether it would enable such transfers where they are justified and necessary. I beg to move.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Non-Afl)
- Hansard - - - Excerpts

My Lords, I will speak to Amendment 124. I am sorry that I was not able to speak on this issue at Second Reading. I am grateful to the noble and learned Lord, Lord Thomas of Cwmgiedd, for his support, and I am sorry that he has not been able to stay, due to a prior engagement.

Eagle-eyed Ministers and the Opposition Front Bench will recognise that this was originally tabled as an amendment to the Data Protection and Digital Information (No. 2) Bill. It is still supported by the Police Federation. I am grateful to the former Member of Parliament for Loughborough for originally raising this with me, and I thank the Police Federation for its assistance in briefing us in preparing this draft clause. The Police Federation understands that the Home Secretary is supportive of the objective of this amendment, so I shall listen with great interest to what the Minister has to say.

This is a discrete amendment designed to address an extremely burdensome and potentially unnecessary redaction exercise that arises where the police are preparing a case file for submission to the Crown Prosecution Service for a charging decision. Given that this issue was discussed on the previous Bill, I do not intend to go into huge amounts of detail, because we rehearsed the arguments there, but I hope very much that with the new Government there might be a willingness to entertain this as a change in the law.

19:15
The point of the amendment is that the existing data protection legislation requires our police forces to spend huge amounts of time and resources: first, in going through information that has been gathered by investigating officers to identify every single item of personal data contained in that information; secondly, in deciding whether it is necessary or, in many cases, strictly necessary for the CPS to consider each item of personal data when making its charging decision; and, thirdly, in redacting every item of personal data that does not meet that test. I ask the Committee to imagine, with things such as body cameras being worn by the police today, just how much personal data is being collected by every officer. The Police Federation and the National Police Chiefs’ Council estimate that the national cost of this redaction exercise is approximately £5.6 million per annum and that, since 1 January 2021, 365,000 policing hours have been consumed by it.
It is potentially unnecessary for any given case file because the CPS decides to charge in only approximately 75% of cases; in the 25% of cases where the CPS decides not to charge, the unredacted file could simply be deleted by the CPS. Where the CPS decides to charge, the case file could be returned to the police force to carry out the redaction exercise before there is any risk of that file being disclosed to any person or body other than the CPS.
The simple and practical solution set out in the amendment is for the police to carry out the redaction exercise in relation to any given case file only after the CPS has taken the decision to charge. What is proposed would not remove any substantive protection of the personal data in question. It would not remove the obligation to review and redact the personal data contained in material in the case file. It would simply provide for that review and redaction to be conducted by the police after, rather than before, a charging decision has been made by the CPS.
The Police Federation has discussed this issue and is grateful to the Home Office for the meeting that happened on 25 April. There appear to be two main objections, which I shall touch on. The first was that, even if the redaction of case files before submission to the CPS for a charging decision were not required by the data protection legislation, it is required by Article 8 of the ECHR. The second objection was that it was appropriate for the police to carry out the redaction exercise on case files before submission to the CPS because that would mean the case files would not contain irrelevant personal data, which could give rise to potential pitfalls further down the line.
I will not set out the long explanations but basically, in relation to the point about Article 8, the discussion and considerations have demonstrated clearly that it is the data protection legislation, not Article 8 of the ECHR, which requires the burdensome redaction exercise. Secondly, in relation to the “further down the line” objection, the short answer is that, under the proposal in the amendment, if a decision is made by the CPS not to charge then, as I said, the unredacted file can simply be deleted or placed in secure storage. It would not go any further down the line; it would do so only if a decision was made to charge, in which case the file could be redacted in the usual way.
This change would speed up the criminal justice process. It would considerably reduce the financial burden on the taxpayer and the massive number of police hours committed. Everything we hear from the current Government, with which I have huge amounts of sympathy, says that there is a need to reduce pressure on the public purse and to free up police time so that officers can do what I think all of us hope they will do: spending time on the streets, supporting victims and catching criminals, not spending hours redacting images from body-worn cameras just in case the CPS happens to use them in a charging decision. I look forward to hearing from the Minister in due course.
Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, I have Amendment 201 in this group. At the moment, Action Fraud does not record attempted fraud; a fraud has to have been successful before the website will record it. I think that results in the Government taking decisions based on distorted and incomplete data. Collecting full data must be the right thing to do.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I had expected the noble Baroness, Lady Owen of Alderley Edge, to be in the Room at this point. She is not, so I wish to draw the Committee’s attention to her Amendment 210. On Friday, many of us were in the Chamber when she made a fantastic case for her Private Member’s Bill. It obviously dealt with a much broader set of issues but, as we have just heard, the overwhelming feeling of the House was to support her. I think we would all like to see the Government wrap it up, put a bow on it and give it to us all for Christmas. But, given that that was not the indication we got, I believe that the noble Baroness’s intention here is to deal with the fact that the police are giving phones and devices back to perpetrators with the images remaining on them. That is an extraordinary revictimisation of people who have been through enough. So, whether or not this is the exact wording or way to do it, I urge the Government to look on this carefully and positively to find a way of allowing the police the legal right to delete data in those circumstances.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, none of us can be under any illusion about the growing threats of cyberattacks, whether from state actors, state-affiliated actors or criminal gangs. It is pretty unusual nowadays to find someone who has not received a phishing email, had hackers target an account or been promised untold riches by a prince from a faraway country. But, while technology has empowered these criminals, it is also the most powerful tool we have against them. To that end, we must do all we can do to assist the police, the NCA, the CPS, the SIS and their overseas counterparts in countries much like our own. That said, we must also balance this assistance with the right of individuals to privacy.

Regarding the Clause 81 stand part notice from the noble Lord, Lord Clement-Jones, I respectfully disagree with this suggestion. If someone within the police were to access police records in an unauthorised capacity or for malign reasons, I simply doubt that they would be foolish enough to enter their true intentions into an access log. They would lie, of course, rendering the log pointless, so I struggle to see—we had this debate on the DPDI Bill—how this logging system would help the police to identify unauthorised access to sensitive data. It would simply eat up hours of valuable police time. I remember from our time working on the DPDI Bill that the police supported this view.

As for Amendment 124, which allows for greater collaboration between the police and the CPS when making charging decisions, there is certainly something to be said for this principle. If being able to share more detailed information would help the police and the CPS come to the best decision for victims, society and justice, then I absolutely support it.

Amendments 126, 128 and 129 seek to keep the UK in close alignment with the EU regarding data sharing. EU alignment or non-alignment is surely a decision for the Government of the day alone. We should not look to bind a future Administration to the EU.

I understand that Amendment 127 looks to allow data transfers to competent authorities—that is, law enforcement bodies in other countries—that may have a legitimate operating need. Is this not already the case? Are there existing provisions in the Bill to facilitate such transfers and, if so, does this not therefore duplicate them? I would very much welcome the thoughts of both the Minister and the noble Lord, Lord Clement-Jones, when he sums up at the end.

Amendment 156A would add to the definition of “unauthorised access” so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for the access, and the person is not empowered to access it by an enactment. Given the amount of valuable personal data held by controllers as our lives continue to move online, there is real merit to this idea from my noble friend Lord Holmes, and I look forward to hearing the views of the Minister.

Finally, I feel Amendment 210 from my noble friend Lady Owen—ably supported in her unfortunate absence by the noble Baroness, Lady Kidron—is an excellent amendment as it prevents a person convicted of a sexual offence from retaining the images that breached the law. This will prevent them from continuing to use the images for their own ends and from sharing them further. It would help the victims of these crimes regain control of these images which, I hope, would be of great value to those affected. I hope that the Minister will give this serious consideration, particularly in light of noble Lords’ very positive response to my noble friend’s Private Member’s Bill at the end of last week.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

I think the noble Viscount, Lord Camrose, referred to Amendment 156A from the noble Lord, Lord Holmes—I think he will find that is in a future group. I saw the Minister looking askance because I doubt whether she has a note on it at this stage.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones; let me consider it a marker for future discussion.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones, for coming to my rescue there.

I turn to the Clause 81 stand part notice tabled by the noble Lord, Lord Clement-Jones, which would remove Clause 81 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record their processing activities, including their reasons for accessing and disclosing personal information. Entering a justification manually was intended to help detect unauthorised access. The noble Lord was right that the police do sometimes abuse their power; however, I agree with the noble Viscount, Lord Camrose, that the reality is that anyone accessing the system unlawfully is highly unlikely to record that, making this an ineffective safeguard.

Meanwhile, the position of the National Police Chiefs’ Council is that this change will not impede any investigation concerning the unlawful processing of personal data. Clause 81 does not remove the strong safeguards that ensure accountability for data use by law enforcement, including the requirement to record the time and date of access and, where possible, who has accessed the data; these records are far more effective in monitoring potential data misuse. We would argue that the requirement to manually record a justification every time case information is accessed places a considerable burden on policing. I think the noble Lord himself cited our estimate that this clause may save approximately 1.5 million policing hours, equivalent to a saving in the region of £42.8 million a year.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

There were some raised eyebrows.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Yes, we could not see the noble Lord’s raised eyebrows.

Turning to Amendment 124, I thank the noble Baroness, Lady Morgan, for raising this important issue. While I obviously understand and welcome the intent, I do not think that the legislative change is what is required here. The Information Commissioner’s Office agrees that the Data Protection Act is not a barrier to the sharing of personal data between the police and the CPS. What is needed is a change in the operational processes in place between the police and the CPS that are causing this redaction burden that the noble Baroness spelled out so coherently.

We are very much aware that this is an issue and, as I think the noble Baroness knows, the Government are committed to reducing the burden on the police and the Home Office and to exploring with partners across the criminal justice system how this can best be achieved. We absolutely understand the point that the noble Baroness has raised, but I hope that she will agree to give the Home Office and the CPS space to find a resolution, so that we do not impose the burden of redaction where it is not necessary. This is an ongoing discussion, as I know the noble Baroness is aware, and I hope on that basis that she will not pursue the amendment.

I will address Amendments 126 to 129 together. These amendments seek to remove parts of Schedule 8 to avoid divergence from EU legislation. The noble Lord, Lord Clement-Jones, proposes instead to remove existing parts of Section 73 of the Data Protection Act 2018. New Section 73(4)(aa), introduced by this Bill, with its bespoke path for personal data transfers from UK controllers to international processors, is crucial. In the modern age, where the use of such capabilities and the benefits they provide is increasing, we need to ensure that law enforcement can make effective use of them to tackle crime and keep citizens safe.

19:30
The aim of this reform is to provide legal clarity in the Bill to law enforcement agencies in the UK, so that they can embrace the technology they need and make use of international processors with confidence. Such transfers are already permissible under the legislation but we know that there is some ambiguity in how the law can be applied in practice. This reform intends to remove those obstacles. The noble Lord would like us to refrain from diverging from EU law. I believe that we have drafted the provisions in this Bill, including this one, with retaining adequacy in mind. As the noble Lord is aware, the Government are committed to maintaining our EU adequacy status.
Turning to the Clause 87 stand part notice: this clause replaces the current national security exemption under the law enforcement regime with a revised version that mirrors the exemptions already available to organisations operating under the UK GDPR and intelligence services regimes. It is essential that competent authorities have access to the full range of exemptions so that they are properly able to safeguard national security. For instance, if a law enforcement agency is investigating a data subject whom it suspects may be involved in an imminent terrorist attack, it is likely to need to share personal data with other agencies at very short notice.
Turning to the stand part notices on Clauses 88 and 89, these two clauses will enable qualifying competent authorities to jointly process data with an intelligence service under part 4 of the Data Protection Act 2018 in circumstances where it is required to safeguard national security. Part 4 of the 2018 Act regulates processing by the intelligence services, so to jointly process data in this manner the Secretary of State must approve the proposed processing by issuing a designation notice. That notice can be issued only following consultation with the ICO and if the Secretary of State is satisfied that processing is necessary to safeguard national security.
These joint partnerships were previously possible under the Data Protection Act 1998, and the reports on the Manchester Arena and Fishmongers’ Hall attacks highlight the public interest in closer joint working between law enforcement bodies and the intelligence services in matters of national security. I think the noble Lord also referenced the noble Lord, Lord Anderson. My understanding is that he has given his support to our proposals in the Bill.
With regard to Amendment 201, tabled by the noble Lord, Lord Lucas, attempted fraud can currently be reported to Action Fraud, the national reporting service for fraud and cybercrime. However, I can reassure the noble Lord that an improved service is being worked on, making the best use of technology to ensure the best experience for victims, intelligence for law enforcement and public data on fraud issues. All reports are analysed for intelligence that could support law enforcement in pursuing criminals and keeping the public safe. Key data, including outcomes, are published online and summarised in an interactive dashboard. I understand that further work is taking place on that improved service to replace Action Fraud. I therefore hope that the noble Lord will give the space for those proposals to come forward.
Finally, I turn to Amendment 210, tabled by the noble Baroness, Lady Owen; she is not here to speak to it. I put on record that we share her desire that images used to commit offences under Sections 66A or 66B of the Sexual Offences Act 2003 be removed from convicted offenders. However, there is already a process for this to happen. Under Section 153 of the Sentencing Act 2020, the court has the power to deprive an offender convicted of these offences of any property, including images, used for the purpose of committing those offences. Although judges’ use of these powers is a matter of judicial independence, we will closely examine the position and will revisit this if changes are felt to be necessary.
Considering all the explanations I have given, I hope that noble Lords will withdraw or not press their amendments.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I thank the Minister for her response on this group, which was, again, very detailed. There is a lot to consider in what she had to say, particularly about the clauses beyond Clause 81. I am rather surprised that the current Government are still going down the same track on Clause 81. It is as if, even though the risk of abuse is so high, this Government, like the previous one, have decided that it is not necessary to have the safeguard of recording the justification in the first place. Yet we have heard about the Sarah Everard police officers. It seems to me perverse not to require justification. I will read further what the Minister had to say, but it seems quite extraordinary to be taking away a safeguard at this time, especially when the Minister says that, at the same time, they need to produce logs of the time of the data being shared and so on. I cannot see what is to be gained; I certainly cannot see £42 million being saved. It is a very precise figure: £42.8 million. I wonder where the £800,000 comes from. It seems almost too precise to be credible.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I emphasise that we believe the safeguards are there. This is not a watering down of provisions. We are just making sure that the safeguards are more appropriate for the sort of abuse that we think might happen in future from police misusing their records. I do not want it left on the record that we do not think that is important.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

No. As I was saying, it seems that the Minister is saying that there will still be the necessity to log the fact that data has been shared. However, it seems extraordinary that, at the same time, it is not possible to say what the justification is. The justification could be all kinds of things, but it makes somebody think before they simply share the data. It seems to me that, given the clear evidence of abuse of data by police officers—data of the deceased, for heaven’s sake—we need to keep all the safeguards we currently have. That is a clear bone of contention.

I will read what else the Minister had to say about the other clauses in the group, which are rather more sensitive from the point of view of national security, data sharing abroad and so on.

Clause 81 agreed.
Amendment 124 not moved.
Clauses 82 to 84 agreed.
Amendment 125 not moved.
Schedule 7 agreed.
Schedule 8: Transfers of personal data to third countries etc: law enforcement processing
Amendments 126 to 129 not moved.
Schedule 8 agreed.
Schedule 9 agreed.
Clause 85: Safeguards for processing for research etc purposes
Amendments 130 to 132 not moved.
Clause 85 agreed.
Clauses 86 to 88 agreed.
Clause 89: Joint processing: consequential amendments
Amendment 133
Moved by
133: Clause 89, page 112, line 24, at end insert—
“(10) In section 199(2)(a) of the Investigatory Powers Act 2016 (bulk personal datasets: meaning of “personal data”), after “section 82(1) of that Act” insert “by an intelligence service”.”
Member’s explanatory statement
Clause 88 of the Bill amends section 82 in Part 4 of the Data Protection Act 2018 (intelligence services processing). This amendment makes a consequential change to a definition in the Investigatory Powers Act 2016 which cross-refers to section 82.
Lord Vallance of Balham Portrait Lord Vallance of Balham (Lab)
- Hansard - - - Excerpts

These four technical government amendments do not, we believe, have a material policy effect but will improve the clarity and operation of the Bill text.

Amendment 133 amends Section 199 of the Investigatory Powers Act 2016, which provides a definition of “personal data” for the purposes of bulk personal datasets. This definition cross-refers to Section 82(1) of the Data Protection Act 2018, which is amended by Clauses 88 and 89 of the Bill, providing for joint processing by the intelligence services and competent authorities. This amendment will retain the effect of that cross-reference to ensure that processing referred to in Section 199 of the IPA remains that done by an intelligence service.

Amendment 136 concerns Clause 92 and ICO codes of practice. Clause 92 establishes a new procedure for panels to consider ICO codes of practice before they are finalised. It includes a regulation-making power for the Secretary of State to disapply or modify that procedure for particular codes or amendments to them. Amendment 136 will enable the power to be used to disapply or modify the panel’s procedure for specific amendments or types of amendments to a code, rather than for all amendments to it.

Finally, Amendments 213 and 214 will allow for changes made to certain immigration legislation and the Online Safety Act 2023 by Clauses 55, 122 and 123 to be extended via existing powers in those Acts, exercisable by Orders in Council, to Guernsey and the Isle of Man, should they seek this.

I beg to move.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, I will keep my comments brief as these are all technical amendments to the Bill. I understand that Amendments 133 and 136 are necessary for the functioning of the law and therefore have no objection. As for Amendment 213, which extends immigration legislation amended by Clause 55 of this Bill to the Bailiwick of Guernsey or the Isle of Man, this is a sensible measure. The same can be said for Amendment 214, which extends the provisions of the Online Safety Act 2023, as amended by this Bill, to the Bailiwick of Guernsey or the Isle of Man.

Lord Vallance of Balham Portrait Lord Vallance of Balham (Lab)
- Hansard - - - Excerpts

I thank the noble Viscount.

Amendment 133 agreed.
Clause 89, as amended, agreed.
Clause 90: Duties of the Commissioner in carrying out functions
Amendment 134
Moved by
134: Clause 90, page 113, leave out lines 1 to 5 and insert—
“(a) to monitor the application of GDPR, the applied GDPR and this Act, and ensure they are fully enforced with all due diligence;
(b) to act upon receiving a complaint, to investigate, to the extent appropriate, the subject matter of the complaint, and to take steps to clarify unsubstantiated issues before dismissing the complaint.”
Member’s explanatory statement
This amendment removes the secondary objectives introduced by the Data Use and Access Bill, which frame innovation, competition, crime prevention and national security as competing objectives against the enforcement of data protection law.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, in moving Amendment 134—it is the lead amendment in this group—I shall speak to the others in my name and my Clause 92 stand part notice. Many of the amendments in this group stem from concerns that the new structure for the ICO will diminish its independence. The ICO is abolished in favour of the commission.

19:45
Part 6, which includes Clauses 115 to 118, establishes the information commission to replace the existing regulator. This provision abolishes the ICO and transfers all the duties and responsibilities of the existing commissioner to the new body corporate. Page 9 of the Bill’s Explanatory Notes explains that this change would give the regulator
“a more modern structure—while maintaining its independence”.
However, under Clause 91, the commissioner is required to consult the Secretary of State before preparing or amending codes of practice.
The problem remains, too, that the Secretary of State appoints the most important members of the commission. This ability to appoint has the potential to give the Secretary of State undue influence over the commission’s decision-making processes. What checks and balances, if any, will there be on the identity of commission members? The independence of the Information Commissioner’s Office was a key component in securing the designation of EU-UK data adequacy, and we are concerned that the transition to the information commission could dilute that independence, representing another threat to adequacy.
We are concerned that requiring the commissioner to consult the Secretary of State may also present the possibility of political influence. The commissioner may be put under pressure to support the Government’s growth and innovation agenda, which may be in tension with the need to protect data subjects’ personal data and data protection rights. The stand part notices for Clauses 91 and 92 would limit the Secretary of State’s powers and leeway to interfere with the objective and impartial functioning of the new information commission.
We believe that the Government should remove the provisions compelling the commissioner to consult the Secretary of State, thus reaffirming the UK’s commitment to the regulator’s independence. If so amended, the Bill would ensure that the new Information Commissioner is at sufficient arm’s length from the Government to oversee public and private bodies’ use of personal data with impartiality and objectivity.
As regards the other amendments in my name, Clause 90 introduces competing and ambivalent objectives that the new information commission would have to pursue, such as
“the desirability of promoting innovation”,
competition,
“public security and national security”,
and preventing crime. Strong, effective and objective data protection enforcement is important to ensure that innovation results in products and services that benefit individuals and society; to ensure that important public programmes retain the public trust they need to operate; and to ensure that companies compete fairly and are required to improve safety standards. However, Clause 90 builds on the false assumption that objectives such as innovation, economic growth and public security are competing interests that need balancing against data protection. By requiring the new information commission to adopt a more permissive and lenient approach to data protection breaches, Clause 90 could undermine the very policies it aims to promote.
The objective of promoting public trust and confidence in the processing of personal data also represents a significant change in emphasis and tone from the UK GDPR, Article 57 of which articulates the ICO’s task to
“promote public awareness and understanding of the risks, rules, safeguards and rights in relation to processing”.
Amendment 134 would amend Clause 90 to clarify the role and statutory objective of the Information Commissioner’s Office by removing unnecessary and potentially counterproductive objectives. This would clearly state in legislation that the ICO has a duty to investigate infringement and ensure the diligent application of data protection rules. If so amended, Clause 90 would promote clarity and consistency in the ICO’s regulatory function. As the Institute for Government points out:
“Clarity of roles and responsibilities is the most important factor for effectiveness”
of arm’s-length bodies such as the ICO.
I come to Amendment 144. The Information Commissioner’s Office has a poor track record on enforcement. In 2021-22, it did not serve a single GDPR enforcement notice, secured no criminal convictions and issued only four GDPR fines, totalling just £633,000, despite the fact that it received over 40,000 data subject complaints. Open Rights Group’s recently published ICO Alternative Annual Report shows that the ICO issued just one fine and two enforcement notices against public sector bodies, and that
“Only eight UK GDPR-related enforcement actions were taken against private sector organisations”.
In contrast, the ICO issued 28 reprimands to the public sector over the last financial year. Reprimands are written statements where the ICO expresses regret over an organisation’s failure to comply with data protection law, but they do not provide any incentive for change. A reprimand lacks legal force and organisations face no further consequences from one. Despite the fact that reprimands clearly lack deterrence, the ICO relies on them extensively and for serious violations of data protection laws.
I shall give a few examples. Police, prosecutors or the NHS have exposed the personal address details of victims of abuse, or witnesses to crime, to their abusers or those they were accusing, creating immediate personal and physical risks. In one example, the person affected had to move house. In another, patients of the University Hospitals of Derby and Burton NHS Foundation Trust did not receive medical treatment for up to two years. Two police authorities, West Mercia Police and Warwickshire Police, lost the detailed records of investigations they had made, which could have affected prosecutions or caused miscarriages of justice. Two police authorities, Sussex Police and Surrey Police, recorded the conversations of hundreds of thousands of individuals without their consent. There were also persistent failures by two police authorities and three local authorities to respond to subject access requests in a timely manner, over periods of up to five years.
The ICO decided to drop Open Rights Group’s and several members of the public’s complaints against Meta’s reuse of personal data to train AI without carrying out any meaningful probe, despite substantiated evidence that Meta’s practices do not comply with data protection law. This includes the fact that pictures of children on parents’ Facebook profiles could end up in Meta’s AI model as it assumes consent, yet the ICO has not even launched an investigation.
Evidence shows that overreliance on reprimands provides no deterrent to lawbreakers. For instance, the Home Office was issued with three consecutive reprimands in 2022 for a number of data protection breaches, for recording and publishing conversations with Windrush victims without consent, and for a systemic failure to answer subject access requests within statutory limits, with over 22,000 requests handled late. Against this background, the ICO issued yet another reprimand to the Home Office in 2024. The Home Office’s persistent non-compliance with data protection law is a good example of how reprimands, if not backed by the threat of substantive enforcement action, fail to provide a deterrent and are thus ignored by the public sector.
The fact is that the ICO has consistently relied on non-binding and largely symbolic enforcement actions in response to serious infringements of the law. Indeed, the Information Commissioner has publicly stated his intention not to rely on fines against big private sector organisations because
“fines against big tech companies are ineffective”.
This opinion has, of course, been widely disputed by data protection experts and practitioners, including the former Information Commissioner, Elizabeth Denham.
Amendment 144 would impose a limit on the number of reprimands that the ICO can give to a given organisation without adopting any substantive regulatory action, such as an enforcement notice and a fine. This would ensure that the ICO could not evade its regulatory responsibilities by adopting enforcement actions that lack deterrence or the force of law.
Amendments 163 to 166 and 168 to 192 to Schedule 14 are designed to replace the involvement of the Secretary of State with the commissioner and to transfer the responsibility for appointing the commissioner from the Government to Parliament. They would also modify Schedule 14 to transfer budget responsibility and the appointment process for the non-executive members of the Information Commission to the relevant Select Committee.
The Bill as drafted will provide significant powers for the Secretary of State to interfere with the objective and impartial functioning of the new Information Commissioner, such as by appointing non-executive members of the newly formed Information Commission, or by introducing a requirement for the new Information Commission to consult the Secretary of State before laying a code of practice before Parliament.
The monitoring and enforcement of data protection laws must be carried out objectively and free from partisan or extralegal considerations but there appears to be a lack of criticality—speaking truth to power—in the present ICO. The commissioner expressed views on the DPDI Bill that match those of the Government, despite widespread criticism coming from other arm’s-length bodies such as the National Data Guardian, the Biometrics and Surveillance Camera Commissioner, the Scottish Biometrics Commissioner and the Equality and Human Rights Commission. The Information Commissioner has once again welcomed this Bill, despite the fact that the new Bill dropped several provisions of the old DPDI Bill that the ICO was previously supportive of. Where is the objective and constructive feedback on government policies?
The other amendments in this group are designed to remove the involvement of the Secretary of State and to transfer the responsibility for appointing the commissioner from the Government to Parliament. Amendment 167A would ensure that the non-executive members of the commission have a sufficient balance of expertise to inform the commission beyond purely data protection issues. There is concern that the ICO will simply draw its NEDs from the same narrow profile of data protection lawyers as has previously been the case. We know from the European Union that it is important that regulators understand the broader horizon and appropriately balance GDPR enforcement against other fundamental rights, such as civil liberties, and against the economic impact that rulings can have. Will the Minister agree that the ICO should be looking for a broad range of expertise that can aid its decision-making in the reformed structure? I beg to move.
Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

I have Amendment 135A in this group. The Bill provides a new set of duties for the Information Commissioner but, unlike the DPDI Bill, no strategic framework. The Information Commissioner is a whole-economy regulator, and to my mind the Government’s strategic priorities should bear on it. This amendment would provide an enabling power such as that which the Competition and Markets Authority, which is in an equivalent economic position, already has.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I have huge sympathy for, and experience of, many of the issues raised by the noble Lord, Lord Clement-Jones, but, given the hour, I will speak only to Amendment 145 in my name and those of the noble Baroness, Lady Harding, my noble friend Lord Russell and the noble Lord, Lord Stevenson. Given that I am so critical, I want to say how pleased I am to see the ICO reporting requirements included in the Bill.

Amendment 145 is very narrow. It would require the ICO to report specifically and separately on children. It is fair to say that one of the many frustrations for those of us who spend our time advocating for children’s privacy and safety is trying to extrapolate child-specific data from generalised reporting. Often such data is not reported at all, which conveniently hides some of the inadequacies in the level of protection afforded to children. For example, none of the community guidelines enforcement reports published for Instagram, YouTube, TikTok or Snapchat provides a breakdown of the violation rate by age group, even though that would provide valuable information for academics, Governments, legislators, NGOs and, of course, regulators. It was a point of contention between many civil society organisations and Ofcom that, supposedly, there was no evidence that children of different ages react in different ways; for anyone who has had children, that is clearly not the case.

Similarly, for many years we struggled to understand Ofcom’s reporting because older children were included in a group that went up to 24, and it took over 10 years for that to change. It seems to me—I hope the Government agree—that since children are entitled to specific data privacy benefits, it follows that the application and enforcement of those benefits should be reported separately. I hope that the Government can give a quick yes on this small but important amendment.

20:00
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, given the hour, I will try to be as brief as possible. I will start by speaking to the amendments tabled in my name.

Amendment 142 seeks to prevent the Information Commissioner’s Office from sending official notices via email. Official notices from the ICO will not be trivial: they relate to serious matters of data protection, such as monetary penalty notices or enforcement notices. My concern is that it is all too easy for an email to be missed. An email may be filtered into a spam folder, where it sits for weeks before being picked up. It is also possible that an email may be sent to a compromised email address, meaning one that the holder has lost control of due to a hacker. These concerns led me also to table Amendment 143, which removes the assumption that a notice sent by email has been received within 48 hours of being sent.

Additionally, I suspect I am right in saying that a great many people expect official correspondence to arrive via the post. I wonder, therefore, whether there might be a risk that people ignore an unexpected email from the ICO, concerned that it might well be a scam or a hack of some description. I, for one, am certainly deeply suspicious of unexpected but official-looking messages that arrive. I believe that official correspondence which may have legal ramifications should really be sent by post.

On some of the other amendments tabled, Amendment 135A, which seeks to introduce a measure from the DPDI Bill, makes provision for the introduction of a statement of strategic priorities by the Secretary of State that sets out the Government’s data protection priorities, to which the commissioner must have regard, and the commissioner’s duties in relation to the statement. Although I absolutely accept that this measure would create more alignment and efficiency in the way that data protection is managed, I understand the concerns that it would undermine the independence of the Information Commissioner’s Office. That in itself, of course, would tend to bear on the adequacy risk.

I do not support the stand part notices on Clauses 91 and 92. Clause 91 requires the Information Commissioner to prepare codes of practice for the processing of data, which seems a positive measure. It provides guidance to controllers, helping them to follow best practice when processing data, and is good for data subjects, as it makes it more likely that their data will be processed in an appropriate manner. As for Clause 92, which would effectively increase expert oversight of codes of practice, surely that would lead to more effective codes, which will benefit both controllers and data subjects.

I have some concerns about Amendment 144, which would limit the Information Commissioner to sending only one reprimand to a given controller during a fixed period. If a controller or processor conducts activities that infringe the provisions of the GDPR, and does so repeatedly, why should the commissioner be prevented from issuing reprimands? Indeed, what incentive does that give to someone who commits a minor sin first and a major one later?

I welcome Amendment 145, in the name of the noble Baroness, Lady Kidron, which would ensure that the ICO’s annual report records activities and action taken by the ICO in relation to children. This would clearly give the commissioner, parliamentarians and the data and tech industry as a whole a better understanding of how policies are affecting children and what changes may be necessary.

Finally, I turn my attention to the many amendments tabled by the noble Lord, Lord Clement-Jones, which seek to remove the involvement of the Secretary of State from the functions of the commissioner and transfer the responsibility from government to Parliament. I absolutely understand the arguments the noble Lord advances, as persuasively as ever, but even so I believe that the Secretary of State for the relevant department is the best person to work with the commissioner to ensure both clarity of purpose and rapidity of decision-making.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

I wanted to rise to my feet in time to stop the noble Viscount leaping forward as he gets more and more excited as we reach—I hope—possibly the last few minutes of this debate. I am freezing to death here.

I wish only to add my support to the points of the noble Baroness, Lady Kidron, on Amendment 145. It is a much overused saw, but if it is not measured, it will not get reported.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I thank noble Lords for their consideration of the issues before us in this group. I begin with Amendment 134 from the noble Lord, Lord Clement-Jones. I can confirm that the primary duty of the commissioner will be to uphold the principal objective: securing an appropriate level of data protection, carrying out the crucial balancing test between the interests of data subjects, controllers and wider public interests, and promoting public trust and confidence in the use of personal data.

The other duties sit below this objective and do not compete with it—they do not come at the expense of upholding data protection standards. The commissioner will have to consider these duties in his work but will have discretion as to their application. Moreover, the new objectives inserted by the amendment concerning monitoring, enforcement and complaints are already covered by legislation.

I thank the noble Lord, Lord Lucas, for Amendment 135A. The measure was a feature of the previous DPDI Bill, but the Government have decided that a statement of strategic priorities for the ICO is not necessary in this Bill. The Government will of course continue to set out their priorities in relation to data protection and other related areas and discuss them with the Information Commissioner as appropriate.

Amendment 142 from the noble Viscount, Lord Camrose, would remove the ICO’s ability to serve notices by email. We would argue that email is a fast, accessible and inexpensive method for issuing notices. I can reassure noble Lords that the ICO can serve a notice via email only if it is sent to an email address published by the recipient or where the ICO has reasonable grounds to believe that the notice will come to the attention of the person, significantly reducing the risk that emails may be missed or sent to the wrong address.

Regarding the noble Viscount’s Amendment 143, the assumption that an email notice will be received within 48 hours is reasonable and is consistent with the legislation governing other regulators, such as the CMA and Ofcom.

I thank the noble Lord, Lord Clement-Jones, for Amendment 144, concerning the ICO’s use of reprimands. The regulator does not commonly issue multiple reprimands to the same organisation, but it is important that the ICO, as an independent regulator, has the discretion and flexibility to issue multiple reprimands within a particular period where there is a legitimate need, without arbitrary limits being placed on that discretion.

Turning to Amendment 144A, the new requirements in Clause 101 will already lead to the publication of an annual report, which will include the regulator’s investigation and enforcement activity. Reporting will be categorised so that, where the detail of cases is not public, commercially sensitive investigations are not inadvertently disclosed. Splitting out reporting by country or locality would make it more difficult to protect sensitive data.

Turning to Amendment 145, with thanks to the noble Baroness, Lady Kidron, I agree that it is important to ensure that the regulator can be held to account effectively on this issue. The new annual report in Clause 101 will cover all the ICO’s regulatory activity, including that taken to uphold the rights of children. Clause 90 also requires the ICO to publish a strategy and to report on how it has complied with its new statutory duties. Both of these will cover the new duty relating to children’s awareness and rights, and this should include the ICO’s activity to support and uphold its important age-appropriate design code.

I thank the noble Lord, Lord Clement-Jones, for Amendments 163 to 192 to Schedule 14, which establishes the governance structure of the information commission. The approach taken in the Bill, including the responsibilities conferred on the Secretary of State, follows standard corporate governance best practice and reflects the Government’s commitment to safeguarding the independence of the regulator. This includes requiring the Secretary of State to consult the chair of the information commission before appointing non-executive members.

Amendments 165 and 167A would require members of the commission to be appointed to oversee specific tasks and to be from prescribed fields of expertise. Due to the commission’s broad regulatory remit, the Government consider that it would not be appropriate or helpful for the legislation to set out specific areas that should receive prominence over others. The Government are confident that the Bill will ensure that the commission has the right expertise on its board. Our approach safeguards the integrity and independence of the regulator, draws clearly on established precedent and provides appropriate oversight of its activities.

Finally, Clauses 91 and 92 were designed to ensure that the ICO’s statutory codes are consistent in their development, informed by relevant expertise and take account of their impact on those likely to be affected by them. They also ensure that codes required by the Secretary of State have the same legal effect as pre-existing codes published under the Data Protection Act.

Considering the explanations I have offered, I hope that the noble Lords, Lord Clement-Jones and Lord Lucas, the noble Viscount, Lord Camrose, and the noble Baroness, Lady Kidron, will agree not to press their amendments.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I thank the Minister for that response. If I speak for four minutes, that will just about fill the gap, but I hope to speak for less than that.

The Minister’s response was very helpful, particularly the way in which she put the clarification of objectives. Of course, this is shared with other regulators, where this new growth duty needs to be set in the context of the key priorities of the regulator. My earlier amendment reflected a nervousness about adding innovation and growth duties to a regulator, which may be seen to unbalance the key objectives of the regulator in the first place, but I will read carefully what the Minister said. I welcome the fact that, unlike in the DPDI Bill, there is no requirement for a statement of strategic priorities. That is why I did not support Amendment 135A.

It is somewhat ironic that, in discussing a digital Bill, the noble Viscount, Lord Camrose, decided to go completely analogue, but that is life. Maybe that is what happens to you after four and a half hours of the Committee.

I do not think the Minister covered the ground on the reprimands front. I will read carefully what she said about the annual report and the need for the ICO, or the commission as it will be, to report on its actions. I hope that, just by putting down these kinds of amendments on reprimands, the ICO will take notice. I have been in correspondence with the ICO myself, as have a number of organisations. There is some dissatisfaction, particularly over companies such as Clearview, where it is felt that the ICO has not taken adequate action on scraping and the building of databases from the internet. We will see whether the ICO becomes more proactive in that respect. I was reassured, however, by what the Minister said about NED qualifications and the general commitment to the independence of the regulator.

There is much to chew on in what the Minister said. In the meantime, I beg leave to withdraw my amendment.

Amendment 134 withdrawn.
Committee adjourned at 8.14 pm.