Data (Use and Access) Bill [HL]

Viscount Colville of Culross (CB)

My Lords, I thank the Minister for her engagement and for defining what genuine scientific research is. I hope very much that the AI companies, when using this extraordinary exemption, will listen to the Government, and that the Government will ensure that the policy is enforced. The trust of the people of this country would be lost if they felt that their data was being reused by AI companies simply for product enrichment and profit, rather than for genuine scientific research. I thank the noble Viscount, Lord Camrose, and the noble Lord, Lord Clement-Jones, for their parties’ support.

Lord Clement-Jones (LD)

My Lords, I too thank the Minister for her introduction to the three Motions in this group.

On these Benches, we welcome the Supreme Court’s judgment on the meaning of “sex” in the Equality Act 2010. However, as Ministers have stressed—and we agree—it is paramount that we work through the implications of this judgment carefully and sensitively. As we have previously discussed, the EHRC is currently updating its statutory guidance.

Ministers have previously given assurances that they are engaged in appropriate and balanced work on data standards and data accuracy, and we accept those assurances. They have given a further assurance today about how the digital verification services framework will operate. We rely on those ministerial assurances. In summary, we believe that the previously proposed amendments were premature in the light of the EHRC guidance and that they risk undermining existing data standards work. On that basis, we support the Minister in her Motions A and D.

Turning to Motion B, the noble Viscount, Lord Colville, will not press his Amendment 43B at this stage, as he intends to accept the assurances given by Ministers. We have consistently supported the noble Viscount’s efforts to ensure that scientific research benefiting from the Bill’s provisions for data reuse is conducted according to appropriate ethical, legal and professional frameworks. The Government have given significant assurances in this area. We understand that their position is that the Bill does not alter the existing legal definition or threshold for what constitutes scientific research under UK GDPR. The Bill does not grant any new or expanded permissions for the reuse of data for scientific research purposes, and, specifically, it does not provide blanket approval for using personal data for training AI models under the guise of scientific research. The use of personal data for scientific research remains subject to the comprehensive safeguards of UK GDPR, including the requirement for a lawful basis, the adherence to data protection principles and the application of the reasonableness test, which requires an objective assessment.

The collection of assurances given during several stages of the Bill provides reassurance against the risk that commercial activities, such as training AI models purely for private gain, could improperly benefit from exemptions intended for genuine scientific research serving the public good. I very much hope that the Minister can reaffirm these specific points and repeat those assurances.

Baroness Jones of Whitchurch (Lab)

My Lords, I thank noble Lords for their contributions. I reassure your Lordships’ House that the Government are progressing workstreams focused on the accuracy and reliability of sex data in public authority datasets in a holistic and measured manner, as I have described in previous debates. We welcome the Supreme Court ruling, and are now working hard to consider those findings and the upcoming guidance from the equalities regulator, which will help.

I reiterate that the trust framework requires DVS providers to comply with data protection legislation, including the data accuracy principle, where they use and share personal data. That includes the creation of reusable digital identities, as well as one-off checks. If they fail to comply with these requirements, they could lose their certification. This means that the sex information listed on a passport—which, as we all know, could be a combination of biological sex, legal sex under the Gender Recognition Act and gender identity—cannot be used to verify biological sex.

The noble Lord, Lord Arbuthnot, asked whether a person can have different genders appearing on different documents. Yes, a person could have different genders appearing on different documents, but those documents could not be used to prove biological sex.

I should say to noble Lords that there is a requirement for all this information to be recreated, reused and rechecked each time. In response to noble Lords who asked about historic data, the data will be renewed and checked against the new information that is now available.

In the majority of cases where DVS are used, there will not be a need to verify biological sex, as we have noted before, because many DVS requirements do not ask that question. Data sharing under the power created in Clause 45 will involve new processing of data, which must be in compliance with the data accuracy principle: that is, it must be accurate for the purpose for which the information will be used. Of particular relevance, given that public authorities will be sharing data for verification purposes, is the fact that data accuracy principles require that the personal data must not be misleading.

With regard to the question from the noble Baroness, Lady Ludford, about supplementary codes of practice, I can confirm that the trust framework already includes requirements on data accuracy for DVS providers. That framework will, of course, be updated from time to time.

On scientific research, let me repeat my thanks to the noble Viscount, Lord Colville, for his contribution on this issue. I am glad that he was reassured by my remarks and that we have been able to come to an agreeable resolution. I very much concur with the comments of the noble Lord, Lord Clement-Jones, that there has to be an ethical basis to those standards; that point is absolutely well made.

On that basis, I hope I have reassured noble Lords. I commend the Motion to the House.

--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, I declare an interest as chair of the Authors’ Licensing and Collecting Society.

I express the extremely strong support of all on these Benches for Motion C1, proposed by the noble Baroness, Lady Kidron. I agree with every speech that we have heard so far in today’s debate—I did not hear a single dissenting voice to the noble Baroness’s Motion. Once again, I pay tribute to her; she has fought a tireless campaign for the cause of creators and the creative industries throughout the passage of the Bill.

I will be extremely brief, given that we want to move to a vote as soon as possible. The House has already sent a clear message by supporting previous amendments put forward by the noble Baroness, and I hope that the House will be as decisive today. As we have heard this afternoon, transparency is crucial. This would enable the dynamic licensing market that is needed, as we have also heard. How AI is developed and who it benefits are two of the most important questions of our time—and the Government must get the answer right. As so many noble Lords have said, the Government must listen and must think again.

Viscount Camrose (Con)

My Lords, it is probably redundant to pay tribute to the noble Baroness, Lady Kidron, for her tenacity and determination to get to a workable solution on this, because it speaks for itself. It has been equally compelling to hear such strong arguments from all sides of the House and all Benches—including the Government Benches—that we need to find a solution to this complex but critical issue.

Noble Lords will recall that, on these Benches, we have consistently argued for a pragmatic, technology-based solution to this complex problem, having made the case for digital watermarking both in Committee and on Report. When we considered the Commons amendments last week, we worked closely with the noble Baroness, Lady Kidron, to find a wording for her amendment which we could support, and were pleased to be able to do so and to vote with her.

It is important that the Government listen and take action to protect the rights of creatives in the UK. We will not stop making the case for our flourishing and important creative sector. We have put that case to Ministers, both in your Lordships’ House and at meetings throughout the passage of the Bill. As a responsible Opposition, though, it is our view that we must be careful about our approach to amendments made by the elected House. We have, I hope, made a clear case to the Government here in your Lordships’ House and the Government have, I deeply regret to say, intransigently refused to act. I am afraid that they will regret their failure to take this opportunity to protect our creative industries. Sadly, there comes a point where we have to accept that His Majesty’s Government must be carried on and the Government will get their Bill.

Before concluding, I make two final pleas to the Minister. First, as others have asked, can she listen with great care to the many artists, musicians, news organisations, publishers and performers who have called on the Government to help them more to protect their intellectual property?

Secondly, can she find ways to create regulatory clarity faster? The process that the Government envisage to resolve this issue is long—too long. Actors on all sides of the debate will be challenged by such a long period of uncertainty. I understand that the Minister is working at pace to find a solution, but not necessarily with agility. I echo the brilliant point made by my noble friend Lady Harding that agility and delivering parts of the solution are so important to pick up the pace of this, because perfect is the enemy of good in this instance. When she gets up to speak, I hope that the Minister will tell us more about the timeline that she envisages, particularly for the collaboration of DSIT and DCMS.

This is a serious problem. It continues to grow and is not going away. Ministers must grip it with urgency and agility.

Data (Use and Access) Bill [HL]

Lord Tarassenko (CB)

My Lords, I rise to speak on Motion 43A as someone with regular day-to-day experience of scientific research. Since I started my PhD in 1981, I have had the privilege of spending more than half my working life doing scientific research in the UK—the last 20 years working with very sensitive patient data. Most of that research has been carried out in an academic setting, but some of it has been in collaboration with medtech, AI and pharmaceutical companies.

This research has required me to become familiar with many three-letter and four-letter acronyms. Noble Lords will know about DBS, but they might not know about RSO, TRO, HRA, LREC, MREC, CAG, and IRAS, to name just a few. I have spent hundreds of hours working with clinical colleagues to fill in integrated research application system—IRAS—forms. IRAS is used to apply for Health Research Authority—HRA—approval for research projects involving the NHS, social care or the criminal justice system. I have appeared before not only medical research ethics committees, or MRECs, which test whether a research protocol is scientifically valid and ethical, but local research ethics committees, or LRECs, which consider the suitability of individual researchers and local issues.

I was involved in a research project which reused data acquired from patients on a Covid isolation ward during the first two waves of the pandemic. That research project sought to understand how nurses interpreted continuous data from the clinical-grade wearables we used to monitor these high-risk patients during Covid. It took our research team more than 18 months to obtain the relevant permissions to reuse the data for our proposed analysis. Our application was reviewed by the Confidentiality Advisory Group—CAG—which provides independent expert advice on the use of confidential patient information without consent for research and non-research purposes. CAG already considers whether accessing the confidential data is justified by the public interest. Its advice is then used by the HRA and the Secretary of State for Health and Social Care to decide whether to grant access to the confidential data.

The existing provisions in this country to allow access to data for research purposes are stringent, and it is entirely right that they should be. The UK is respected the world over for the checks and balances of its research governance. The relevant safeguards already exist in the current legislation. Adding a further public interest test will only increase the amount of bureaucracy that will inevitably be introduced by the research services offices, or RSOs, and the translational research offices, or TROs, of our universities, which are very good at doing this.

The extra burden will fall on the researchers themselves, and some researchers may decide to concentrate their available time and energy elsewhere. This amendment, I am afraid, will have the unintended consequence of having a negative impact on research in this country, so I cannot support it.

Lord Clement-Jones (LD)

My Lords, an onlooker might be forgiven for not perceiving a common theme in this group of amendments, but I thank the Minister for his introduction and the noble Viscounts for introducing their amendments so clearly.

I acknowledge that Motion 32A and Amendments 32B and 32C and Motion 52A and Amendments 52B and 52C from the noble Viscount, Lord Camrose, are considerably less prescriptive than the Spencer amendment in the House of Commons to introduce new Clause 21, which seemed to require public authorities to comb through every record to rectify data, went significantly further than the findings of the Supreme Court judgment, and potentially failed to account for the privacy afforded to GRC holders under the Gender Recognition Act. However, the Liberal Democrats will abstain from votes on the noble Viscount’s amendments for several key reasons.

Our primary reason is the need to allow time for the EHRC’s guidance to be finalised. I thought the Minister made his case there. The EHRC is currently updating its code of practice, as we have heard, to reflect the implications of the Supreme Court judgment on the meaning of sex in the Equality Act, with the aim of providing it to the Government by the end of June. This guidance, as I understand it, is intended specifically to support service providers, public bodies and others in understanding their duties under the Equality Act and putting them into practice in the light of the judgment. The EHRC is undertaking a public consultation to understand how the practical implications can best be reflected. These amendments, in our view, are an attempt to jump the gun on, second-guess or at the least pre-empt the EHRC’s code of practice.

On these Benches, we believe that any necessary changes or clarifications regarding data standards should be informed by the official guidance and implemented consistently across public authorities in a coherent and workable manner, which means allowing time for the EHRC’s guidance to be finalised. We also have concerns about workability and clarity. Although the amendments proposed by the noble Viscount, Lord Camrose, are less prescriptive than the similar proposals tabled in the Commons by Dr Spencer, questions arise about their practical implementation: how, for example, would public authorities reliably ascertain biological sex if someone has a gender recognition certificate and has updated their birth certificate? I have long supported same-sex wards in the NHS, but I do not believe that these amendments are helpful in pursuing clarity following the Supreme Court judgment. We heard what the Minister had to say about passports.

I welcome the clarity provided by the Supreme Court judgment, but there are clearly implications, both practical and legal, to be worked out, such as those mentioned by the noble Viscount, Lord Hailsham. I thought he put his finger on many of those issues. I trust that the EHRC will deliver the right result. I agree that data needs to be accurate, and I welcome the Sullivan report, as did my noble friend. In summary, we will be abstaining. We believe that the EHRC process needs to conclude and provide comprehensive guidance, while also reflecting concerns about the workability and appropriateness of specific legislative interventions on data standards at this time.

I move on to Amendment 43B, tabled by the noble Viscount, Lord Colville. This amendment may not reinstate the precise wording

“conducted in the public interest”

that we previously inserted in this House, but it would introduce safeguards that seek to address the same fundamental concerns articulated during our debate on Report. It does two important things.

First, it provides a definition of “scientific research”, clarifying it as

“creative and systematic work undertaken in order to increase the stock of knowledge”.

This directly addresses the concerns raised on Report that the line between product development and scientific research is often blurred, with developers sometimes presenting efforts to increase model capabilities or study risks as scientific research. Having a clear definition helps to distinguish genuine research from purely commercial activity cloaked as such.

Secondly, and critically, Amendment 43B would require:

“To meet the reasonableness test”

already present in the Bill,

“the activity being described as scientific research must be conducted according to appropriate ethical, legal and professional frameworks, obligations and standards”.

This requirement seeks to embed within the reasonableness test the principles that underpinned our arguments for the public interest requirement on Report. It is the same as the amendment put forward by the chair of the Science, Innovation and Technology Select Committee, Chi Onwurah MP, which ties the definition to that in the OECD’s Frascati Manual: Guidelines for Collecting and Reporting Data on Research and Experimental Development:

“creative and systematic work undertaken in order to increase the stock of knowledge—including knowledge of humankind, culture and society—and to devise new applications of available knowledge”.

The Frascati framework is used worldwide by Governments, universities and research institutions to report R&D statistics, inform science policy and underpin R&D tax credit regimes, and it serves as a common language and reference point for international comparisons and policy decisions related to scientific research and innovation. These frameworks, obligations and standards are important because they serve the very purposes we previously identified for the public interest test: ensuring societal benefit, building public trust, preventing misuse for commercial ends, addressing harmful applications, and maintaining alignment with standards.

Amendment 43B in the name of the noble Viscount, Lord Colville, is a thoughtful and necessary counter-proposal. It is Parliament’s opportunity to insist that the principles of public benefit, trust and responsible conduct, rooted in established frameworks, must remain central to the definition of scientific research that benefits from data re-use exceptions.

I heard what the noble Lord, Lord Winston, had to say in his very powerful speech, but I cannot see how the amendment from the noble Viscount, Lord Colville, cuts across all the things that he wants to see in the outcomes of research.

Lord Winston (Lab)

As the noble Lord has mentioned my name, I simply ask him this question: does he recall the situation only some 45 years ago when there was massive public outcry about in vitro fertilisation, when there were overwhelming votes against in vitro fertilisation in both Houses of Parliament on two occasions, and when, finally, a Private Member’s Bill was brought which would have abolished IVF in this country? Had that happened, of course, an amendment such as this would have prevented the research happening in England and would have made a colossal difference not only to our knowledge of embryo growth but also to our knowledge of development, ageing, the development of cancer and a whole range of things that we never expected from human embryology. I beg the noble Lord to consider that.

Lord Clement-Jones (LD)

My Lords, I have had a misspent not-so-youth over the past 50 years. As a lawyer, when I read the wording in the amendment, I cannot see the outcome that he is suggesting. This wording does not cut across anything that he has had to say. I genuinely believe that. I understand how genuine he is in his belief that this is a threat, but I do not believe this wording is such a threat.

I also understand entirely what the noble Lord, Lord Tarassenko, had to say, but an awful lot of that was about the frustration and some of the controls over health data. That does not apply in many other areas of scientific research. The Frascati formula is universal and well accepted. The noble Viscount made an extremely good case; we should be supporting him.

Lord Vallance of Balham (Lab)

I thank the noble Viscount, Lord Camrose, for his Motion 32A and Amendments 32B and 32C, and Motion 52A and Amendments 52B and 52C. I reiterate that this Government have been clear that we accept the Supreme Court judgment on the meaning of sex for equalities legislation. However, as the noble Viscount, Lord Hailsham, says, it is critically important that the Government work through the effect of this ruling with care, sensitivity and in line with the law.

When it comes to public sector data, we must work through the impacts of this judgment properly. This would involve considering the scope of the judgment and the upcoming EHRC guidance. Critically, the Equality and Human Rights Commission has indicated that it will be updating its statutory code of practice for services, public functions and associations in light of this ruling, which will include some of the examples raised this afternoon, including by my noble friend Lady Hayter.

Ministers will consider the proposals once the EHRC has submitted its updated draft. It is right that the Government and, indeed, Parliament fully consider this guidance alongside the judgment itself before amending the way that public authorities collect, hold and otherwise process data—a point made by the noble Lord, Lord Clement-Jones, about the EHRC guidance.

I set out in my opening speech that this Government take the issue of data accuracy seriously. That is why, as I outlined, there are numerous existing work streams addressing the way in which sex and gender data are collected and otherwise processed across the public sector.

The digital verification services amendments that we have discussed today are misplaced, because the Bill does not alter the evidence and does not seek to alter the content of data used by digital verification services. Instead, the Bill enables people to do digitally what they can do physically. It is for organisations to consider what specific information they need to verify their circumstances, and how they go about doing that. Any inconsistency between what they can do digitally and what they can do physically would cause further confusion.

While this Government understand the intention behind the amendments, the concerns regarding the way in which public authorities process sex and gender data should be considered holistically, taking into account the effects of the Supreme Court ruling, the upcoming guidance from the equalities regulator and the specific requirements of public authorities. It is very unlikely that the digital verification services would be used for many of the cases specifically raised by or with many noble Lords. We expect DVS to be used primarily to prove things like one’s right to work or one’s age, address or professional educational qualifications.

The noble Viscount, Lord Hailsham, rightly highlights that the proposals have the potential to interfere with the right to respect for private and family life under the Human Rights Act by, in effect, indiscriminately and indirectly pushing public authorities to record sex as biological sex in cases where it is not necessary or proportionate in that particular circumstance. I raise the example that has been brought up several times, and again by the noble Baroness, Lady Fox: it is not relevant for the French passport officer to know your biological sex. That is not the purpose of the passport.

We acknowledge, however, that there are safeguards that address the concerns raised by noble Lords, including those of the noble Viscount, Lord Camrose, and the noble Lord, Lord Arbuthnot, regarding information being shared under Clause 45, without presenting issues that could cut across existing or prospective legislation and guidance. I remind the House that the data accuracy principle is already included in law. The principle requires that only data accurate for the purpose for which it is held can be used. Again, there are workstreams looking at data use to answer the points raised by the noble Lord, Lord Arbuthnot, and indeed by the noble and learned Baroness, Lady Butler-Sloss.

The noble Baroness, Lady Ludford, asked why it was not accurate for 15 years and what that means about our reliance on this accuracy. I am afraid the fact is that it was not accurate for 15 years because there was a muddle about what was being collected. There was no requirement to push for biological sex, but that is the case now. In response to the question of whether you could end up with two different sources of digital verification showing two different biological sexes, the answer is no.

--- Later in debate ---
Lord Freyberg (CB)

My Lords, I support Motion 49A from the noble Baroness, Lady Kidron. I will also address claims that we have heard repeatedly in these debates: that transparency for AI data is technically unfeasible. This claim, forcefully pushed by technology giants such as Google, is not only unsupported by evidence but deliberately misleading.

As someone with a long-standing background in the visual arts, and as a member of DACS—the Design and Artists Copyright Society—I have witnessed first-hand how creators’ works are being exploited without consent or compensation. I have listened carefully to the concerns expressed by the noble Lord, Lord Tarassenko, in both his email to colleagues today and the letter from entrepreneurs to the Secretary of State. Although I deeply respect their expertise and commitment to innovation, I must firmly reject their assessment, which echoes the talking points of trillion-dollar tech corporations.

The claims by tech companies that transparency requirements are technically unfeasible have been thoroughly debunked. The LAION dataset already meticulously documents over 5 billion images, with granular detail. Companies operate crawler services on this dataset to identify images belonging to specific rights holders. This irrefutably demonstrates that transparency at scale is not only possible but already practised when it suits corporate interests.

Let us be clear about what is happening: AI companies are systematically ingesting billions of copyrighted works without permission or payment, then claiming it would be too difficult to tell creators which works have been taken. This is theft on an industrial scale, dressed up as inevitable technological progress.

The claim from the noble Lord, Lord Tarassenko, that these amendments would damage UK AI start-ups while sparing US technology giants is entirely backwards. Transparency would actually level the playing field by benefiting innovative British companies while preventing larger firms exploiting creative works without permission. I must respectfully suggest that concerns about potential harm to AI start-ups should be balanced against the devastating impact on our creative industries, thousands of small businesses and individual creators whose livelihoods depend on proper recognition and compensation for their work. Their continued viability depends fundamentally on protecting intellectual property rights. Without transparency, how can creators even begin to enforce these rights? The question answers itself.

This is not about choosing between technology and creativity; it is about ensuring that both sectors can thrive through fair collaboration based on consent and compensation. Transparency is not an obstacle to innovation; it is the foundation on which responsible, sustainable innovation is built.

Google’s preferred approach would reverse the fundamental basis of UK copyright law by placing an unreasonable burden on rights holders to opt out of having their work stolen. This approach is unworkable and would, effectively, legalise mass copyright theft to benefit primarily American technology corporations.

Rather than waiting for a consultation outcome that may take years, while creative works continue to be misappropriated, Motion 49A offers a practical step forward that would benefit both sectors while upholding existing law. I urge the House to support it.

Lord Clement-Jones (LD)

My Lords, it has been a privilege to listen to today’s debate. The noble Baroness, Lady Kidron, really has opened the floodgates to expressions of support for human creativity. I thank her for tabling her Motion. I also thank the Minister for setting out the Government’s position and their support for the creative industries.

I suppose I straddle the world of AI and creativity as much as anybody in this House. I co-founded the All-Party Group on Artificial Intelligence and I have been a member of the All-Party Group on Intellectual Property for many years. That is reflected in my interests, both as an advisor to DLA Piper on AI policy and regulation, and as the newly appointed chair of the Authors’ Licensing and Collecting Society. I declare those interests, which are more than merely formal.

The subject matter of the amendments in this group is of profound importance for the future of our creative industries and the development of AI in the UK: the critical intersection of AI training and copyright law, and, specifically, the urgent need for transparency. As the noble Baroness, Lady Kidron, described, the rapid development of AI, particularly large language models, relies heavily on vast volumes of data for training. This has brought into sharp focus the way copyright law applies to such activity. It was impossible to miss the letter over the weekend from 400 prominent creatives and media and creative business leaders urging support for her Motion 49A. Rights holders, from musicians and authors to journalists and visual artists, are rightly concerned about the use of their copyrighted material to train AI models, often without permission or remuneration, as we have heard. They seek greater control over their content and remuneration when it is used for this purpose, alongside greater transparency.

Like others, I pay tribute to the noble Baroness, Lady Kidron, who has brilliantly championed the cause of creators and the creative industries throughout the passage of this Bill in her tabling of a series of crucial amendments. Her original amendments on Report, passed in this House but deleted by the Government in the Commons and then retabled in the Commons on Report by my honourable friends, aimed to make existing UK copyright law enforceable in the age of generative AI. The core argument behind Amendment 49B, which encapsulates the essence of the previous amendments, is that innovation in the AI field should not come at the expense of the individuals and industry creating original content.

The central plank of the noble Baroness’s proposals, and one these Benches strongly support, is the requirement for transparency from AI developers regarding the copyrighted material used in their training data. Her Amendment 49B specifically requires the Secretary of State to make regulations setting out strict transparency requirements for web crawlers and general-purpose AI models. This would include disclosing the identity and purpose of the crawlers used, identifying their owners and, crucially, keeping records of where and when copyrighted material is gathered. This transparency is vital for ensuring accountability and enabling copyright holders to identify potential infringements and enforce their rights.

The Minister described the process in the consultation on AI and copyright, published last December. That consultation proposed a text and data mining exception that would allow AI developers to train on material unless the rights holder expressly reserved their rights or opted out. The arguments against this proposed opt-out mechanism are compelling; they have been made by many noble Lords today and have been voiced by many outside, as we have heard. This mechanism shifts the burden on to creators to police the use of their work and actively opt out, placing an undue responsibility on them.

This approach undermines the fundamental principles of copyright, effectively rewarding the widespread harvesting or scraping of copyrighted material that has occurred without permission or fair remuneration. The Government’s proposed text and data-mining exception, which it appears that they are no longer proposing—as the noble Lord, Lord Brennan, asked, perhaps the Minister can clarify the Government’s position and confirm that that is indeed the case—risks harming creative sectors for minimal gain to a small group of global tech companies and could erode public trust in the AI sector. As the noble Baroness observed, this approach is selling the creative industries down the river. Voluntary measures for transparency proposed by the Government are insufficient. Clear legal obligations are needed.

--- Later in debate ---
Baroness Kidron (CB)

My Lords, the noble Baroness, Lady Chakrabarti, has said everything I was going to say and more and better, so I want just to pay tribute to the noble Baroness, Lady Owen of Alderley Edge, and to say that I too have witnessed her forensic fight over the last few months. I hugely admire her for it, and I congratulate her on getting this far. I absolutely share all the concerns that both noble Baronesses have expressed. Just in case I do not have the opportunity again, I congratulate the noble Baroness on her extraordinary work and campaigning.

Lord Clement-Jones (LD)

My Lords, it is a pleasure to follow the three noble Baronesses, and I too congratulate the noble Baroness, Lady Owen, on her magnificent and successful campaign to outlaw the making and requesting of non-consensual images, first with her Private Member’s Bill and then with amendments to this Bill. She has fought it with huge skill and determination, and, rightly, she has pushed it to the wire in wanting the most robust offence and tightest defences possible. I thank the Minister for the flexibility that he has shown so far—with the emphasis on “so far”.

The amendments that the noble Baroness has put forward represent a compromise, given the strong and rather extraordinary opinion of the Attorney-General that the defence of “reasonable excuse” is needed for the defence to be compliant with the ECHR and that, therefore, the whole Bill risks being non-compliant if that is not contained in the defence for these offences. That is the equivalent of a legal brick wall, despite an excellent opinion from Professor Clare McGlynn which, in my view, demolished the Attorney-General’s case. That case seems to be based on ensuring the ability of big tech companies to red-team their models on images used without consent, which is a rather peculiar basis. Why cannot the big tech companies use images with consent? They would then be red-teaming in a rather different and more compliant way.

AI: Child Sexual Abuse Material

Wednesday 30th April 2025

Lords Chamber
Baroness Jones of Whitchurch (Lab)

The noble Baroness is quite right that we have to keep the technology up to date, and of course we are endeavouring to do that. I should say that UK law applies to AI-generated CSAM in the same way as to real child sexual abuse material. Creating, possessing or distributing any child sexual abuse images, including those generated by AI, is illegal. Generative AI child sexual abuse imagery is priority illegal content under the Online Safety Act in the same way as real content. However, she is quite right: we have to keep abreast of the technology. We are working at pace across government to make sure that we have the capacity to do that.

Lord Clement-Jones (LD)

My Lords, the Children’s Commissioner, Dame Rachel de Souza, and the IWF have both called for a total ban on apps which allow nudification, where photos of real people are edited by AI to make them appear naked. The commissioner has been particularly critical about the fact that such apps

“go unchecked with extreme real-world consequences”.

Will the Government act and ban these AI-enabled tools outright?

Baroness Jones of Whitchurch (Lab)

I thank the noble Lord for that question. The Government are actively looking at options to address nudification tools, and we hope to provide an update shortly. It is a matter that we take seriously. If such tools are used to create child sexual abuse material, UK law is clear that creating, possessing or distributing child sexual abuse images, including those generated using nudification tools, is already illegal, regardless of whether it depicts a real child or not.

Electronic Communications (Networks and Services) (Designated Vendor Directions) (Penalties) Order 2025

Tuesday 25th March 2025

Grand Committee
The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Innovation and Technology (Baroness Jones of Whitchurch) (Lab)

My Lords, the Government take the security of public telecoms seriously. As noble Lords know, the Telecommunications (Security) Act 2021 received Royal Assent on 17 November 2021. The Act established powers to introduce a new telecommunications security framework and introduced new vendor security powers. It is these vendor security powers that are relevant to this statutory instrument.

The Act allows the Secretary of State to issue a designation notice to a vendor whose presence in UK networks poses national security risks, and designated vendor directions to public communications providers, placing controls on their use of equipment or services supplied by a designated vendor. The Act also gives the Secretary of State powers to impose a penalty on a public communications provider that does not comply with a designated vendor direction issued to it. That penalty can be up to 10% of a provider’s turnover. The Act states that the Secretary of State must set out rules for how they intend to calculate a provider’s turnover, including what relevant business the Secretary of State will take into account when calculating that turnover.

The Electronic Communications (Networks and Services) (Penalties) (Rules for Calculation of Turnover) Order 2003 sets out rules for Ofcom to calculate a provider’s turnover when it contravenes conditions set under the Communications Act 2003. This statutory instrument makes changes to the 2003 order so that the rules in that legislation apply when calculating turnover for the purposes of determining a penalty for enforcement of designated vendor directions. It also defines what is to be treated as a network, service, facility or business by reference to which the calculation of turnover is to be made.

The Secretary of State could have relied on the 2003 order for the purposes of enforcement of a designated vendor direction. However, this SI removes any ambiguity and provides legal certainty and absolute clarity on the rules that apply. Turnover will be calculated in line with accounting practices and principles generally accepted in the United Kingdom and will be limited to the amount derived by that provider after the deduction of relevant taxes.

In conclusion, this is a narrowly focused but important statutory instrument through which we are ensuring legal certainty and clarity. It makes clear the Secretary of State’s approach to calculating turnover, which will underpin any decision to penalise a provider in relation to the designated vendor directions. I beg to move.

Lord Clement-Jones (LD)

My Lords, I thank the Minister for her introduction to this draft statutory instrument; it was brief and to the point. These penalties will be able to reach 10% of turnover or £100,000 per day for continuing breaches, so getting the calculations right is crucial. However, I have some concerns about the SI, the first of which is about timing.

I do not understand why we are looking at a three-year gap between the enabling powers and the calculation rules. The Telecommunications (Security) Act 2021, which I worked on, was presented to this House as urgent legislation to protect critical national infrastructure, yet here we are, in 2025, only now establishing how to calculate penalties for breaches in the way set out in this SI. During this period, we have had enforcement powers without the ability to properly determine penalties. As I understand it, tier 1 providers had to comply by March 2024, yet the penalty calculation mechanism will not be in place until this year—no doubt in a few weeks’ time.

Secondly, there is the absence of consultation. The Explanatory Memorandum cites the reason as the SI’s “technical nature”, but these penalties—I mentioned their size—could have major financial implications for providers. The telecoms industry has complex business structures and revenue streams. Technical expertise from the industry could have helped to ensure that these calculations are practical and comprehensive. The technical justification seems remarkably weak, given the impact these rules could have. For example, the current definition of “relevant business” for these calculations focuses on traditional network and service provision, but modern telecoms companies often have diverse revenue streams. There is no clear provision for new business models or technologies. How will we handle integrated service providers? What about international revenues? The treatment of associated services needs clarification.

Thirdly, the implementation sequence is an issue. We are being asked to approve penalty calculations before seeing the enforcement guidelines. There is no impact assessment, so we cannot evaluate potential consequences. I understand that the post-implementation review is not scheduled until 2026, and there is no clear mechanism for adjusting the framework if problems emerge. The interaction with the existing penalty regime needs clarification.

There are also technical concerns that need some attention. The switch from “notified provider” to “person” in the 2003 order, as a result of this SI, needs rather more explanation. The calculation method for continuing breaches is not fully detailed, there is no specific provision for group companies or complex corporate structures and the treatment of joint ventures and partnerships remains unclear.

Finally, I hope that, in broad terms, the Minister can give us an update on progress on the removal of equipment covered by the Telecommunications (Security) Act 2021. That was mandated by the Act; I know it is under way but it is not yet complete.

This is not merely about technical calculations but about creating an effective deterrent for the telecoms industry, while ensuring fair and practical enforcement of important security measures. Getting these rules right is essential for both national security and our telecoms sector. I look forward to the Minister’s response on these points.

Viscount Camrose (Con)

My Lords, I thank the Minister for bringing this important SI forward today and for setting it out so clearly and briefly. I also thank the noble Lord, Lord Clement-Jones. He made a range of interesting points: in particular, the point on timing was well made, and I look forward to hearing the Minister’s answers on that. This instrument seeks to implement provisions relating to the enforcement of designated vendor directions—DVDs—which form part of the broader framework established under the Telecommunications (Security) Act 2021. That Act, introduced under the previous Government, was designed to strengthen the security and resilience of the UK’s telecommunications networks, particularly in response to emerging national security risks.

We all know only too well that one of the most prominent issues at the forefront of this framework has been the removal of high-risk vendors, such as Huawei, from UK telecommunications infrastructure. Huawei’s involvement in the UK’s 5G rollout has long been a point of debate, with growing concerns about national security risks tied to its equipment. This SI therefore provides a mechanism for enforcing the penalties that may be applied to public communications providers—PCPs—that fail to comply with the DVDs to ensure that the UK’s telecommunications infrastructure remains secure from undue foreign influence.

The primary change introduced by this SI is the formalisation of the penalties regime for public communications providers that fail to comply with the conditions outlined in DVDs. It establishes a framework for calculating and enforcing penalties that may be imposed by the Secretary of State. The Secretary of State retains discretion in imposing penalties, but they must be applied in a proportionate manner. In considering penalties, the severity of the breach, the culpability of the provider and the broader implications for the sector must all be taken into account. The aim is to ensure compliance with DVDs while protecting the integrity of the UK’s national infrastructure.

However, while the objectives of this instrument are understood, this debate offers a good opportunity to scrutinise some of the specifics a little, particularly with regard to the proportionality of penalties and the potential economic consequences for the sector. It is with that in mind that I shall raise questions in just three areas regarding the provisions set out in this instrument.

First, the SI grants the Secretary of State significant discretion in the imposition of penalties. Of course, we recognise the value of flexibility here, but there is legitimate concern that this discretion may result in inconsistent enforcement across different public communications providers. Can the Minister assure us that transparency and accountability will be maintained throughout this process? How will the Government ensure that the application of penalties is fair and consistent, particularly when considering the varying size and scope of telecoms providers?

Further to this, can the Minister clarify how the penalties will be calculated? I echo the questions asked by the noble Lord, Lord Clement-Jones, particularly in cases where a breach does not pose an immediate or severe national security threat. Do the Government anticipate that penalties will be tiered with lesser fines for breaches that do not substantially compromise national security? Can the Minister further explain how such decisions will be communicated to the public and to industry to ensure transparency?

Secondly, providers are required to remove Huawei equipment from the UK’s 5G networks by 2027. This is, of course, a significant and costly task for telecom providers. Given these financial challenges, will the penalties for non-compliance take into account the costs already incurred by providers in replacing Huawei’s technology? Will the penalties be adjusted to reflect the substantial financial burden that these providers are already facing in removing Huawei equipment from their networks? Thirdly, where PCPs have been issued with a DVD, this can be a long and demanding process. How are the Government going to keep track of progress? What progress reports can be shared with Parliament and the public?

--- Later in debate ---
Baroness Jones of Whitchurch (Lab)

My Lords, I thank noble Lords for their valuable contributions to this debate. We believe that legislative certainty is important, which is why we are seeking to resolve potential ambiguity by making this instrument at the earliest opportunity. This SI will ensure that important decisions on national security, specifically the enforcement of national security powers introduced by the Telecommunications (Security) Act, have clear rules underpinning them.

I will now have a go at answering the questions raised in the debate. The noble Lord, Lord Clement-Jones, asked about the three-year gap and why the SI was not taken forward earlier. I should thank the Secondary Legislation Scrutiny Committee clerks for asking for clarification on the operability of the regime. The system has not been inoperable: the Secretary of State can use, and has used, their powers to monitor compliance with a direction under the current rules, and could have taken enforcement action without this SI being in place. The 2003 order could have applied for the purpose of enforcement of a designated vendor direction. However, there is some ambiguity concerning whether the rules set out in the 2003 order can apply to such enforcement, which could have left enforcement action imposing a penalty on a provider vulnerable to legal challenge. We are therefore making this SI to ensure that there is legal certainty and clarity when penalties are imposed; that position was set out in a letter of clarification to the Secondary Legislation Scrutiny Committee.

The noble Lord, Lord Clement-Jones, also asked about the lack of consultation, but this is a technical clarification of rules that were already in operation. He asked how turnover would be calculated. It will be done in conformity with the accounting practices and principles generally accepted in the United Kingdom, and the turnover will be limited to the amount derived by the provider from the relevant business after deduction of sales rebates, value added tax and other taxes directly related to turnover. If the provider’s relevant business consists of two or more undertakings that each prepare accounts, the turnover should be calculated by adding together the turnover of each undertaking. Any aid granted by a public body to a provider should be included in the calculation of turnover if the provider is a recipient of the aid and the aid is directly linked to the carrying out of the relevant business by that provider. The business activities to be included in the turnover calculation for a provider are: the provision of a public electronic communications network; the provision of a public electronic communications service; and the making available of facilities that are associated facilities by reference to such a network or service.

The noble Lord, Lord Clement-Jones, asked about the removal of equipment and the progress report on that. Using the powers provided by the Telecommunications (Security) Act, the former Secretary of State for Digital, Culture, Media and Sport issued a designation notice to Huawei and a designated vendor direction to 35 providers in October 2022. The direction gives 12 specific requirements for telecom providers’ use of Huawei equipment. The previous Secretary of State decided that these legal controls on the use of Huawei equipment or services were necessary and proportionate to the national security risks they were designed to mitigate. The UK is now on a path towards the complete removal of Huawei from its 5G networks by the end of 2027.

The noble Viscount, Lord Camrose, asked whether penalties would be applied in a fair and consistent way. I would say that this was an evidence-based decision, reflecting the national security risk. The designation notice issued to Huawei set out the reasons why the use of its equipment is viewed as a national security risk; it includes concerns about, among other things, corporate control, cybersecurity and engineering quality. This action builds on long-standing advice from the National Cyber Security Centre and the Government on the use of Huawei equipment in UK public telecommunications networks.

The noble Viscount asked about the cost to business of removing this equipment. The Government have estimated that the removal of Huawei equipment due to the designated vendor directions will cost providers up to £2 billion in total.

The noble Viscount also asked how the Secretary of State monitors compliance with a direction. The Communications Act 2003, as amended by the Telecommunications (Security) Act 2021, provides the Secretary of State with powers enabling the monitoring and enforcement of requirements imposed in designated vendor directions. The Secretary of State is responsible for determining compliance with a direction, based on evidence provided by the industry and Ofcom. The Secretary of State may give Ofcom a direction requiring Ofcom to monitor providers’ progress in complying with the direction and to report to the Secretary of State to inform their assessment of compliance. The former Secretary of State received Ofcom’s report in spring 2024 on the removal of Huawei from relevant providers’ core network functions, and that ongoing appraisal continues.

I hope that I have answered all the questions that were asked. If I have not answered something very technical, I can of course write to noble Lords. In the meantime, I hope noble Lords agree on the importance of introducing this instrument to ensure legislative certainty and therefore agree that enforcement through these powers should be introduced as swiftly as possible.

Lord Clement-Jones (LD)

Is the Minister confident that the 2027 deadline will be met; that no vendor, purchaser or telecoms company will be caught by the Act; that no fines will be levied; and that what we are talking about today is, therefore, entirely theoretical?

Viscount Camrose (Con)

While the Minister is working on her answer, perhaps she could include in it something about how progress against the delivery of these objectives will be reported to Parliament and, potentially, to the public.

Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025

Monday 24th February 2025

Lords Chamber
Moved by
Lord Clement-Jones

At end insert “but that this House regrets that the Regulations do not impose duties available under the parent Act on small, high-risk platforms where harmful content, often easily accessible to children, is propagated; calls on the Government to clarify which smaller platforms will no longer be covered by Ofcom’s illegal content code and which measures they will no longer be required to comply with; and calls on the Government to withdraw the Regulations and establish a revised definition of Category 1 services.”

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I am very pleased to see the Minister back in her place. I thank her for her introduction to this statutory instrument. Her disappointment at my tabling this regret amendment is exceeded only by my own disappointment at the SI. However, I hope that she will provide the antidote to the Government’s alarming tendency to pick unnecessary fights on so many important issues—a number of them overseen by her department.

Those of us who were intimately involved with the passage of the Online Safety Act hoped that it would bring in a new era of digital regulation, but the Government’s and Ofcom’s handling of small but high-risk platforms threatens to undermine the Act’s fundamental purpose of creating a safer online environment. That is why I am moving this amendment, and I am very grateful to all noble Lords who are present and to those taking part.

The Government’s position is rendered even more baffling by their explicit awareness of the risks. Last September, the Secretary of State personally communicated concerns to Ofcom about the proliferation of harmful content, particularly regarding children’s access. Despite this acknowledged awareness, the regulatory framework remains fundamentally flawed in its approach to platform categorisation.

The parliamentary record clearly shows that cross-party support existed for a risk-based approach to platform categorisation, which became enshrined in law. The amendment to Schedule 11 from the noble Baroness, Lady Morgan—I am very pleased to see her in her place—specifically changed the requirement for category 1 from a size “and” functionality threshold to a size “or” functionality threshold. This modification was intended to ensure that Ofcom could bring smaller, high-risk platforms under appropriate regulatory scrutiny.

Subsequently, in September 2023, on consideration of Commons amendments, the Minister responsible for the Bill, the noble Lord, Lord Parkinson—I am pleased to see him in his place—made it clear what the impact was:

“I am grateful to my noble friend Lady Morgan of Cotes for her continued engagement on the issue of small but high-risk platforms. The Government were happy to accept her proposed changes to the rules for determining the conditions that establish which services will be designated as category 1 or 2B services. In making the regulations, the Secretary of State will now have the discretion to decide whether to set a threshold based on either the number of users or the functionalities offered, or on both factors. Previously, the threshold had to be based on a combination of both”.—[Official Report, 19/9/23; col. 1339.]


I do not think that could be clearer.

This Government’s and Ofcom’s decision to ignore this clear parliamentary intent is particularly troubling. The Southport tragedy serves as a stark reminder of the real-world consequences of inadequate online regulation. When hateful content fuels violence and civil unrest, the artificial distinction between large and small platforms becomes a dangerous regulatory gap. The Government and Ofcom seem to have failed to learn from these events.

At the heart of this issue seems to lie a misunderstanding of how harmful content proliferates online. The impact on vulnerable groups is particularly concerning. Suicide promotion forums, incel communities and platforms spreading racist content continue to operate with minimal oversight due to their size rather than their risk profile. This directly contradicts the Government’s stated commitment to halving violence against women and girls, and protecting children from harmful content online. The current regulatory framework creates a dangerous loophole that allows these harmful platforms to evade proper scrutiny.

The duties avoided by these smaller platforms are not trivial. They will escape requirements to publish transparency reports, enforce their terms of service and provide user empowerment tools. The absence of these requirements creates a significant gap in user protection and accountability.

Perhaps most damning is the contradiction between the Government’s Draft Statement of Strategic Priorities for Online Safety, published last November, which emphasises effective regulation of small but risky services, and the categorisation thresholds that the Government and Ofcom have implemented, which explicitly exclude these services from the highest level of scrutiny. Ofcom’s advice expressly disregarded—“discounted” is the phrase it used—the flexibility brought into the Act via the Morgan amendment, and advised that regulations should be laid that brought only large platforms into category 1. Its overcautious interpretation of the Act creates a situation where Ofcom recognises the risks but fails to recommend for itself the full range of tools necessary to address them effectively.

This is particularly important in respect of small, high-risk sites, such as suicide and self-harm sites, or sites which propagate racist or misogynistic abuse, where the extent of harm to users is significant. The Minister, I hope, will have seen the recent letter to the Prime Minister from a number of suicide, mental health and anti-hate charities on the issue of categorisation of these sites. The result is that platforms such as 4chan, 8chan and Telegram, despite their documented role in spreading harmful content and co-ordinating malicious activities, escape the full force of regulatory oversight simply due to their size. This creates an absurd situation where platforms known to pose significant risks to public safety receive less scrutiny than large platforms with more robust safety measures already in place.

The Government’s insistence that platforms should be “safe by design”, while simultaneously exempting high-risk platforms from category 1 requirements based solely on size metrics, represents a fundamental contradiction and undermines what we were all convinced—and still are convinced—the Act was intended to achieve. Dame Melanie Dawes’s letter, in the aftermath of Southport, surely gives evidence enough of the dangers of some of the high-risk, smaller platforms.

Moreover, the Government’s approach fails to account for the dynamic nature of online risks. Harmful content and activities naturally migrate to platforms with lighter regulatory requirements. By creating this two-tier system, they have, in effect, signposted escape routes for bad actors seeking to evade meaningful oversight. This short-sighted approach could lead to the proliferation of smaller, high-risk platforms designed specifically to exploit these regulatory gaps. As the Minister mentioned, Ofcom has established a supervision task force for small but risky services, but that is no substitute for imposing the full force of category 1 duties on these platforms.

The situation is compounded by the fact that, while omitting these small but risky sites, category 1 seems to be sweeping up sites that are universally accepted as low-risk despite the number of users. Many sites with over 7 million users a month—including Wikipedia, a vital source of open knowledge and information in the UK—might be treated as category 1 services, regardless of actual safety considerations. Again, we raised concerns during the passage of the Bill and received ministerial assurances. Wikipedia is particularly concerned about a potential obligation on it, if classified in category 1, to build a system that allows verified users to modify Wikipedia without any of the customary peer review.

Under Section 15(10), all verified users must be given an option to

“prevent non-verified users from interacting with content which that user generates, uploads or shares on the service”.

Wikipedia says that doing so would leave it open to widespread manipulation by malicious actors, since it depends on constant peer review by thousands of individuals around the world, some of whom would face harassment, imprisonment or physical harm if forced to disclose their identity purely to continue doing what they have done, so successfully, for the past 24 years.

This makes it doubly important for the Government and Ofcom to examine, and make use of, powers to more appropriately tailor the scope and reach of the Act and the categorisations, to ensure that the UK does not put low-risk, low-resource, socially beneficial platforms in untenable positions.

There are key questions that Wikipedia believes the Government should answer. First, is a platform caught by the functionality criteria so long as it has any form of content recommender system anywhere on UK-accessible parts of the service, no matter how minor, infrequently used and ancillary that feature is?

Secondly, the scope of

“functionality for users to forward or share regulated user-generated content on the service with other users of that service”

is unclear, although it appears very broad. The draft regulations provide no guidance. What do the Government mean by this?

Thirdly, will Ofcom be able to reliably determine how many users a platform has? The Act does not define “user”, and the draft regulations do not clarify how the concept is to be understood, notably when it comes to counting non-human entities incorporated in the UK, as the Act seems to say would be necessary.

The Minister said in her letter of 7 February that the Government are open to keeping the categorisation thresholds under review, including the main consideration for category 1, to ensure that the regime is as effective as possible—and she repeated that today. But, at the same time, the Government seem to be denying that there is a legally robust or justifiable way of doing so under Schedule 11. How can both those propositions be true?

Can the Minister set out why the regulations, as drafted, do not follow the will of Parliament—accepted by the previous Government and written into the Act—that thresholds for categorisation can be based on risk or size? Ofcom’s advice to the Secretary of State contained just one paragraph explaining why it had ignored the will of Parliament—or, as the regulator called it, the

“recommendation that allowed for the categorisation of services by reference exclusively to functionalities and characteristics”.

Did the Secretary of State ask to see the legal advice on which this judgment was based? Did DSIT lawyers provide their own advice on whether Ofcom’s position was correct, especially in the light of the Southport riots?

How do the Government intend to assess whether Ofcom’s regulatory approach to small but high-harm sites is proving effective? Have any details been provided on Ofcom’s schedule of research about such sites? Do the Government expect Ofcom to take enforcement action against small but high-harm sites, and have they made an assessment of the likely timescales for enforcement action?

--- Later in debate ---
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, if I have not covered any issues, I will of course write to noble Lords to clarify any matters that are outstanding.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, I shall be extremely brief. I thank all noble Lords who have contributed this evening. The noble Lord, Lord Stevenson, used the expression “emotions raised”. That is exactly what this regret amendment has done. There is real anger about the way in which this statutory instrument has been put together. I think many noble Lords who were involved in the Act were extremely proud of our work, as has been expressed.

The Minister has made a valiant attempt, but I am afraid that she has been given a hospital pass. It is quite clear that the Secretary of State did not have to accept the advice from Ofcom. Its advice about functionalities, as the noble Baroness, Lady Kidron, made absolutely clear, and the evidence that the noble Lord, Lord Russell of Liverpool, put forward, not to mention the evidence from the anti-Semitism foundation, all indicate that there is considerable belief around this House that we are not dealing with the high-risk but smaller sites such as Telegram, 8chan and 4chan.

In these circumstances, as I believe is accepted by many noble Lords across the House, the Government have got this completely wrong and it needs rethinking. Therefore, I would like to test the opinion of the House.

Copyright and Performances (Application to Other Countries) (Amendment) (No. 2) Order 2024

Lord Clement-Jones Excerpts
Wednesday 12th February 2025

Lords Chamber
Moved by
Lord Clement-Jones Portrait Lord Clement-Jones
- View Speech - Hansard - -

That this House regrets that the Copyright and Performances (Application to Other Countries) (Amendment) (No. 2) Order 2024, laid before the House on 13 November 2024 (SI 2024/1124), did not involve consultation on Option 0A, which will result in inequitable treatment of performers on sound recordings based on their nationality.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I will start by saying that I am delighted that the Minister is treating this as one of her first engagements back in the House; how flattering to have her here at this time of day for a regret Motion. I also want to put on record my thanks to the Minister, Feryal Clark, who has taken the trouble to engage. Time will tell whether that engagement bears fruit, as we will see.

We on these Benches agree with Equity—the actors’ union—and the Musicians’ Union that this statutory instrument should be withdrawn due to several significant concerns regarding its fairness, its consultation process and its potential impact on performers, particularly those from the United States. The key arguments against the statutory instrument revolve around the implementation of option 0A, which maintains the status quo for producers of sound recordings while extending public performance rights to foreign performers only if their producer is a UK national or based in a country that is a signatory to the Rome convention, thereby excluding performers from countries that are not signatories, such as the United States. This option was not explicitly consulted on, and it creates an inequitable system of remuneration for performers.

The consultation presented four options, numbered 0 to 3, but option 0A emerged after the consultation. The Government have acknowledged that option 0A is a new option. However, the Government’s claim that they carefully considered all views is wrong, as a key policy option was developed and implemented without input from key stakeholders. Impacted organisations were not given an opportunity to formally submit their views on option 0A. As such, this lack of consultation raises concerns about the transparency and fairness of the decision-making process, and in fact undermines it.

As I have mentioned, option 0A creates a system where some foreign performers receive public performance rights based on national treatment, while others, specifically US performers, are denied those rights, based on the principle of material reciprocity. This means that US performers will not receive equitable remuneration for their work in the UK, even though their recordings are being used. US producers will continue to enjoy protection and equitable remuneration in the UK, while US performers on the same recordings are denied these rights. This disparity is difficult to justify and is clearly discriminatory.

It will also extend public performance rights to some additional foreign performers who will qualify through their producer, regardless of whether their nation offers material reciprocity to UK performers. The situation is further complicated by the fact that performers from countries such as Australia and New Zealand, which also do not offer material reciprocity to UK performers, will still receive public performance rights in the UK. This inconsistency makes the policy arbitrary and unjust. The Government’s approach effectively singles out US performers for less favourable treatment.

The Government further justify their position by arguing that expanding performers’ rights would negatively impact the UK music sector. The Government’s decision not to expand performers’ eligibility is partly based on the argument that UK affiliates of overseas record labels retain a significant proportion of the revenues attributable to foreign rights holders. Specifically, the British Phonographic Industry, BPI, claimed that UK labels retain 30% of the revenues collected in the UK on behalf of foreign affiliates, and that any reduction in revenues for US record labels would mean less money for the UK music sector.

Little evidence has been offered for this claim, and the Musicians’ Union disputes it, arguing that it paints an “exaggerated, bleak picture”, that the UK and US operations remain financially separate in practice, and that it cannot find any workings in the BPI’s redacted submission to the consultation. This suggests that the Government’s financial justification is based on flawed information, not solid evidence. Smaller independent record companies have, by contrast, said that the current situation is unfair and supported option 1, which would expand performers’ eligibility for remuneration.

The Government also claim that denying US performers public performance rights is intended to encourage the US to adopt material reciprocity. However, the revised economic impact assessment acknowledges that this is unlikely to influence US policy. In the view of Equity, a more effective strategy would be to offer US performers rights for a limited term, such as 10 years, and then use that as leverage to negotiate material reciprocity with the US Government. This approach would provide US performers with fair compensation while creating an incentive for the US to reciprocate. The current strategy effectively withholds remuneration from performers as a negotiating tactic, whereas a more effective strategy would still guarantee that performers get paid for their work.

The Government’s policy is intended to ensure that UK law meets its international obligations under the Rome convention and the WIPO Performances and Phonograms Treaty. However, the implementation of option 0A undermines the spirit of these treaties by creating a system of unequal treatment for performers based on their nationality. Equity believes that all foreign performers in countries that qualify for public performance rights should benefit from the same level of protection. It considers the current approach to be unethical, and we agree. As a matter of principle, performers should be remunerated for their work. This option leaves some performers benefiting from national treatment, ignoring any lack of material reciprocity, while US performers are denied remuneration on the basis of material reciprocity.

The Government have stated that the current statutory instrument corresponds closely to option 0, which was the status quo option. However, the Intellectual Property Office itself stated in its revised impact assessment that

“Parliament has passed the CPTPP Act. The CPTPP Act contains measures that will, when it comes into force (expected in December 2024), expand eligibility for performers’ rights generally, in a way that approximates the effects of Option 1. Doing nothing therefore now means allowing the law to change in a way similar to that set out in Option 1, rather than maintaining the effect of existing law”.

Therefore, the Government’s claim that they are maintaining the status quo is incorrect: the status quo is already changing due to the CPTPP Act, which has now come into effect.

Equity, SAG-AFTRA, the Musicians’ Union and PPL have all raised concerns regarding the Government’s proposed course of action. This statutory instrument should be withdrawn due to a flawed consultation, the unfair treatment of US performers, the disputed financial claims, its ineffective approach to achieving material reciprocity, the ethical concerns and the contradictions with existing legislation. The Government should reconsider their approach, consult on both option 0A and option 1 and implement a system that provides equitable remuneration for all performers. I beg to move.

Earl of Clancarty Portrait The Earl of Clancarty (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I support the noble Lord, Lord Clement-Jones, on this regret Motion. I will be brief, not least because the noble Lord has covered the ground so well. I too thank the Minister, Feryal Clark at DSIT, for our meeting with her on Monday on this issue. I also thank Equity for its briefing on this and for alerting us to this concern. I very much welcome the Minister back to her place.

Ultimately, this is about fairness and consistency—or, perhaps more to the point, unfairness and inconsistency—and about mutual benefits, which this Government should strive toward in every area of our dealings with others, not least in the case of the arts and creative industries. I have become a great believer in the word “mutual”. I prefer it now over “reciprocal”, which the public grasp less, I think—they find it too abstract. But we all understand, or have a better chance of understanding, what “mutual benefits” means. For example—forgive me if I digress slightly—a new poll finds that over 80% of the public are in favour of mutual free movement in Europe, because that becomes something that is immediately understandable, while of course some of us have been banging the drum for reciprocity for years and not getting very far. The language we use to describe these things is hugely important.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, I thank the Minister for her response. I do not doubt her motives at all; it is just the outcome that we are not happy with. I thank those who have spoken in favour of this regret Motion. The noble Earl, Lord Clancarty, talked about US session musicians, who are an important aspect of this. Clearly, they are being discriminated against if they are working with musicians from countries that are reciprocating and paying our musicians. This seems to be unfairly singling out those session musicians in those circumstances.

My noble friend Lady Featherstone talked about fair play and the risk of US retaliation. The Government may have the right motives, but I do not think they have quite come to terms with what a US Administration might do in this regard.

The noble Lord, Lord Markham, rightly said that any system needs to have the confidence of those who are supposed to benefit from it. I thought that his injunction to tread carefully in these circumstances was very important.

I do not propose to put this Motion to a vote; the SI has already gone through on the negative procedure. However, we are not wholly reassured. I very much hope that the Government will initiate a dialogue with Equity. When I raised with Minister Clark whether they had actually met Equity to discuss this SI, it was interesting that they had not. There are other issues that Equity would, I am sure, very much want to talk about, such as synthesisation of performances, in relation to AI—that is probably another area. It seemed rather extraordinary that a Labour Government had not properly engaged with Equity in the last six months. I very much hope that the Minister will be able to take that back to the department and reinforce the desire to meet with Equity and the Musicians’ Union.

The Government have been unduly influenced by the figures from the BPI. I do not recognise the figures that the Minister maintained about the income which would be forgone if another option had been taken. As I said when I introduced the Motion, those figures are very much disputed by the MU. We have not seen any real workings that prove that that would be the loss of income, but we can argue the toss on that.

I very much hope that the Government will at least take back from this regret Motion that in producing an option at the last minute, however creative it may be—I would not deny that creativity is a useful thing to have in the department—you can be slightly overly creative if you are not consulting on a particular option. I thought it was a neat piece of speechwriting but not necessarily to be desired. It is the kind of thing that would go to judicial review if a commercial organisation was involved in this SI. If this was BT, or another major telecom company, disputing an SI in which an option had not been consulted on, they would take this kind of thing to court. As it happens, this is about artists and unions, so they are not going to do that; they are going to go through the political process. I welcome that, but it nevertheless shows the fragility of the decision that has been taken in this case. I beg leave to withdraw the Motion.

Motion withdrawn.

Data (Use and Access) Bill [HL]

Lord Clement-Jones Excerpts
Moved by
138: After Clause 92, insert the following new Clause—
“Code on processing personal data in education where it concerns a child or pupil
(1) The Information Commissioner must consult on, prepare and publish a Code of Practice on standards to be followed in relation to the collection, processing, publication and other dissemination of personal data concerning children and pupils in connection with the provision of education services in the United Kingdom, within the meaning of the Education Act 1996, the Education (Scotland) Act 1996, and the Education and Libraries (Northern Ireland) Order 1986; and on standards on the rights of those children as data subjects which are appropriate to children’s capacity and stage of education.
(2) For the purposes of subsection (1), the rights of data subjects must include—
(a) measures related to responsibilities of the controller, data protection by design and by default, and security of processing,
(b) safeguards and suitable measures with regard to automated decision-making, including profiling and restrictions,
(c) the rights of data subjects including to object to or restrict the processing of their personal data collected during their education, including any exemptions for research purposes, and
(d) matters related to the understanding and exercising of rights relating to personal data and the provision of education services.”
Member’s explanatory statement
This amendment requires the Commission to consult on, prepare and publish a Code of Practice on standards to be followed in relation to the collection, processing, publication and other dissemination of personal data concerning children and pupils in connection with the provision of education services in the UK.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, unusually, I rise to move an amendment, Amendment 138. For the second time in Committee, I find myself heading a group when I know that the noble Baroness, Lady Kidron, will be much better qualified to introduce the subject. Indeed, she has an amendment, Amendment 141, which is far preferable in many ways to mine.

Amendment 138 is designed to ensure that the Information Commissioner produces a code of practice specific to children up to the age of 18 for the purposes of UK law and Convention 108, and pupils as defined by the Education Act 1996, who may be up to the age of 19 or, with special educational needs, up to 25 in the education sector. The charity Data, Tech & Black Communities put it this way in a recent letter to the noble Baroness, Lady Jones:

“We recently completed a community research project examining the use of EdTech in Birmingham schools. This project brought us into contact with over 100 people … including parents, school staff and community members. A key finding was the need to make it easier for those with stewardship responsibility for children’s data, to fulfil this duty. Even with current data protection rights, parents and guardians struggle to make inquiries (of schools, EdTech companies and even DfE) about the purpose behind the collection of some of their children’s data, clarity about how it is used (or re-used) or how long data will be retained for. ‘Opting out’ on behalf of their children can be just as challenging. All of which militates against nuanced decision-making about how best to protect children’s short and long-term interests … This is why we are in support of an ICO Code of Practice for Educational Settings that would enable school staff, parents and learners, the EdTech industry and researchers to responsibly collect, share and make use of children’s data in ways that support the latter’s agency over their ‘digital selves’ and more importantly, will support their flourishing”.


The duties of settings and data processors, and the rights appropriate to the stage of education and children’s capacity, need clarity and consistency. Staff need confidence to access and use data appropriately within the law. As the UNCRC’s General Comment No. 16 (2013) on State Obligations Regarding the Impact of the Business Sector on Children’s Rights set out over a decade ago,

“the realization of children’s rights is not an automatic consequence of economic growth and business enterprises can also negatively impact children’s rights”.

The educational setting is different from purely commercial interactions, and not only because the data subjects are children. It is more complex because of the disempowered environment and the imbalance of power between the authority, the parents and the child. A further complication is the fact that parents’ and children’s rights are interlinked, as exemplified in the right to education described in UDHR Article 26(3), which states:

“Parents have a prior right to choose the kind of education that shall be given to their children.”


A code is needed because explicit safeguards that the GDPR requires in several places are missing, having been left out of the drafting of the UK Data Protection Act 2018. Clause 80 of the Bill—“Automated decision-making”—does not address the necessary safeguards of GDPR Article 23(1) for children. Furthermore, removing the protections of the balancing test under the recognised legitimate interest condition will create new risks. Clauses on additional further processing or changes to purpose limitation are inappropriately wide without child-specific safeguards. The volume, sensitivity and intrusiveness of identifying personal data collection in educational settings only increases, while the protections are only ever reduced.

Obligations specific to children’s data, especially

“solely automated decision-making and profiling”

and exceptions, need to be consistent with clear safeguards by design where they restrict fundamental freedoms. What does that mean for children in practice, where teachers are assumed to be the rights bearers in loco parentis? The need for compliance with human rights, security, and health and safety, among other standards proportionate to the risks of data processing, together with respect for the UK Government’s accessibility requirements, should be self-evident and adopted in a code of practice, as recommended in the five rights set out in the Digital Futures Commission’s blueprint for educational data governance.

The Council of Europe Strategy for the Rights of the Child (2022-2027) and the UNCRC General Comment No. 25 on Children’s Rights and the Digital Environment make it clear that

“children have the right to be heard and participate in decisions affecting them”.

They recognise that

“capacity matters, in accordance with their age and maturity. In particular attention should be paid to empowering children in vulnerable situations, such as children with disabilities.”

Paragraph 75 recognises that surveillance in educational settings should not take place without the right to object and that teachers need training to keep up with technological developments.

Young people themselves have not been invited to participate in the development of this Bill, and their views have not been considered. However, a small sample of parent and pupil voices was captured in the Responsible Technology Adoption Unit’s public engagement work together with the DfE in 2024. The findings back those of Defend Digital Me’s Survation poll in 2018 and show that parents do not know that the DfE already holds named pupil records without their knowledge or permission and that the data is given away to be reused by hundreds of commercial companies, the DWP, the Home Office and the police. It stated:

“There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement. Parents, in particular, stressed the need for clear and comprehensive information about pupil work and data use and any potential risks relating to data security and privacy breaches.”


A code of practice is needed to explain the law and make it work as intended for everyone. The aim of a code of practice for educational settings would be that adherence to it creates a mechanism for controllers and processors to demonstrate compliance with the legislation or with approved certification methods. It would give providers confidence in consistent and clear standards and would be good for the edtech sector. It would allow children, parents, school staff and systems administrators to build trust in safe, fair and transparent practice so that their rights are freely met by design and default.

Further, schools give children’s personal data to many commercial companies during a child’s education—not based on consent but assumed for the performance of a task carried out in the public interest. A code should clarify any boundaries of this lawful basis for commercial purposes, where it is an obligation on parents to provide the data and what this means for the child on reaching maturity or after leaving the educational setting.

Again, a code should help companies understand “data protection by design and default” in practice, and appropriate “significant legal effect”, the edges of “public interest” in data transfers to a third country, and how special categories of data affect children in schools. A code should also support children and families in understanding the effect of the responsibilities of controllers and processors on the execution or limitation of their own rights. It would set out the responsibilities of software platforms that profile users’ metadata to share with third parties, or of commercial apps signed up for in schools that offer adverts in use.

I hope that I have explained exactly why we believe that a code of conduct is required in educational settings. I beg to move.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I support and have added my name to Amendment 138 in the name of the noble Lord, Lord Clement-Jones. I will also speak to Amendment 141 in my name and those of the noble Lords, Lord Knight and Lord Russell, and the noble Baroness, Lady Harding.

Both these amendments propose a code of practice to address the use of children’s data in the context of education. Indeed, they have much in common, and, having heard the noble Lord, Lord Clement-Jones, I have much in common with what he said. I associate myself entirely with his remarks and hope that mine will build on them. Both amendments point to the same problem: children’s data is scandalously treated in our schools, and educators need support. This is a persistent and known failure that both the DfE and the ICO have failed to confront over a period of some years.

Amendment 141 seeks to give a sense of exactly what an education code should cover. In doing so, it builds on the work of the aforementioned Digital Futures for Children centre at the LSE, which I chair, the work of Defend Digital Me, the excellent work of academics at UCL, and much of the work relating to education presented to the UN tech envoy in the course of drafting the UN global digital compact.

Subsection (1) of the proposed new clause would require the ICO to prepare a code of practice in connection with the provision of education. Subsection (2) sets out what the ICO would have to take into account, such as that education provision includes school management and safeguarding as well as learning; the different settings in which it takes place; the need for transparency and evidence of efficacy; and all the issues already mentioned, including profiling, transparency, safety, security, parental involvement and the provision of counselling services.

Subsection (3) would require the ICO to have regard to children’s entitlement to a higher standard of protection—which we are working so hard in Committee to protect—their rights under the UNCRC and their different ages and stages of development. Importantly, it also refers to the need and desire to support innovation in education and the need to ensure that the benefits derived from the use of UK children’s data accrue to the UK.

Subsection (4) lists those whom the commissioner would have to consult, and subsection (5) sets out when data processors and controllers would be subject to the code. Subsection (6) proposes a certification scheme for edtech services to demonstrate compliance with UK GDPR and the code. Subsection (7) would require edtech service and product providers to evidence compliance—importantly, transferring that responsibility from schools to providers. Subsection (8) simply defines the terms.

A code of practice is an enabler. It levels the playing field, sets terms for innovators, creates sandbox or research environments, protects children and supports schools. It offers a particularly attractive environment for developing the better digital world that we would all like to see, since schools are identifiable communities in which changes and outcomes could be measured.

--- Later in debate ---
Baroness Jones of Whitchurch Portrait The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Innovation and Technology (Baroness Jones of Whitchurch) (Lab)
- Hansard - - - Excerpts

My Lords, Amendment 138 tabled by the noble Lord, Lord Clement-Jones, and Amendment 141, tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, would both require the ICO to publish a code of practice for controllers and processors on the processing of personal data by educational technologies in schools.

I say at the outset that I welcome this debate and the contributions of noble Lords on this important issue. As various noble Lords have indicated, civil society organisations have also been contacting the Department for Science, Innovation and Technology and the Department for Education directly to highlight their concerns about this issue. It is a live issue.

I am grateful to my noble friend Lord Knight, who talked about some of the important and valuable contributions that technology can play in supporting children’s development and guiding teaching interventions. We have to get the balance right, but we understand and appreciate that schoolchildren, parents and schoolteachers must have the confidence to trust the way that services use children’s personal data. That is at the heart of this debate.

A great deal of work is under way on this issue, some of which noble Lords have referred to. The Department for Education is already exploring ways to engage with the edtech market to reinforce the importance of evidence-based quality products and services in education. On my noble friend Lord Knight’s comments on AI, the Department for Education is developing a framework outlining safety expectations for AI products in education and creating resources for teachers and leaders on safe AI use.

I recognise why noble Lords consider that a dedicated ICO code of practice could help ensure that schools and edtech services are complying with data protection legislation. The Government are open-minded about exploring the merits of this further with the ICO, but it would be premature to include these requirements in the Bill. As I said, there is a great deal of work going on and the findings of the recent ICO audits of edtech service providers will help to inform whether a code of practice is necessary and what services should be in scope.

I hope that we will bear that in mind and engage on it. I would be happy to continue discussions with noble Lords, the ICO and colleagues at the Department for Education, outside of the Bill’s processes, about the possibility of future work on this, particularly as the Secretary of State has powers under the Data Protection Act 2018 to require the ICO to produce new statutory codes, as noble Lords know. Considering the explanation that I have given, I hope that the noble Lord, Lord Clement-Jones, will consider withdrawing his amendment at this stage.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I thank the Minister for her response and all speakers in this debate. On the speech from the noble Lord, Lord Knight, I entirely agree with the Minister and the noble Viscount, Lord Camrose, that it is important to remind ourselves about the benefits that can be achieved by AI in schools. The noble Lord set out a number of those. The noble Lord, Lord Russell, also reminded us that this is not a purely domestic issue; it is international across the board.

However, all noble Lords reminded us of the disbenefits and risks. In fact, the noble Lord, Lord Knight, used the word “dystopian”, which was quite interesting, although he gets very close to science fiction sometimes. He said that

“we have good reason to be concerned”,

particularly because of issues such as the national pupil database, where the original purpose may not have been fulfilled and was, in many ways, changed. He gave an example of procurement during Covid, where the choice was either Google or Microsoft—Coke or Pepsi. That is an issue across the board in competition law, as well.

There are real issues here. The noble Lord, Lord Russell, put it very well when he said that there is any number of pieces of guidance for schools but it is important to have a code of conduct. We are all, I think, on the same page in trying to find—in the words of the noble Baroness, Lady Kidron—a fairer and more equitable set of arrangements for children in schools. We need to navigate our way through this issue; of course, organisations such as Defend Digital Me and 5rights are seriously working on it.

--- Later in debate ---
Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

My Lords, it is a pleasure to take part in today’s Committee proceedings. In doing so, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business. In moving Amendment 156A, I will also speak to Amendment 156B, and I thank the noble Lord, Lord Clement-Jones, for co-signing them.

We live in extraordinarily uncertain times, domestically and internationally. In many ways, it has always been thus. However, things are different and have accelerated, not least in the last two decades, because of the online environment and the digital selves that we find ourselves interacting with in a world that is ever changing moment by moment. These amendments seek to update an important statute that governs critical elements of how cybersecurity professionals in this nation seek to keep us all safe in these extraordinarily difficult times.

The Computer Misuse Act 1990 was introduced to defend telephony exchanges at a time when 0.5% of us were online. If that was the purpose of the Act—the statute when passed—that alone would suggest that it needs an update. Who among us would use our smartphone if we had had it for 34 years? Well, we could not—the iPhone has been around only since 2007. This whole world has changed profoundly in the last 20 years, never mind the last 34. It is not just that the Act needs to be updated because it falls short of how society and technology have changed in those intervening years; it needs, desperately and urgently, to be updated because it is currently putting every citizen in this nation at risk for want of being amended. This is the purpose of Amendments 156A and 156B.

The Computer Misuse Act 1990 is not only out of date but inadvertently criminalising the cybersecurity professionals we charge with the job of keeping us all safe. They oftentimes work, understandably, under the radar, behind not just closed but locked doors, doing such important work. Yet, for want of these amendments, they are doing that work, all too often, with at least one hand tied behind their back.

Let us take just two examples: vulnerability research and threat intelligence assessment and analysis. Both could see a cybersecurity professional falling foul of the provisions of the CMA 1990. Do not take my word for it: look to the 2024 annual report of the National Cyber Security Centre, which rightly and understandably highlights the increasing gap between the threats we face and its ability, and the ability of the community of cybersecurity professionals, to meet those threats.

These amendments, in essence, perform one simple but critical task: to afford a legal defence for legitimate cybersecurity activities. That is all, but it would have such a profound impact for those whom we have asked to keep us safe and for the safety they can thus deliver to every citizen in our society.

Where is the Government’s work on updating the Computer Misuse Act 1990 in this respect? Will the Government take this opportunity to accept these amendments? Do they believe that these amendments would provide a materially positive benefit to our cybersecurity professionals and thus to our nation, and, if so, why would they not take this first opportunity to enact these amendments to this data Bill?

It is not time; it is well over time that these amendments become part of our law. If not now, when? If not these amendments, which amendments? If they do not accept these amendments, what will the Government say to all those people who will continue to be put in harm’s way for want of these protective provisions being passed? It is time to pass these amendments and give our cybersecurity professionals the tools they need. It is time, from the legislative perspective, to keep them safe so that they can do the self-same thing for all of us. It is time to cyber up. I beg to move.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I was delighted to see these amendments tabled by the noble Lord, Lord Holmes. He, the noble Lord, Lord Arbuthnot, and I, along with many other parliamentarians, have long argued for changes to the Computer Misuse Act. For context, the original Act was created largely in response to a famous incident in which professional hackers and a technology journalist broke into British Telecom’s Prestel system in the mid-1980s. The Bill received Royal Assent in June 1990, barely two months after Tim Berners-Lee and CERN made the world wide web publicly available for the first time. Who remembers Prestel? Perhaps this is the wrong House in which to ask that question.

As the noble Lord, Lord Holmes, explained, there is no statutory public interest defence in the Act. This omission creates a legal risk for cybersecurity researchers and professionals conducting legitimate activities in the public interest. The Post Office Horizon scandal demonstrated how critical independent computer system investigation is for uncovering systemic problems and highlighted the need for protected legal pathways for researchers and investigators to examine potentially flawed systems.

I am delighted that the noble Lord, Lord Vallance, is here for this set of amendments. His Pro-innovation Regulation of Technologies Review explicitly recommends incorporating such a defence to provide stronger legal protections for cybersecurity researchers and professionals engaged in threat intelligence research. This recommendation was rooted in the understanding that such a defence would have, it said,

“a catalytic effect on innovation”

within the UK’s cybersecurity sector, which possesses “considerable growth potential”.

--- Later in debate ---
Lord Arbuthnot of Edrom Portrait Lord Arbuthnot of Edrom (Con)
- Hansard - - - Excerpts

My Lords, I rise briefly but strongly to support my noble friend Lord Holmes. The CyberUp campaign has been banging this drum for a long time now. I remember taking part in the debates in another place on the Computer Misuse Act 34 years ago. It was the time of dial-up modems, fax machines and bulletin boards. This is the time to act, and it is the opportunity to do so.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, we ought to be mindful of this and congratulate the noble Lord on having been parliamentarian of the year as a result of his campaigning activities.

Lord Arbuthnot of Edrom Portrait Lord Arbuthnot of Edrom (Con)
- Hansard - - - Excerpts

My Lords, it has taken 34 years.

--- Later in debate ---
Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

Could the Minister say a few words on some of those points of discourse and non-consensus, to give the Committee some flavour of the type of issues where there is no consensus as well as the extent of the gap between some of those perspectives?

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

Just to follow up, have the Government formally responded to the original review from the noble Lord, Lord Vallance? That would be very helpful as well, in unpacking what were clearly extremely well-informed recommendations. It should, no doubt, be taken extremely seriously.

--- Later in debate ---
Lord Vallance of Balham Portrait Lord Vallance of Balham (Lab)
- Hansard - - - Excerpts

Yes, the Government accepted the recommendations in full.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

Before the Minister sits down or stands up or whatever the appropriate phrase should be, I very much hope that, since the previous Government gave that indication, this Government will take that as a spur to non-glacial progress. I hope that at least the speed might get up to a number of miles per hour before too long.

Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

My Lords, I thank all noble Lords who have taken part in this important debate and, indeed, the Minister for her thoughtful response. We find ourselves in a position of extraordinary good fortune when it comes to these and many other amendments, not least in the area of artificial intelligence. We had a first-class report from the then Sir Patrick Vallance as CSA. It is not often in life that in a short space of time one is afforded the opportunity in government of bringing much of that excellent work into being through statute, regulation, codes and other guidance. I await further steps in this area.

There can barely be, in many ways, a more serious and pressing issue to be addressed. For every day that we delay, harms are caused. Even if the Government were to do this only in service of their much-spoken-of growth agenda, it would have an economic benefit to the United Kingdom. It would be good to meet the Minister between Committee and Report to see if anything further can be done but, from my perspective and that of others, we will certainly be returning to this incredibly important issue. I beg leave to withdraw the amendment.

--- Later in debate ---
Earl of Erroll Portrait The Earl of Erroll (CB)
- Hansard - - - Excerpts

My Lords, I would like to just make one comment on this group. I entirely agree with everything that has been said and, in particular, with the amendments in the name of the noble Baroness, Lady Kidron, but the one that I want to single out—it is why I am bothering to stand up—is Amendment 197, which says that the Secretary of State “must” implement this measure.

I was heavily scarred back in 2017 by the Executive’s refusal to implement Part 3 of the Digital Economy Act in order to protect our children from pornography. Now, nearly eight years later, they are still not protected. It was never done properly, in my opinion, in the then Online Safety Bill either; it still has not been implemented. We have an Executive who are refusing to carry out the will of Parliament as expressed in the legislation it has passed. We have a problem, but I think that we can amend it by putting “must” in the Bill. Then, we can hold the Executive to account.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, the trouble with this House is that some have long memories. The noble Earl, Lord Erroll, reminded us all to look back, with real regret, at the Digital Economy Act and the failure to implement Part 3. I think that that was a misstep by the previous Government.

Like all of us, I warmly welcome the inclusion of data access provisions for researchers studying online safety matters in Clause 123 of the Bill. As we heard from the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, this was very much unfinished business from the Online Safety Act. However, I believe that, in order for these provisions to have the desired effect, the Government need to accept the amendments in the names of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. In terms of timeframe, the breadth of research possible, enforceability, contractual elements and location, they cover the bases extremely effectively.

The point was made extremely well by the noble Lords, Lord Bethell and Lord Russell, that we should not have to rely on brave whistleblowers such as Frances Haugen. We should be able to benefit from quality researchers, whether from academia or elsewhere, in order to carry out this important work.

My Amendment 198B is intended as a probing amendment about the definition of researchers under Clause 123, which has to be carefully drawn to allow for legitimate non-governmental organisations, academics and so on, but not so widely that it can be exploited by bad actors. For example, we do not want those who seek to identify potential exploits in a platform to gain protection simply by calling themselves “independent researchers”. For instance, could Tommy Robinson seek to protect himself from liabilities in this way? After all, he called himself an “independent journalist” in another context when he clearly was not. I hope that when the Government come to draw up the regulations they will be mindful of the need to be very clear about what constitutes an independent or accredited researcher, or whatever phrase will be used in the context.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, although I have no amendments in this group, I will comment on some of them. I might jump around the order, so please forgive me for that.

Amendment 197 would change Clause 123 so that the Secretary of State must, as soon as reasonably practicable and no later than 12 months after the Act is passed, make regulations requiring regulated services to provide information for the purposes of research into online safety. This is clearly sensible. It would ensure that valuable research into online safety may commence as soon as possible, which would benefit us all, as speakers have made abundantly clear. To that end, Amendment 198D, which would make researcher access enforceable in the same way as other requirements under the Online Safety Act, would ensure that researchers can access valuable information and carry out their beneficial research.

I am still left with some curiosity on some of these amendments, so I will indicate where I have specific questions for those who have tabled them and hope they will forgive me if I ask to have a word with them between now and Report, which would be very helpful. In that spirit, I turn to Amendment 198B, which would allow the Secretary of State to define the term “independent researcher”. I ask the noble Lord, Lord Clement-Jones, who tabled the amendment, whether he envisages the Secretary of State taking advice before making such regulations and, if so, from whom and through what mechanism. I recognise that it is a probing amendment, but I would be keen to understand more.

I am also keen to understand further from my noble friend Lord Bethell and the noble Baroness, Lady Kidron, why, under Amendment 198A, the Secretary of State would not be able to make regulations providing for independent research into the “enforcement of requirements” under these regulations. Again, I look forward to discussing that with them.

I have some concerns about Amendment 198, which would require service providers to give researchers information pertaining to age, stage of development, gender, race, ethnicity, disability and sexuality. I understand the importance of this, but it would require the disclosure of special category data to those researchers. I have reservations, especially if the data pertains to children. Do we have the right safeguards in place to address the obviously heightened risks here?

Additionally, I have some concerns about the provisions suggested in Amendment 198E. Should we allow researchers from outside the United Kingdom to require access to information from regulated service providers? Could this result in data being transferred to jurisdictions with less stringent data protection laws?

--- Later in debate ---
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

My Lords, I support Amendment 203 and, in particular, Amendments 211G and 211H from the noble Baroness, Lady Owen. I have little to add to what I said on Friday. I confess to my noble friend the Minister that, in my speech on Friday, I asked whether this issue would be in scope for this Bill, so maybe I gave the noble Baroness the idea. I pay tribute to her agility in being able to act quickly to get this amendment in and include something on audio, following the speech of the noble Baroness, Lady Gohir.

I hope that the Minister has similar agility in being able to readjust the Government’s position on this. This was rightly an urgent manifesto commitment from my party at the last election. It fits entirely with my right honourable friend the Home Secretary’s efforts around violence against women and girls. We should accept and grab this opportunity to deliver quickly by working with the noble Baroness, Lady Owen, and others between now and Report to bring forward an amendment to the Bill that the whole House will support enthusiastically.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, we have had some powerful speeches in this group, not least from the noble Baronesses, Lady Kidron and Lady Owen, who drafted important amendments that respond to the escalating harms caused by AI-generated sexual abuse material relating to children and adults. The amendment from the noble Baroness, Lady Kidron, would make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material. As she outlined and the noble Lord, Lord Bethell, confirmed, it would specifically become an offence to create, train or distribute generative AI models that enable the creation of computer-generated CSAM or priority illegal content; to train AI models on CSAM or priority illegal content; or to possess AI models that produce CSAM or priority illegal content.

This amendment responds to a growing problem, as we have heard, around computer-generated sexual abuse material and a gap in the law. There is a total lack of safeguards preventing bad actors from creating sexual abuse imagery, and it is causing real harm. Sites enabling this abuse offer tools to harm, humiliate, harass, coerce and cause reputational damage. Without robust legal frameworks, victims are left vulnerable while perpetrators operate with impunity.

The noble Lord, Lord Bethell, mentioned the Internet Watch Foundation. In its July report, One Step Ahead, it reported on the alarming rise of AI-generated CSAM. In October 2023, in How AI is Being Abused to Create Child Sexual Abuse Imagery, it made recommendations to the Government regarding legislation to strengthen legal frameworks to better address the evolving landscape of AI-generated CSAM and enhance preventive measures against its creation and distribution. It specifically recommended:

“That the Government legislates to make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material”.


The noble Baroness, Lady Kidron, tabled such an amendment to the previous Bill. As she said, she was successful in persuading the then Government to accept it; I very much hope that she will be as successful in persuading this Government to accept her amendment.

Amendments 211G and 211H in the name of the noble Baroness, Lady Owen, are a response to the extraordinary fact that one in 14 adults has experienced threats to share intimate images in England and Wales; that rises to one in seven among young women. Research from Internet Matters shows that 49% of young teenagers in the UK aged between 13 and 16—around 750,000 children—said that they were aware of a form of image-based abuse being perpetrated against another young person known to them.

We debated the first of the noble Baroness’s amendments, which is incorporated in her Bill, last Friday. I entirely agree with the noble Lord, Lord Knight; I did not find the Government’s response at all satisfactory. I hope that, in the short passage of time between then and now, they have had time to be at least a little agile, as he requested. UK law clearly does not effectively address non-consensual intimate images. It is currently illegal to share or threaten to share non-consensual intimate images, including deepfakes, but creating them is not yet illegal; this means that someone could create a deepfake image of another person without their consent and not face legal consequences as long as they do not share, or threaten to share, it.

This amendment is extremely welcome. It addresses the gap in the law by criminalising the creation of non-consensual intimate images, including deepfakes. It rightly targets deepfakes due to their rising prevalence and potential for harm, particularly towards women. Research shows that 98% of deepfake videos online are pornographic, with 99% featuring women and girls. This makes it an inherently sexist problem that is a new frontier of violence against women—words that I know the noble Baroness has used.

I also very much welcome the new amendment not contained in her Bill, responding to what the noble Baroness, Lady Gohir, said at its Second Reading last Friday about including audio deepfakes. The words “shut down every avenue”, which I think were used by the noble Baroness, Lady Gohir, are entirely apposite in these circumstances. Despite what the noble Lord, Lord Ponsonby, said on Friday, I hope that the Government will accept both these amendments and redeem their manifesto pledge to ban the creation of sexually explicit deepfakes, whether audio or video.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, the current law does not sufficiently protect children from AI-driven CSAM because it is simply such a fast-moving issue. It is a sobering thought that, of all the many wonderful developments of AI that many of us have been predicting and speculating on for so long, CSAM is really driving the technology forward. What a depressing reflection that is.

Overall, AI is developing at an extraordinarily rapid pace and has come with a number of concerning consequences that are not all yet fully understood. However, it is understood that child sexual abuse is completely unacceptable in any and all contexts, and it is right that our law should be updated to reflect the dangers that have increased alongside AI development.

Amendment 203 seeks to create a specific offence for using personal data or digital information to create or facilitate the creation of computer-generated child sexual abuse material. Although legislation is in place to address possessing or distributing such horrendous material, we must prioritise the safety of children in this country and take the law a step further to prevent its creation. Our children must be kept safe and, subject to one reservation, which I will come to in a second, I support the amendment from the noble Baroness, Lady Kidron, to further protect them.

That reservation comes in proposed new subsection (1)(c), which includes in the offence the act of collating files that, when combined, enable the creation of sexual abuse material. This is too broad. A great deal of the collation of such material can be conducted by innocent people using innocent materials that are then corrupted or given more poisonous aspects by further training, fine-tuning or combination with other materials by more malign actors. I hope there is a way we can refine this proposed new paragraph on that basis.

Unfortunately, adults can also be the targets of those who use AI to digitally generate non-consensual explicit images or audio files of an individual, using their likeness and personal data. I am really pleased that my noble friend Lady Owen tabled Amendments 211G and 211H to create offences for these unacceptable, cruel acts. I support these amendments unambiguously.

--- Later in debate ---
Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, I very much support these amendments. I declare an interest as an owner of written copyright in the Good Schools Guide and as a father of an illustrator. In both contexts, it is very important that we get intellectual property right, as I think the Government recognised in what they put out yesterday. However, I share the scepticism of those who have spoken as to whether the Government’s ideas can be made to work.

It is really important that we get this straight. For those of us operating at the small end of the scale, IP is under continual threat from established media. I write maybe 10 or a dozen letters a year to large media outfits reminding them of the boundaries, the latest to the Catholic Herald—it appears not even the 10 commandments have force on them. But what AI can do is a great deal more difficult to deal with. I can absolutely see, by talking to Copilot, that it has gone through my paywall and absorbed the contents of the Good Schools Guide, but whom am I supposed to go after for this? Who has actually done the trespassing? Who is responsible for it? Where is the ownership? It is difficult to enforce copyright, even by writing a polite letter to someone saying, “Please don’t do this”. The Government appear to propose a system of polite letters saying, “Oh dear, it looks as if you might have borrowed my copyright. Please, can you give it back?”

This is not practically enforceable, and it will not result in people who care about IP locating their businesses here. Quite clearly, we do not have ownership of the big AI systems, and it is unlikely that we will have ownership of them—all that will be overseas. What we can do is create IP. If we have a system where we do not defend the IP that we produce then, fairly rapidly, those IP creators who are capable of being mobile will go elsewhere, to places that will defend their IP. That is something that a Government interested in growth really ought to be defending. I hope that we will see some real progress in the course of the Bill going through the House.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I declare my AI interests as set out in the register. I will speak in support of Amendments 204, 205 and 206, which have been spoken to so inspiringly by the noble Baroness, Lady Kidron, and so well by the noble Lords, Lord Freyberg, Lord Lucas and Lord Hampton, the noble Earl, Lord Clancarty, and the noble Viscount, Lord Colville. Each demonstrated different facets of the issue.

I co-chair the All-Party Group on AI and chaired the AI Select Committee a few years ago. I wrote a book earlier this year on AI regulation, which had a namecheck from the noble Baroness, Lady Jones, at Question Time, for which I was very grateful. Before that, I had a career as an IP lawyer, defending copyright and creativity, and in this House I have been my party’s creative industries spokesperson. The question of IP and the training of generative AI models is a key issue for me.

This is the case not just in the UK but around the world. Getty and the New York Times are suing in the United States, as are many writers, artists and musicians. It was at the root of the Hollywood actors’ and writers’ strikes last year. It is one thing to use the tech—many of us are AI enthusiasts—but it is another to be at the mercy of it.

Close to home, the FT has pointed out, using the index published online by the creator of an unlicensed dataset called Books3, that it is possible to identify over 85 books written by 33 Members of the House of Lords that have been pirated to train AI models by household names such as Meta, Microsoft and Bloomberg. Although it is absolutely clear that the use of copyrighted works to train AI models is contrary to UK copyright law, the laws around the transparency of these activities have not caught up. As we have heard, as well as using pirated e-books in their training data, AI developers scrape the internet for valuable professional journalism and other media, in breach of both the terms of service of websites and copyright law, to train commercial AI models. At present, developers can do this without declaring their identity, or they may take IP that was scraped to appear in a search index and use it for the completely different commercial purpose of training AI models.

How can rights owners opt out of something that they do not know about? AI developers will often scrape websites or access other pirated material before they launch an LLM in public. This means that there is no way for IP owners to opt out of their material being taken before its inclusion in these models. Once the material has been used to train these models, the commercial value, as we have heard, has already been extracted from IP scraped without permission, and there is no way to delete data from these models.

The next wave of AI models responds to user queries by browsing the web to extract valuable news and information from professional news websites. This is known as retrieval-augmented generation—RAG. Without paying for the commercial value they extract, AI agents built by companies such as Perplexity, Google and Meta will, in effect, free-ride on the professional hard work of journalists, authors and creators. At present, such crawlers are hard to block. There is no market failure; there are well-established licensing solutions. There is no uncertainty around the existing law; the position in UK law is absolutely clear that commercial organisations, including gen AI developers, must license the data that they use to train their large language models.

Here, as the Government’s intentions become clearer, the political, business and creative temperature is rising. Just this week, we have seen the creation of a new campaign, the Creative Rights in AI Coalition—CRAIC—across the creative and news industries and, recently, Ed Newton-Rex’s statement reached more than 30,000 signatories from among creators and creative organisations.

--- Later in debate ---
Lord Faulks Portrait Lord Faulks (Non-Afl)
- Hansard - - - Excerpts

The noble Lord has enormous experience in these areas and will be particularly aware of the legal difficulties in enforcing rights. Given what he said, with which I entirely agree—indeed, I agree with all the speakers in supporting these amendments—and given the extraordinary expense of litigating to enforce rights, how does he envisage there being an adequate system to allow those who have had their data scraped in the way that he describes to obtain redress or, rather, suitable remedies?

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

I thank the noble Lord for that. He is anticipating a paragraph in my notes, which says that, although it is not set out in the amendments, robust enforcement of these provisions will be critical to their success. This includes oversight from an expert regulator that is empowered to issue significant penalties, including fines for non-compliance. There is a little extra work to do there, and I would very much like to see the Intellectual Property Office gain some teeth.

I am going to close. We are nearly at the witching hour, but it is clear that AI developers are seeking to use their lobbying clout—the noble Baroness, Lady Kidron, mentioned the Kool-Aid—to persuade the Government that new copyright law is required. Instead, these amendments would clarify that UK copyright law applies to gen AI developers. The creative industries, and noble Lords from across the House as their supporters, will rally around these amendments and vigorously oppose government plans for a new text and data-mining exception.

--- Later in debate ---
Earl of Erroll Portrait The Earl of Erroll (CB)
- Hansard - - - Excerpts

My Lords, if I may just interject, I have seen this happen not just in the Horizon scandal. Several years ago, the banks were saying that you could not possibly find out someone’s PIN and were therefore refusing to refund people who had had stuff stolen from them. It was not until the late Professor Ross Anderson, of the computer science department at Cambridge University, proved that they had been deliberately misidentifying to the courts which counters were being read—which counters they should have been looking at—and explained exactly how you could get the system to default back to a different set of counters, that the banks eventually had to give way. But they went on lying to the courts for a long time. I am afraid that this is something that keeps happening again and again, and an amendment like this is essential for future justice for innocent people.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, it is a pity that this debate is taking place so late. I thank the noble Lord, Lord Arbuthnot, for his kind remarks, but my work ethic feels under considerable pressure at this time of night.

All I will say is that this is a much better amendment than the one that the noble Baroness, Lady Kidron, put forward for the Data Protection and Digital Information Bill, and I very strongly support it. Not only is this horrifying in the context of the past Horizon cases, but I read a report about the Capture software, which is likely to have created shortfalls that led to sub-postmasters being prosecuted as well. This is an ongoing issue. The Criminal Cases Review Commission is reviewing five Post Office convictions in which the Capture IT system could be a factor, so we cannot say that this is just about Horizon, as there are the many other cases that the noble Baroness cited.

We need to change this common law presumption even more in the face of a world in which AI use, with all its flaws and hallucinations, is becoming ever present, and we need to do it urgently.

Earl of Effingham Portrait The Earl of Effingham (Con)
- Hansard - - - Excerpts

My Lords, I thank the noble Baroness, Lady Kidron, for tabling her amendment. We understand its great intentions, which we believe are to prevent another scandal similar to that of Horizon and to protect innocent people from having to endure what thousands of postmasters have undergone and suffered.

However, while this amendment would make it easier to challenge evidence derived from, or produced by, a computer or computer system, we are concerned that, should it become law, this amendment could be misused by defendants to challenge good evidence. Our fear is that, in determining the reliability of such evidence, we may create a battle of the expert witnesses. This will not only substantially slow down trials but result in higher costs. Litigation is already expensive, and we would aim not to introduce additional costs to an already costly process unless absolutely necessary.

From our perspective, the underlying problem in the Horizon scandal was not that computer systems were critically wrong or that people were wrong, but that the two in combination drove the terrible outcomes that we have unfortunately seen. For many industries, regulations require firms to conduct formal systems validation, with serious repercussions and penalties should companies fail to do so. It seems to us that the disciplines of systems validation, if required for other industries, would be both a powerful protection and considerably less disruptive than potentially far-reaching changes to the law.

--- Later in debate ---
Lord Lucas Portrait Lord Lucas (Con)
- Hansard - - - Excerpts

My Lords, having a system such as this would really focus the public sector on how we can generate more datasets. As I said earlier, education is an obvious one, but so is mobile phone data. All these companies have their licences. If a condition of the licence was that the data on how people move around the UK became a public asset, that would be hugely beneficial to policy formation. If we really understood how, why and when people move, we would make much better decisions. We could save ourselves huge amounts of money. We really ought to have this as a deep focus of government policy.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I have far too little time to do justice to this subject. We on these Benches welcome this amendment. It is entirely consistent with the sovereign health fund proposed by Future Care Capital and, indeed, with the proposals from the Tony Blair Institute for Global Change on a similar concept called the national data trust. Indeed, this concept formed part of our Liberal Democrat manifesto at the last general election, so of course I support the amendment.

It would be very useful to hear more about the national data library, including on its purpose and operation, as the noble Baroness, Lady Kidron, said. I entirely agree with her that there is a great need for a sovereign cloud service or services. Indeed, the inability to guarantee that data on the cloud is held in this country is a real issue that has not yet been properly addressed.

Earl of Effingham Portrait The Earl of Effingham (Con)
- Hansard - - - Excerpts

My Lords, I thank the noble Baroness, Lady Kidron, for moving this amendment. As she rightly identified, the UK has a number of publicly held data assets, many of which contain extremely valuable information. This data—I flag, by way of an example, NHS data specifically—could be extremely valuable to certain organisations, such as pharmaceutical companies.

We are drawn to the idea of licensing such data—indeed, we believe that we could charge an extremely good price—but we have a number of concerns. Most notably, what additional safeguards would be required, given its sensitivity? What would be the limits and extent of the licensing agreement? Would this status close off other routes to monetising the data? Would other public sector bodies be able to use the data for free? Can this not already be done without the amendment?

Although His Majesty’s Official Opposition of course recognise the wish to ensure that the UK taxpayer gets a fair return on our information assets held by public bodies and arm’s-length organisations, and we certainly agree that we need to look at licensing, we are not yet sure that this amendment is either necessary or sufficient. We once again thank the noble Baroness, Lady Kidron, for moving it. We look forward to hearing both her and the Minister’s thoughts on the matter.

--- Later in debate ---
Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - - - Excerpts

My Lords, it is a pleasure to introduce this group of amendments. I have a 35-minute speech prepared. In moving Amendment 211B, I shall speak also to Amendments 211C to 211E. The reason for this group of amendments is to try to get an increased focus on the range of issues they touch on.

I turn to Amendment 211B first. It seems at least curious to have a data Bill without talking about data centres in terms of their power usage, their environmental impact and the Government’s view of the current PUE (power usage effectiveness) standard. Is it of a standard that they think gives the right measure of confidence to consumers and citizens across the country, in terms of how data centres are being operated and their impacts?

Similarly, on Amendment 211C, not enough consideration is given to supply chains. I am not suggesting that they are the most exciting subject, but you have to go only one or two steps back in any supply chain to get into real depths of opacity. With this amendment, I am seeking to gain more clarity on data supply chains and the role of data across all supply chains. Through the combination of data and AI, we could potentially enable a transformation of our supply chains in real time. That would give us so much more flexibility to pursue economic and environmental benefits. I look forward to the Minister’s response.

I now move on to Amendment 211D. It is always a pleasure to bring AI into a Bill that really does not want to have AI in it. I am interested in the whole question of data input and output, not least with large language models. I am also interested in the Government’s view on how this interacts with the Copyright, Designs and Patents Act 1988. There may be some mileage in looking into standards and approaches in this area, which could potentially go some way towards forming conditions of market access. We have some excellent examples to look at in other sectors of our economy and society, as set out in the amendment; I would welcome the Minister’s views on that.

I am happy that this group ends with Amendment 211E on the subject of public trust. In many ways, it is the golden thread that should run through everything when we talk about data; I wanted it to be the golden thread that ran through my AI regulation Bill. I always say that Clause 6 is the most important clause in that Bill because it goes to the question of public engagement and trust. Without that level of public engagement and trust, it does not matter how good the technologies are, how good the frameworks are or how good the chat around the data is. It might be golden but, if the public do not believe in it, they are not going to come and be part of it. The most likely consequence of this is that they will not be able to avail themselves of the benefits but they will almost certainly be saddled with the burdens. What these technologies enable is nothing short of a transformation of that discourse between citizen and state, with the potential to reimagine completely the social contract for the benefit of all.

Public engagement and public trust are the golden thread and the fuel for how we gain those economic, social and psychological benefits from the data. I will be very interested in the Minister’s response on what more could be done by the Government, because previous consultations, not least around some of these technologies, have been somewhat short of what we could achieve. With that #brevity and #our data, I beg to move.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I shall be #even shorter. Data centres and their energy consumption are important issues. I agree that at a suitable moment—probably not now—it would be very interesting to hear the Government’s views on them. Reports from UK parliamentary committees and the Government have consistently emphasised the critical importance of maintaining public trust in data use and AI, but sometimes the actions of the Government seem to run contrary to that. I support the noble Lord, Lord Holmes, in his call for realising the benefits of AI while making sure that we maintain public trust.

Earl of Effingham Portrait The Earl of Effingham (Con)
- Hansard - - - Excerpts

My Lords, I thank my noble friend Lord Holmes of Richmond for tabling this amendment. As we all appreciate, taking stock of the effects of legislation is critical, as it allows us to see what has worked and what has not. Amendment 211B would require the Secretary of State to launch a consultation into the implications of the provisions of the Bill on the power usage and energy efficiency of data centres. His Majesty’s Official Opposition have no objection to the amendment’s aims, but we wonder to what extent it is actually possible. By what means or benchmark can we identify whether a spike in energy usage is specifically due to a provision of this legislation, rather than the result of some other factor? I should be most grateful if my noble friend could provide further detail on this matter in his closing speech.

Regarding Amendment 211C, we understand that much could be learned from a review of all data regulations and standards pertaining to the supply chains for financial, trade, and legal documents and products, although we wonder if this needs to happen the moment this Bill passes. Could this review not happen at any stage? By all means, let us do it sooner rather than later, but is it necessary to set a date in statute?

Moving on to Amendment 211D, we should certainly look to regulate the AI large language model sector to ensure that there are standards for the input and output of data for LLMs. However, this must be done in a way that does not stifle growth in this emerging industry.

Finally, we have some concerns about Amendment 211E. A national consultation on the use of individuals’ data is perhaps just too broad.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, listening to the noble Lord, Lord Lucas, is often an education, and today is no exception. I had no idea what local environmental records centres were, so I shall be very interested to hear what the Minister has to say in response.

Earl of Effingham Portrait The Earl of Effingham (Con)
- Hansard - - - Excerpts

My Lords, I thank my noble friend Lord Lucas for tabling Amendment 211F and all noble Lords for their brief contributions to this group.

Amendment 211F would ensure that all the biodiversity data collected by or in connection with government is collected in local environmental records centres, so that records are as good as possible, and that the data is then used by or in connection with government, so that it is put to the best possible use.

The importance of sufficient and high-quality record collection cannot be overstated. With this in mind, His Majesty’s Official Opposition support the sentiment of the amendment in my noble friend’s name. These Benches will always champion matters related to biodiversity and nature recovery. In fact, many of my noble friends have raised concerns about biodiversity in Committee debates in your Lordships’ House on the Crown Estate Bill, the Water (Special Measures) Bill and the Great British Energy Bill. Indeed, they have tabled amendments to ensure that matters related to biodiversity appear at the forefront of draft legislation.

With that in mind, I am grateful to my noble friend Lord Lucas for introducing provisions, via Amendment 211F, which would require any planning application involving biodiversity net gain to include a data search report from the relevant local environmental records centre. I trust that the Minister has listened to the concerns raised collaboratively in the debate on this brief group. We must recognise the importance of good data collection and ensure that such data is used in the best possible way.

Debate on Amendment 87 resumed.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, in continuing on this group, I will speak to the question of whether Clause 78 should stand part, and to Amendments 107, 109, 125, 154, 155 and 156, but to start I support Amendment 87 in the name of the noble and learned Lord, Lord Thomas of Cwmgiedd. We had a masterclass from him last Tuesday and he made an extremely good case for that amendment, which is very elegant.

The previous Government deleted the EU Charter of Fundamental Rights from the statute book through the Retained EU Law (Revocation and Reform) Act 2023, and this Bill does nothing to restore it. Although references in the UK GDPR to fundamental rights and freedoms are now to be read as references to the ECHR as implemented through the Human Rights Act 1998, the Government’s ECHR memorandum states:

“Where processing is conducted by a private body, that processing will not usually engage convention rights”.


As the noble and learned Lord mentioned, this could leave a significant gap in protection for individuals whose data is processed by private organisations and will mean lower data protection rights in the UK compared with the EU, so these Benches strongly support his Amendment 87, which would apply the convention to private bodies where personal data is concerned. I am afraid we do not support Amendments 91 and 97 from the noble Viscount, Lord Camrose, which seem to hanker after the mercifully defunct DPDI.

We strongly support Amendments 139 and 140 from the noble Baroness, Lady Kidron. Data communities are one of the important omissions from the Bill. Where are the provisions that should be there to support data-sharing communities and initiatives such as Solid? We have been talking about data trusts and data communities since as long ago as the Hall-Pesenti review. Indeed, it is interesting that the Minister herself only this April said in Grand Committee:

“This seems to be an area in which the ICO could take a lead in clarifying rights and set standards”.


Indeed, she put forward an amendment:

“Our Amendment 154 would therefore set a deadline for the ICO to do that work and for those rights to be enacted. The noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, made a good case for broadening these rights in the Bill and, on that basis, I hope the Minister will agree to follow this up, and follow up his letter so that we can make further progress on this issue”.—[Official Report, 17/4/24; col. GC 322.]


I very much hope that, now the tables are turned, so to speak, the Minister will take that forward herself in government.

Amendments 154, 155 and 156 deal with the removal of the principle of the supremacy of EU law. They are designed to undo the lowering of the standard of data protection rights in the UK brought about by the Retained EU Law (Revocation and Reform) Act 2023 (REULA). The amendments would apply the protections required by Article 23.2 of the UK GDPR to all the relevant exemptions in Schedules 2 to 4 to the Data Protection Act 2018. This is important because data adequacy will be lost if the standard of protection of personal data in the UK is no longer essentially equivalent to that in the EU.

The EU’s adequacy decision stated that it did not apply in the area of immigration and referred to the case of Open Rights Group v the Secretary of State for the Home Department in the Court of Appeal. This case was brought after the UK left the EU, but before the REULA came into effect. The case is an example of how the preservation of the principle of the supremacy of EU law continued to guarantee high data protection standards in the UK, before this principle was deleted from the statute book by the REULA. In broad terms, the Court of Appeal found that the immigration exception in Schedule 2 to the Data Protection Act 2018 conflicted with the safeguards in Article 23 of the UK GDPR. This was because the immigration exemption was drafted too broadly and failed to incorporate the safeguards prescribed for exemptions under Article 23.2 of the UK GDPR. It was therefore held to be unlawful and was disapplied.

The Home Office redrafted the exemption to make it more protective, but it took several attempts to bring forward legislation which provided sufficient safeguards for data subjects. The extent of the safeguards now set out in the immigration exemption underscores both what is needed for compatibility with Article 23.2 of the UK GDPR and the deficiencies in the rest of the Schedule 2 exemptions. It is clear when reading the judgment in the Open Rights case that the majority of the exemptions from data subject rights under Schedule 2 to the Data Protection Act fail to meet the standards set out in Article 23.2 of the UK GDPR. The deletion of the principle of the supremacy of EU law has removed the possibility of another Open Rights-style challenge to the other exemptions in Schedule 2 to the Data Protection Act 2018. I hope that, ahead of the data adequacy discussions with the Commission, the Government’s lawyers have had a good look at the amendments that I have tabled, drafted by a former MoJ lawyer.

The new clause after Clause 107 proposed in Amendment 154 would apply the protections now found in the immigration exemption to the whole of Schedule 2 to the DPA 2018, with the exception of the exemptions that apply in the context of journalism or research, statistics and archiving. Unlike the other exemptions, these already contain detailed safeguards.

Amendment 155 is a new clause extending the protections which now apply to the immigration exemption to Schedule 3 to the DPA 2018, and Amendment 156 is another new clause applying those protections to Schedule 4 to the DPA 2018.

As regards Amendment 107, the Government need to clarify how data processing under recognised legitimate interests is compatible with conditions for data processing under existing lawful bases, including the special categories of personal data under Articles 5 and 9 of the UK GDPR. The Bill lowers the standard of the protection of personal data where data controllers only have to provide personal data based on

“a reasonable and proportionate search”.

The lack of clarity on what “reasonable” and “proportionate” mean in the context of data subject requests creates legal uncertainty for data controllers and organisations, specifically regarding whether the data subject’s views on the matter need to be taken into account when responding to requests. This is a probing amendment which requires the Secretary of State to explain why the existing lawful bases for data processing are inadequate for the processing of personal data when additional recognised legitimate interests are introduced. It requires the Secretary of State to publish guidance within six months of the Act’s passing to clarify what constitutes reasonable and proportionate protections of personal data.

Amendment 109 would insert a new clause, to ensure that data controllers assess the risk of collective and societal harms,

“including to equality and the environment”,

when carrying out data protection impact assessments. It requires them to consult affected people and communities while carrying out these assessments to improve their quality, and requires data controllers to publish their assessments to facilitate informed decision-making by data subjects and to enable data controllers to be held accountable.

Turning to whether Clause 78 should stand part: on top of Clause 77, Clause 78 would reduce the scope of transparency obligations and rights. Many AI systems are designed in a way that makes it difficult to retrieve personal data once ingested, or to understand how this data is being used. This is principally due not to technical limitations but to the decisions of AI developers who do not prioritise transparency and explainability.

As regards Amendment 125, it is clear that there are still further major changes proposed to the GDPR on police duties, automated decision-making and recognised legitimate interests, which makes the retention of data adequacy for the purposes of digital trade with the EU of the utmost priority in considering those changes. During the passage of the Data Protection and Digital Information Bill, I tabled an amendment to require the Government to publish an assessment of the impact of that Bill on EU/UK data adequacy within six months of the Act passing; I have tabled a similar amendment, with one change, to this Bill. As the next reassessment of data adequacy is set for June 2025, a six-month timescale may prove inconsequential to the overall adequacy decision. We therefore recommend stipulating that this assessment take place before that reassessment.

Baroness Jones of Whitchurch Portrait The Parliamentary Under-Secretary of State, Department for Business and Trade and Department for Science, Information and Technology (Baroness Jones of Whitchurch) (Lab)
- Hansard - - - Excerpts

My Lords, I thank all noble Lords for their consideration of these clauses. First, I will address Amendment 87 tabled by the noble and learned Lord, Lord Thomas, and the noble and learned Lord—sorry, the noble Lord—Lord Clement-Jones.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

I will take any compliment.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

We should take them while we can. Like the noble Lord, Lord Clement-Jones, I agree that the noble and learned Lord, Lord Thomas, made an excellent contribution. I appreciate this is a particularly technical area of legislation, but I hope I can reassure both noble Lords that the UK’s data protection law gives effect to convention rights and is designed to protect them. The Human Rights Act requires legislation to be interpreted compatibly with convention rights, whether processing is carried out by public or private bodies. ECHR rights are therefore a pervasive aspect of the rules that apply to public and private controllers alike. The noble and learned Lord is right that individuals generally cannot bring claims against private bodies for breaches of convention rights, but I reassure him that they can bring a claim for breaching the data protection laws giving effect to those rights.

I turn to Amendment 91, tabled by the noble Viscount, Lord Camrose, Amendment 107, tabled by the noble Lord, Lord Clement-Jones, and the question of whether Clause 78 should stand part, which all relate to data subject requests. The Government believe that transparency and the right of access is crucial. That is why they will not support a change to the language around the threshold for data subject requests, as this will undermine data subjects’ rights. Neither will the Bill change the current expectations placed on controllers. The Bill reflects the EU principle of proportionality, which has always underpinned this legislation, as well as existing domestic case law and current ICO guidance. I hope that reassures noble Lords.

Amendments 97 and 99, tabled by the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, relate to the notification exemption in Article 14 of the UK GDPR. I reassure noble Lords that the proportionality test provides an important safeguard for the existing exemption when data is collected from sources other than the data subject. The controller must always consider the impact on data subjects’ rights of not notifying. They cannot rely on the disproportionate effort exemption just because of how much data they are processing—even when there are many data subjects involved, such as there would be with web scraping. Moreover, a lawful basis is required to reuse personal data: a web scraper would still need to pass the balancing test to use the legitimate interest ground, as is usually the case.

The ICO’s recent outcomes report, published on 12 December, specifically referenced the process of web scraping. The report outlined:

“Web scraping for generative AI training is a high-risk, invisible processing activity. Where insufficient transparency measures contribute to people being unable to exercise their rights, generative AI developers are likely to struggle to pass the balancing test”.

--- Later in debate ---
Given the above reassurances, I hope noble Lords will agree not to press their amendments in this group.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

The Minister said there is a power to amend, but she has not said whether she thinks that would be desirable. Is the power to be used only if we are found not to be data-adequate because the immigration exemption does not apply across the board? That is, will the power be used only if we are forced to use it?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I reassure the noble Lord that, as he knows, we are very hopeful that we will have data adequacy so that issue will not arise. I will write to him to set out in more detail when those powers would be used.

--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

My Lords, I have co-signed Amendment 137. I do not need to repeat the arguments that have already been made by those who have spoken before me on it; they were well made, as usual. Again, it seems to expose a gap in where the Government are coming from in this area of activity, which should be at the forefront of all that they do but does not appear to be so.

As has just been said, this may be as simple as putting in an initial clause right up at the front of the Bill. Of course, that reminds me of the battle royal we had with the then Online Safety Bill in trying to get up front anything that made more sense of the Bill. It was another beast that was difficult to ingest, let alone understand, when we came to make amendments and bring forward discussions about it.

My frustration is that we are again talking about stuff that should have been well inside the thinking of those responsible for drafting the Bill. I do not understand why a lot of what has been said today has not already appeared in the planning for the Bill, and I do not think we will get very far by sending amendments back and forth that say the same thing again and again: we will only get the response that this is all dealt with and we should not be so trivial about it. Could we please have a meeting where we get around the table and try to hammer out exactly what we see as deficient in the Bill, set out very clearly for Ministers where we have red lines—that will make it very easy for them to understand whether they are going to meet them—and do it quickly?

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, the debate on this group emphasises how far behind the curve we are, whether it is by including new provisions in this Bill or by bringing forward an AI Bill—which, after all, was promised in the Government’s manifesto. It emphasises that we are not moving nearly fast enough in thinking about the implications of AI. While we are doing so, I need to declare an interest as co-chair of the All-Party Parliamentary Group on AI and a consultant to DLA Piper on AI policy and regulation.

I have followed the progress of AI since 2016 in the capacity of co-chair of the all-party group and chair of the AI Select Committee. We need to move much faster on a whole range of different issues. I very much hope that the noble Lord, Lord Vallance, will be here on Wednesday, when we discuss our crawler amendments, because although the noble Lord, Lord Holmes, has tabled Amendment 211A, which deals with personality rights, there is also extreme concern about the whole area of copyright. I was tipped off by the noble Lord, Lord Stevenson, so I was slightly surprised that he did not draw our attention to it: we are clearly due the consultation on intellectual property at any moment, but there seems to be some proposal within it for personality rights themselves. Whether that is a quid pro quo for a much-weakened situation on text and data mining, I do not know, but something appears to be moving out there which may become clear later this week. It seems a strange time to issue a consultation, but I recognise that it has been somewhat delayed.

In the meantime, we are forced to put forward amendments to this Bill trying to anticipate some of the issues that artificial intelligence is increasingly giving rise to. I strongly support Amendments 92, 93, 101 and 105 put forward by the noble Viscount, Lord Colville, to prevent misuse of Clause 77 by generative AI developers; I very much support the noble Lord, Lord Holmes, in wanting to see protection for image, likeness and personality; and I very much hope that we will get a positive response from the Minister in that respect.

We have heard from the noble Baronesses, Lady Kidron and Lady Harding, and the noble Lords, Lord Russell and Lord Stevenson, all of whom have made powerful speeches on previous Bills—the then Online Safety Bill and the Data Protection and Digital Information Bill—to say that children should have special protection in data protection law. As the noble Baroness, Lady Kidron, says, we need to move on from the AADC. That was a triumph she gained during the passage of the Data Protection Act 2018, but six years later the world looks very different and young people need protection from AI models of the kind she has set out in Amendment 137. I agree with the noble Lord, Lord Stevenson, that we need to talk these things through. If it produces an amendment to this Bill that is agreed, all well and good, but it could mean an amendment or part of a new AI Bill when that comes forward. Either way, we need to think constructively in this area because protection of children in the face of generative AI models, in particular, is extremely important.

This group, which looks forward to further harms that could be caused by AI and to how we can mitigate them in a number of different ways, is extremely important, despite the fact that these amendments appear to deal with quite a disparate group of issues.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, I too thank all noble Lords for their insightful contributions to this important group of amendments, even if some of them bemoaned the fact that they have had to repeat themselves over the course of several Bills. I am also very heartened to see how many people have joined us for Committee today. I have been involved in only two of these sittings, but this is certainly a record, and on present trends it is going to be standing room only, which is all to the good.

I have two observations before I start. First, we have to acknowledge that this area is perhaps among the most important we are going to discuss. The rights and protections of data subjects, particularly children, are in many ways the crux of all this, and we have to get it right. Secondly, I absolutely take on board that there is a real appetite to get ahead on AI legislation. I have an amendment I am very excited about later, when we come particularly to ADM, and there will be others as well, but I accept that we need to get going on that.

Amendment 92 in the names of the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, seeks to reduce the likelihood of the misuse of Clause 77 by AI model developers who may claim that they do not need to notify data subjects of reuse for scientific purposes under that clause. This relates to the way that personal data is typically collected and processed for AI development. Amendment 93 similarly seeks to reduce the possibility of misuse of Clause 77 by model developers who could claim they do not need to notify data subjects of reuse for scientific purposes. Amendments 101 and 105 likewise seek to address the potential misuse of Clause 77 by developers. I strongly support the intent of the amendments from the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, in seeking to maintain and make provision for the rights and protections of data subjects, and I look forward very much to hearing the views of the Minister.

I turn to Amendment 137 in the names of the noble Lords, Lord Russell and Lord Stevenson, and the noble Baronesses, Lady Kidron and Lady Harding. This amendment would require the commissioner to prepare and produce a code of practice which ensures that data processors prioritise the interests, rights and freedoms of children. It goes without saying that the rights and protection of children are of utmost importance. Certainly, this amendment looks to me not only practical but proportionate, and I support it.

Finally, Amendment 211A in the name of my noble friend Lord Holmes ensures the prohibition of

“the development, deployment, marketing and sale of data related to an individual’s image, likeness or personality for AI training”

without that person’s consent. Like the other amendments in this group, this makes provision to strengthen the rights and protections of data subjects against the potential misuse or sale of data and seems entirely sensible. I am sure the Minister has listened carefully to all the concerns powerfully raised from all sides of the Committee today. It is so important that we do not lose sight of the importance of the rights and protection of data subjects.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I welcome the amendments spoken to so well by the noble Baroness, Lady Harding, regarding the open electoral register. They are intended to provide legal certainty around the use of the register, without compromising on any aspect of the data privacy of UK citizens or risking data adequacy. The amendments specify that companies are exempt from the requirement to provide individuals with information in cases where their personal data has not been obtained directly from them if that data was obtained from the open electoral register. They also provide further clarification on what constitutes “disproportionate effort” under new paragraph 5(e) of Article 14 of GDPR.

The noble Baroness covered the ground so effectively that all I need to add is that the precedent established by the tribunal’s current interpretation will affect not only the open electoral register but other public sources of data, including the register of companies, the Registry of Judgments, Orders and Fines, the Land Registry and the Food Standards Agency register. Importantly, it may even prevent the important work being done to create a national data library from achieving its objectives of public sector data sharing. It will have far-reaching implications if we do not change the Bill in the way that the noble Baroness has put forward.

I thank the noble Lord, Lord Lucas, for his support for Amendment 160. I reciprocate in supporting—or, at least, hoping that we get clarification as a result of—his Amendments 158 and 161.

Amendment 159B seeks to ban what are colloquially known as cookie paywalls. As can be seen, it is the diametric opposite of Amendment 159A, tabled by the noble Viscount, Lord Camrose. For some unaccountable reason, cookie paywalls require a person who accesses a website or app to pay a fee to refuse consent to cookies being accessed from or stored on their device. Some of these sums can be exorbitant, so I was rather surprised by the noble Viscount’s counter-amendment.

Earlier this year, the Information Commissioner launched a call for views on its regulatory approach to consent-or-pay models under data protection law. The call for views highlighted that organisations that are looking to adopt, or have already adopted, a consent-or-pay model must consider the data protection implications.

Cookie paywalls are a scam and reduce people’s power to control their data. Why should someone have to pay if they do not consent to cookies being stored or accessed? The PEC regulations do not currently prohibit cookie paywalls. The relevant regulation is Regulation 6, which is due to be substituted by Clause 111 and supplemented by the new Schedule A1 to the PEC regulations, inserted by Schedule 12 to the Bill. The regulation, as substituted and supplemented, still does not prohibit cookie paywalls. This comes down to the detail of the regulations, both as they currently are and as they will be if the Bill remains as drafted. They are drafted in terms that, while not preventing a person signifying lack of consent to cookies, allow a provider to set controls—namely, to impose requirements—on how a person may signify that lack of consent. Cookie paywalls would therefore remain completely legal, and they certainly have proliferated online.

This amendment makes it crystal clear that a provider must not require a person to pay a fee to signify lack of consent to their data being stored or accessed. This would mean that, in effect, cookie paywalls would be banned.

Amendment 160 is sought by the Advertising Association. It seeks to ensure that the technical storage of or access to information is considered necessary under paragraph 5 of the new Schedule A1 to the PEC regulations, inserted by Schedule 12, if it would support measurement or verification of the performance of advertising services, allowing website owners to charge for their advertising more accurately. The Bill makes practical amendments to the PEC regulations by listing the types of cookies that no longer require consent.

This is important, as not all cookies should be treated the same and not all carry the same high-level risks to personal privacy. Some are integral to the service and the website itself and are extremely important for subscription-free content offered by publishers, which is principally funded by advertising. Introducing specific and targeted cookie exemptions has the benefit of, first, simplifying the cookie consent banner and, secondly, further increasing legal and economic certainty for online publishers. As I said when we debated the DPDI Bill, audience measurement is an important function for media owners to determine the consumption of content, to be able to price advertising space for advertisers. Such metrics are crucial to assess the effectiveness of a media channel. For sites that carry advertising, cookies are used to verify the delivery and performance of a digital advertisement—ie, confirmation that an ad has been served or presented to a user and whether it has been clicked on. This is essential information for invoicing an advertiser accurately for the number of ad impressions in a digital ad campaign.

However, my reading of the Bill suggests that audience measurement cookies would be covered by the list of exemptions from consent under Schedule 12. Can the Government confirm this? Is it the Government’s intention to use secondary legislation in future to exempt ad performance cookies?

Coming to Amendment 162, relating to the soft opt-in, I am grateful to the noble Lord, Lord Black of Brentwood, and the noble Baroness, Lady Harding of Winscombe, for their support. This amendment would enable charities to communicate with donors in the same way that businesses have been able to communicate with customers since 2003. Such a clause would help to facilitate greater fundraising and support the important work that charities do for society. I can do no better than quote from the letter sent to the Secretary of State, Peter Kyle, on 25 November, which was co-ordinated by the DMA and involved nearly 20 major charities, seeking support for reinstating the original Clause 115 of the DPDI Bill into this Bill:

“Clause 115 of the previous DPDI Bill extended the ‘soft opt-in’ for email marketing for charities and non-commercial organisations. The DMA estimates that extending the soft opt-in to charities would increase annual donations in the UK by £290 million”,


based on analysis of 13.1 million donors by the Salocin Group. The letter continues:

“At present, the DUA Bill proposals remove this. The omission of the soft opt-in will prevent charities from being able to communicate to donors in the same way as businesses can. As representatives of both corporate entities and charitable organisations, it is unclear to the DMA why charities should be at a disadvantage in this regard”.


I hope that the Government will listen to the DMA and the charities involved.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank noble Lords for their comments and contributions. I shall jump to Amendments 159 and 159A, one of which is in my name and both of which are concerned with cookie paywalls. I am not sure I can have properly understood the objection to cookie paywalls. Do they not simply offer users three choices: pay money and stay private; share personal data and read for free; or walk away? So many times, we have all complained about the fact that these websites harvest our data and now, for the first time, this approach sets a clear cash value on the data that they are harvesting and offers us the choice. The other day somebody sent me a link from the Sun. I had those choices. I did not want to pay the money or share my data, so I did not read the article. I feel this is a personal decision, supported by clear data, which it is up to the individual to take, not the Government. I do not think we should take away this choice.

Let me turn to some of the other amendments in this group. Amendment 161 in the name of my noble friend Lord Lucas is, if I may say so, a thoughtful amendment. It would allow pension providers to communicate information on their product. This may mean that the person who will benefit from that pension does not miss out on useful information that would benefit their saving for retirement. Given that pension providers already hold the saver’s personal data, it seems to be merely a question of whether this information is wanted; of course, if it is not, the saver can simply opt out.

Amendment 162 makes an important point: many charities rely on donations from the public. Perhaps we should consider bringing down the barriers to contacting people regarding fundraising activities. At the very least, I am personally not convinced that members of the public have different expectations around what kinds of organisation can and cannot contact them and in what circumstances, so I support any step that simplifies the—to my mind—rather arbitrary differences in the treatment of business and charity communications.

Amendment 104 certainly seems a reasonable addition to the list of what might constitute “unreasonable effort” if the information is already public. However, I have some concerns about Amendments 98 and 100 to 103. For Amendment 98, who would judge the impact on the individual? I suspect that the individual and the data controllers may have different opinions on this. In Amendment 100, the effort and cost of compliance are thorny issues that would surely be dictated by the nature of the data itself and the reason for providing it to data subjects. In short, I am concerned that the controllers’ view may be more subjective than we would want.

On Amendment 102, again, when it comes to providing information to them,

“the damage and distress to the data subjects”

is a phrase on which the subject and the controller will almost inevitably have differing opinions. How will these be balanced? Additionally, one might presume that information that is either damaging or distressing to the data subjects should not necessarily be withheld from them as it is likely to be extremely important.

--- Later in debate ---
Amendment 159A from the noble Viscount, Lord Camrose, is aimed at enabling cookie paywalls. Conversely, as we have identified, Amendment 159 from the noble Lord, Lord Clement-Jones, seeks to ban their use. Generally, these paywalls work by giving web users the option to pay for a cookie-free browsing experience. Earlier this year the Information Commissioner launched a call for views on “consent or pay” models for cookies. The aim of the call for views is to provide the online advertising industry with clarity on how advertising cookies and paywalls can be used in compliance with data protection and privacy laws. We will consider the Information Commissioner’s findings when he publishes his response to this call for views. It would be premature to make legal changes without considering those findings or consulting interested parties. I hope noble Lords will bear that in mind.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

When does the Minister anticipate that the ICO will produce that report?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I do not have the detail of all that. Obviously, the call for views has only recently gone out and he will need time for consideration of the responses. I hope the noble Lord will accept that the ICO is on the case on this matter. If we can provide more information, we will.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

May I ask the Minister a hypothetical question? If the ICO believes that these are not desirable, what instruments are there for changing the law? Can the ICO, under its own steam, so to speak, ban them; do we need to do it in primary legislation; or can it be done in secondary legislation? If the Minister cannot answer now, perhaps she can write to me.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Of course I will write to the noble Lord. It will be within the ICO’s normal powers to make changes where he finds that they are necessary.

I move to Amendment 160, tabled by the noble Lord, Lord Lucas, which seeks to create a new exemption for advertising performance cookies. There is a balance to strike between driving growth in the advertising, news and publishing sectors and ensuring that people retain choice and control over how their data is used. To exempt advertising measurement cookies, we would need to assess how intrusive these cookies are, including what they track and where data is sent. We have taken a delegated power so that exemptions to the prohibition can be added in future once the evidence supports it and we can devise appropriate safeguards to minimise privacy risks. In the meantime, we have been actively engaging with the advertising and publishing sectors on this issue and will continue to work with them to consider the potential use of the regulation-making power. I hope that the noble Lord will accept that this is work in progress.

Amendment 161, also from the noble Lord, Lord Lucas, aims to extend the soft opt-in rule under the privacy and electronic communications regulations to providers of auto-enrolment pension schemes. The soft opt-in rule removes the need for some commercial organisations to seek consent for direct marketing messages where there is an existing relationship between the organisation and the customer, provided the recipient did not object to receiving direct marketing messages when their contact details were collected.

The Government recognise that people auto-enrolled by their employers in workplace pension schemes may not have an existing relationship with their pension provider, so I understand the noble Lord’s motivations for this amendment. However, pension providers have opportunities to ask people to express their direct mail preferences, such as when the customer logs on to their account online. We are taking steps to improve the support available for pension holders through the joint Government and FCA advice guidance boundary review. The FCA will be seeking feedback on any interactions of proposals with direct marketing rules through that consultation process. Again, I hope the noble Lord will accept that this issue is under active consideration.

Amendment 162, tabled by the noble Lord, Lord Clement-Jones, would create an equivalent provision to the soft opt-in but for charities. It would enable a person to send electronic marketing without permission to people who have previously expressed an interest in their charitable objectives. The noble Lord will recall, as he has noted, that the DPDI Bill included a provision similar to his amendment. The Government removed it from that Bill due to concerns that it would increase direct marketing from political parties. I think we all accepted at the time that we did not want that to happen.

As the noble Lord said, his amendment is narrower because it focuses on communications for charitable purposes, but it could still increase the number of messages received by people who have previously expressed an interest in the work of charities. We are listening carefully to arguments for change in this area and will consider the points he raises, but I ask that he withdraws his amendment while we consider its potential impact further. We are happy to have further discussions on that.

--- Later in debate ---
Moved by
108: Clause 79, page 93, line 18, leave out “court” and insert “tribunal”
Member’s explanatory statement
This amendment is consequential on the new Clause (Transfer of jurisdiction of courts to tribunals).
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, in moving Amendment 108, I will also speak to all the other amendments in this group. They are all designed to transfer jurisdiction under the existing provisions from the courts to the tribunals and to simplify the enforcement of data rights. Is that not something to be desired? This is not just a procedural change but a necessary reform to ensure that the rights granted on paper translate into enforceable rights in reality.

The motivation for these amendments stems from recurring issues highlighted in cases such as Killock and Veale v the Information Commissioner, and Delo v the Information Commissioner. These cases revealed a troubling scenario where the commissioner presented contradictory positions across different levels of the judiciary, exacerbating the confusion and undermining the credibility of the regulatory framework governing data protection. In these cases, the courts have consistently pointed out the confusing division of jurisdiction between different courts and tribunals, which not only complicates the legal process but wastes considerable public resources. As it stands, individuals often face the daunting task of determining the correct legal venue for their claims, a challenge that has proved insurmountable for many, leading to denied justice and unenforced rights.

By transferring all data protection provisions from the courts to more specialised tribunals, which are better equipped to handle such cases, and clarifying the right to appeal decisions made by the commissioner, these amendments seek to eliminate unnecessary legal barriers. Many individuals, often representing themselves and lacking legal expertise, face the daunting challenge of navigating complex legal landscapes, deterred by high legal costs and the intricate determination of appropriate venues for their claims. This shift will not only reduce the financial burden on individuals but enhance the efficiency and effectiveness of the judicial process concerning data protection. By simplifying the legal landscape, we can safeguard individual rights more effectively and foster a more trustworthy digital environment.

--- Later in debate ---
Lord Vallance of Balham Portrait The Minister of State, Department for Science, Innovation and Technology (Lord Vallance of Balham) (Lab)
- Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones, for his Amendments 108, 146 to 153 and 157, and I am grateful for the comments by the noble Lord, Lord Holmes, and the noble Viscount, Lord Camrose.

The effect of this group of amendments would be to make the First-tier Tribunal and the Upper Tribunal responsible for all data protection cases. They would transfer ongoing as well as future cases out of the court system to the relevant tribunals and, as has been alluded to, may cause more confusion in doing so.

As the noble Lord is aware, there is currently a blend of jurisdiction under the data protection legislation for both tribunals and courts according to the nature of the proceedings in question. This is because certain types of cases are appropriate to fall under tribunal jurisdiction while others are more appropriate for court settings. For example, claims by individuals against organisations for breaches of legal requirements can result in awards of compensation for the individuals and financial and reputational damage for the organisations. It is appropriate that such cases are handled by a court in conformance with its strict procedural and evidential rules. Indeed, in the Killock and Delo cases, it was noted that there could be additional confusion over the ability to move between those two possibilities if jurisdiction went solely to one of the tribunals.

On the transfer of responsibility for making tribunal procedural rules from the Tribunal Procedure Committee to the Lord Chancellor, we think that would be inappropriate. The committee comprises legal experts appointed or nominated by senior members of the judiciary or the Lord Chancellor. This committee is best placed to make rules to ensure that tribunals are accessible and fair and that cases are dealt with quickly and efficiently. It keeps the rules under constant review to ensure that they are fit for purpose in line with new appeal rights and the most recent legislative changes.

Amendment 151 would also introduce a statutory appeals procedure for tribunals to determine the merits of decisions made by the Information Commissioner. Data subjects and controllers alike can already challenge the merits of the Information Commissioner’s decisions by way of judicial review, which preserves the discretion and independence of the Information Commissioner’s decision-making, so no statutory procedure is needed. The Government therefore believe that the current jurisdictional framework is well balanced and equitable, and that it provides effective and practical routes of redress for data subjects and controllers as well as appropriate safeguards to ensure compliance by organisations. For these reasons, I hope the noble Lord will not press his amendments.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I thank the Minister for his response to my amendments and welcome him to the Dispatch Box and a whole world of pain on the Data (Use and Access) Bill, as he has, no doubt, noted already after just two hours’ worth of this Committee.

I found his response disappointing, and I think both he and the noble Viscount, Lord Camrose, have misunderstood the nature of this situation. This is not a blend, which is all beautifully logical depending on the nature of the case. This is an absolute mishmash where the ordinary litigant is faced with great confusion, not knowing quite often whether to go to the court or a tribunal, where the judges themselves have criticised the confusion and where there appears to be no appetite, for some reason, in government for a review of the jurisdictions.

I felt that the noble Viscount was probably reading from his previous ministerial brief. Perhaps he looked back at Hansard for what he said on the DPDI Bill. It certainly sounded like that. The idea that the courts are peerless in their legal interpretation and the poor old tribunals really just do not know what they are doing is wrong. They are expert tribunals, you can appear before them in person and there are no fees. It is far easier to access a tribunal than a court and certainly, as far as appeals are concerned, the idea that the ordinary punter is going to take judicial review proceedings, which seems to be the implication of staying with the current system on appeals if the merits of the ICO’s decisions are to be examined, is quite breathtaking. I know from legal practice that JR is not cheap. Appearing before a tribunal and using that as an appeal mechanism would seem far preferable.

I will keep on pressing this because it seems to me that at the very least the Government need to examine the situation to have a look at what the real objections are to the jurisdictional confusion and the impact on data subjects who wish to challenge decisions. In the meantime, I beg leave to withdraw the amendment.

Amendment 108 withdrawn.
--- Later in debate ---
Moved by
110: Clause 80, page 94, line 24, at end insert—
“3. To qualify as meaningful human involvement, the review must be performed by a person with the necessary competence, training, authority to alter the decision and analytical understanding of the data.”
Member’s explanatory statement
This amendment would make clear that in the context of new Article 22A of the UK GDPR, for human involvement to be considered as meaningful, the review must be carried out by a competent person.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I beg to move Amendment 110 and will speak to Amendments 112, 114, 120, 121, 122 and 123, and to Clause 80 stand part. As we have heard, artificial intelligence and algorithmic and automated decision-making tools are increasingly being used across the public sector to make and support many of the highest-impact decisions affecting individuals, families and communities across healthcare, welfare, education, policing, immigration and many other sensitive areas of an individual’s life.

The Committee will be pleased to hear that I will not repeat the contents of my speech on my Private Member’s Bill on this subject last Friday. But the fact remains that the rapid adoption of AI in the public sector presents significant risks and challenges, including: the potential for unfairness, discrimination and misuse, as demonstrated by scandals such as the UK’s Horizon and Australia’s Robodebt cases; automated decisions that are prone to serious error; lack of transparency and accountability in automated decision-making processes; privacy and data protection concerns; algorithmic bias; and the need for human oversight.

--- Later in debate ---
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, we have had a really profound and significant debate on these issues; it has been really helpful that they have been aired by a number of noble Lords in a compelling and articulate way. I thank everybody for their contributions.

I have to say at the outset that the Government want data protection rules fit for the age of emerging technologies. The noble Lord, Lord Holmes, asked whether we are addressing issues of the past or issues of the future. We believe that the balance we have in this Bill is exactly about addressing the issues of the future. Our reforms will reduce barriers to the responsible use of automation while clarifying that organisations must provide stringent safeguards for individuals.

I stress again how seriously we take these issues. A number of examples have been quoted as the debate has gone on. I say to those noble Lords that examples were given where there was no human involved. That is precisely what the new provisions in this Bill attempt to address, in order to make sure that there is meaningful human involvement and people’s futures are not being decided by an automated machine.

Amendment 110 tabled by the noble Lords, Lord Clement-Jones and Lord Knight, seeks to clarify that, for human involvement to be meaningful, it must be carried out by a competent person. Our reforms make clear that a solely automated decision is one lacking meaningful human involvement, and that such involvement must go beyond a tick-box exercise. The ICO guidance also clarifies that

“the human involvement has to be active and not just a token gesture”;

that right is absolutely underpinned by the wording of the regulations here.

I turn next to Amendment 111. I can assure—

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I was listening very carefully. Does “underpinned by the regulations” mean that it will be underpinned?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Yes. The provisions in this Bill cover exactly that concern.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

The issue of meaningful human involvement is absolutely crucial. Is the Minister saying that regulations issued by the Secretary of State will define “meaningful human involvement”, or is she saying that it is already in the primary legislation, which is not my impression?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Sorry—it is probably my choice of language. I am saying that it is already in the Bill; it is not intended to be separate. I was talking about whether solely automated decisions lack meaningful human involvement. This provision is already set out in the Bill; that is the whole purpose of it.

On Amendment 111, I assure the noble Viscount, Lord Camrose, that controllers using solely automated processing are required to comply with the data protection principles. I know that he was anticipating this answer, but we believe that it captures the principles he proposes and achieves the same intended effect as his amendment. I agree with the noble Viscount that data protection is not the only lens through which AI should be regulated, and that we cannot address all AI risks through the data protection legislation, but the data protection principles are the right ones for solely automated decision-making, given its place in the data protection framework. I hope that that answers his concerns.

On Amendment 112, which seeks to prohibit solely automated decisions that contravene the Equality Act 2010, I assure the noble Lords, Lord Clement-Jones and Lord Knight, that the data protection framework is clear that controllers must adhere to the Equality Act.

Amendments 113 and 114 would extend solely automated decision-making safeguards to predominantly automated decision-making. I assure the noble and learned Lord, Lord Thomas, the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, that the safeguards in Clause 80 are designed to protect individuals where meaningful human involvement is lacking. Predominantly automated decision-making will already include meaningful human involvement and therefore does not require these additional safeguards.

On Amendments 114A and 115A, tabled by the noble Viscount, Lord Camrose, many noble Lords have spoken in our debates about the importance of future-proofing the legislation. These powers are an example of that: without them, the Government will not have the ability to act quickly to update protections for individuals in the light of rapid technology developments.

I assure noble Lords that the regulation powers are subject to a number of safeguards. The Secretary of State must consult the Information Commissioner and have regard to other relevant factors, which can include the impact on individuals’ rights and freedoms as well as the specific needs and rights of children. As with all regulations, the exercise of these powers must be rational; they cannot be used irrationally or arbitrarily. Furthermore, the regulations will be subject to the affirmative procedure and so must be approved by both Houses of Parliament.

I assure the noble Lord, Lord Clement-Jones, that one of the powers means that his Amendment 123 is not necessary, as it can be used to describe specifically what is or is not meaningful human involvement.

Amendment 115A, tabled by the noble Viscount, Lord Camrose, would remove the reforms to Parts 3 and 4 of the Data Protection Act, thereby putting them out of alignment with the UK GDPR. That would cause confusion and ambiguity for data subjects.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

I am sorry to interrupt again as we go along but, a sentence or so ago, the Minister said that the definition in Amendment 123 of meaningful human involvement in automated decision-making was unnecessary. The amendment is designed to change matters. It would not be the Secretary of State who determined the meaning of meaningful human involvement; in essence, it would be initiated by the Information Commissioner, in consultation with the Secretary of State. So I do not quite understand why the Minister used “unnecessary”. It may be an alternative that is undesirable, but I do not understand why she has come to the conclusion that it is unnecessary. I thought it was easier to challenge the points as we go along rather than at the very end.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, we would say that a definition in the Bill is not necessary because it is dealt with case by case and is supplemented by these powers. The Secretary of State does not define meaningful human involvement; it is best done case by case, supported by the ICO guidance. I hope that that addresses the noble Lord’s point.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

That is slightly splitting hairs. The noble Viscount, Lord Camrose, might want to comment because he wanted to delete the wording that says:

“The Secretary of State may by regulations provide that … there is, or is not, to be taken to be meaningful human involvement”.


He certainly will determine—or is able to determine, at least—whether or not there is human involvement. Surely, as part of that, there will need to be consideration of what human involvement is.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - - - Excerpts

Will the Minister reflect on the issues around a case-by-case basis? If I were running an organisation of any sort and decided I wanted to use ADM, how would I make a judgment about what is meaningful human involvement on a case-by-case basis? It implies that I would have to hope that my judgment was okay, because I have not had clarity from anywhere else and, in retrospect, someone might come after me if I got that judgment wrong. I am not sure that works, so will she reflect on that at some point?

--- Later in debate ---
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I am happy to write.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I thank the Minister for her very detailed and careful response to all the amendments. Clearly, from the number of speakers in this debate, this is one of the most important areas of the Bill and one that has given one of the greatest degrees of concern, both inside and outside the Committee. I think the general feeling is that there is still concern. The Minister is quite clear that the Government are taking these issues seriously, in terms of ADM itself and the impact in the workplace, but there are missing parts here. If you add all the amendments together—no doubt we will read Hansard and, in a sense, tick off the areas where we have been given an assurance about the interpretation of the Bill—there are still great gaps.

It was very interesting to hear what the noble Lord, Lord Kamall, had to say about how the computer said “no” as he reached the gate. A lot of this is about communications. I would be very interested if any letter to the noble Lord, Lord Lucas, was copied more broadly, because that is clearly one of the key issues. It was reassuring to hear that the ICO will be on top of this in terms of definitions, guidance, audit and so on, and that we are imminently to get the publication of the records of algorithmic systems in use under the terms of the algorithmic transparency recording standard.

We have had some extremely well-made points from the noble Viscounts, Lord Colville and Lord Camrose, the noble Lords, Lord Lucas, Lord Knight and Lord Holmes, and the noble Baroness, Lady Kidron. I am not going to unpack all of them, but we clearly need to take this further and chew it over before we get to Report. I very much hope that the Minister will regard a “will write” letter on stilts as required before we go very much further, because I do not think we will be fully satisfied by this debate.

The one area where I would disagree is on treating solely automated decision-making as the pure subject of the Clause 80 rights. Looking at it in the converse, it is perfectly proper to regard something that does not have meaningful human involvement as predominantly automated decision-making. I do not think, in the words of the noble Viscount, Lord Camrose, that this does muddy the waters. We need to be clearer about what we regard as being automated decision-making for the purpose of this clause.

There is still quite a lot of work to do in chewing over the Minister’s words. In the meantime, I beg leave to withdraw my amendment.

Amendment 110 withdrawn.
--- Later in debate ---
This seeks to retain the requirement for police forces to record the reason they are accessing data from a police database.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, a key aspect of data protection rests in how it restricts the use of personal data once it has been collected. The public need confidence that their data will be used for the purposes for which they shared it and not further used in ways that breach their legitimate expectations—otherwise they will become suspicious about providing their data. The underlying theme that we heard on the previous group was the danger of losing public trust, which very much applies in the area of law enforcement and national security.

However, Schedules 4 and 5 would remove the requirement to consider the legitimate expectations of the individuals whose data is being processed, or the impact that this would have on their rights, for the purposes of national security, crime detection and prevention, safeguarding or responding to a request by a public authority. Data used for the purposes listed in these schedules would not need to undergo either a balancing test under Article 6.1(f) or a compatibility test under Article 6.4 of the UK GDPR. The combined effect of these provisions would be to authorise almost unconditional data sharing for law enforcement and other public security purposes while, at the same time, reducing accountability and traceability over how the police use the information being shared with them.

As with the previous DPDI Bill, Clauses 87 to 89 of this Bill grant the Home Secretary and police powers to view and use people’s personal data through the use of national security certificates and designation notices, which are substantially the same as Clauses 28 to 30 of that Bill. This risks further eroding trust in law enforcement authorities. Accountability for access to data for law enforcement purposes should not be lowered, and data sharing should be underpinned by a robust test to ensure that individuals’ rights and expectations are not disproportionately impacted. It is baffling why the Government are so slavishly following their predecessor and believe that these new and unaccountable powers are necessary.

By opposing that Clause 81 stand part, I seek to retain the requirement for police forces to record the reason they are accessing data from a police database. The public need more, not less, transparency and accountability over how, why and when police staff and officers access and use records about them. Just recently, the Met Police admitted that they investigated more than 100 staff over the inappropriate accessing of information in relation to Sarah Everard. This shows that the police can and do access information inappropriately, and there may well be less prominent cases where police abuse their power by accessing information without fear of the consequences.

Regarding Amendments 126, 128 and 129, Rights and Security International has repeatedly argued that the Bill would violate the UK’s obligations under the European Convention on Human Rights. On Amendment 126, the requirements in the EU law enforcement directive for logging are, principally, to capture in all cases the justification for personal data being examined, copied, amended or disclosed when it is processed for a law enforcement process—the objective is clearly to ensure that data is processed only for a legitimate purpose—and, secondarily, to identify when, how and by whom the data has been accessed or disclosed. This ensures that individual accountability is captured and recorded.

Law enforcement systems in use in the UK typically capture some of the latter information in logs, but very rarely do they capture the former. Nor, I am informed, do many commodity IT solutions on the market capture by default why data was accessed or amended. For this reason, a long period of time was allowed under the law enforcement directive to modify legacy systems installed before May 2016, which, in the UK, included services such as the Police National Computer and the Police National Database, along with many others at a force level. This transitional relief extended to 6 May 2023, but UK law enforcement did not, in general, make the required changes. Nor, it seems, did it ensure that all IT systems procured after 6 May 2016 included a strict requirement for LED-aligned logging. By adopting and using commodity and hyperscaler cloud services, it has exacerbated this problem.

In early April 2023, the Data Protection Act 2018 (Transitional Provision) Regulations 2023 were laid before Parliament. These regulations had the effect of unilaterally extending the transitional relief period under the law enforcement directive for the UK from May 2023 to May 2026. The Government now wish to strike out completely the requirement to capture the justification for any access to data, on the basis that this would free up approximately 1.5 million hours a year of valuable police time for our officers so that they can focus on tackling crime on our streets, rather than being bogged down by administration, and that this would save approximately £42.8 million per year in taxpayers’ money.

This is a serious legislative issue on two counts: it removes important evidence that may identify whether a person was acting with malicious intent when accessing data, as well as removing the deterrent effect of having to record a justification; and it directly deviates from a core part of the law enforcement directive and will clearly have an impact on UK data adequacy. The application of effective control over access to data is very much a live issue in policing, and changing the logging requirement in this way does nothing to improve police data management. Rather, it excuses and perpetuates bad practice. Nor does it increase public confidence.

Clause 87(7) introduces new Section 78A into the Act. This lays down a number of exemptions and exclusions from Part 3 of that Act when the processing is deemed to be in the interests of national security. These exemptions are wide-ranging, and include the ability to suspend or ignore principles 2 to 6 in Part 3, and thus run directly contrary to the provisions and expectations of the EU law enforcement directive. Ignoring those principles in itself also negates many of the controls and clauses in Part 3 in its entirety. As a result, they will almost certainly lead to the immediate loss of EU law enforcement adequacy.

I welcome the ministerial letter from the noble Lord, Lord Hanson of Flint, to the noble Lord, Lord Anderson, of 6 November, but was he really saying that all the national security exemption clause does is bring the 2018 Act into conformity with the GDPR? I very much hope that the Minister will set out for the record whether that is really the case and whether it is really necessary to safeguard national security. Although it is, of course, appropriate and necessary for the UK to protect its national security interests, it is imperative that balance remains to protect the rights of a data subject. These proposals do not, as far as we can see, strike that balance.

Clause 88 introduces the ability of law enforcement, competent authorities and intelligence agencies to act as joint controllers in some circumstances. If Clause 88 and associated clauses go forward to become law, they will almost certainly again result in withdrawal of UK law enforcement adequacy and will quite likely impact on the TCA itself.

Amendment 127 is designed to bring attention to the fact that there are systemic issues with UK law enforcement’s new use of hyperscaler cloud service providers to process personal data. These issues stem from the fact that service providers’ standard contracts and terms of service fail to meet the requirements of Part 3 of the UK’s Data Protection Act 2018 and the EU law enforcement directive. UK law enforcement agencies are subject to stringent data protection laws, including Part 3 of the DPA and the GDPR. These laws dictate how personal data, including that of victims, witnesses, suspects and offenders, can be processed. Part 3 specifically addresses data transfers to third countries, with a presumption against such transfers unless strictly necessary. This contrasts with UK GDPR, which allows routine overseas data transfer with appropriate safeguards.

Cloud service providers routinely process data outside the UK and lack the necessary contractual guarantees and legal undertakings required by Part 3 of the DPA. As a result, their use for law enforcement data processing is, on the face of it, not lawful. This non-compliance creates significant financial exposure for the UK, including potential compensation claims from data subjects for distress or loss. The sheer volume of data processed by law enforcement, particularly body-worn video footage, exacerbates the financial risk. If only a small percentage of cases result in claims, the compensation burden could reach hundreds of millions of pounds annually. The Government’s attempts to change the law highlight the issue and suggest that past processing on cloud service providers has not been in conformity with the UK GDPR and the DPA.

The current effect of Section 73(4)(b) of the Data Protection Act is to prevent competent authorities, which may have a legitimate operating need and should possess the internal capability to assess that need, from making transfers to recipients that are not relevant authorities or international organisations, such as cloud service providers. This amendment is designed to probe what impact removal of this restriction would have and whether it would enable such transfers where they are justified and necessary. I beg to move.

Baroness Morgan of Cotes Portrait Baroness Morgan of Cotes (Non-Afl)
- Hansard - - - Excerpts

My Lords, I will speak to Amendment 124. I am sorry that I was not able to speak on this issue at Second Reading. I am grateful to the noble and learned Lord, Lord Thomas of Cwmgiedd, for his support, and I am sorry that he has not been able to stay, due to a prior engagement.

Eagle-eyed Ministers and the Opposition Front Bench will recognise that this was originally tabled as an amendment to the Data Protection and Digital Information (No. 2) Bill. It is still supported by the Police Federation. I am grateful to the former Member of Parliament for Loughborough for originally raising this with me, and I thank the Police Federation for its assistance in briefing us in preparing this draft clause. The Police Federation understands that the Home Secretary is supportive of the objective of this amendment, so I shall listen with great interest to what the Minister has to say.

This is a discrete amendment designed to address an extremely burdensome and potentially unnecessary redaction exercise, in relation to a situation where the police are preparing a case file for submission to the Crown Prosecution Service for a charging decision. Given that this issue was talked about in the prior Bill, I do not intend to go into huge amounts of detail because we rehearsed the arguments there, but I hope very much that with the new Government there might be a willingness to entertain this as a change in the law.

--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, none of us can be under any illusion about the growing threats of cyberattacks, whether from state actors, state-affiliated actors or criminal gangs. It is pretty unusual nowadays to find someone who has not received a phishing email, had hackers target an account or been promised untold riches by a prince from a faraway country. But, while technology has empowered these criminals, it is also the most powerful tool we have against them. To that end, we must do all we can do to assist the police, the NCA, the CPS, the SIS and their overseas counterparts in countries much like our own. That said, we must also balance this assistance with the right of individuals to privacy.

Regarding the Clause 81 stand part notice from the noble Lord, Lord Clement-Jones, I respectfully disagree with this suggestion. If someone within the police were to access police records in an unauthorised capacity or for malign reasons, I simply doubt that they would be foolish enough to enter their true intentions into an access log. They would lie, of course, rendering the log pointless, so I struggle to see—we had this debate on the DPDI Bill—how this logging system would help the police to identify unauthorised access to sensitive data. It would simply eat up hours of valuable police time. I remember from our time working on the DPDI Bill that the police supported this view.

As for Amendment 124, which allows for greater collaboration between the police and the CPS when deciding charging decisions, there is certainly something to be said for this principle. If being able to share more detailed information would help the police and the CPS come to the best decision for victims, society and justice, then I absolutely support it.

Amendments 126, 128 and 129 seek to keep the UK in close alignment with the EU regarding data sharing. EU alignment or non-alignment is surely a decision for the Government of the day alone. We should not look to bind a future Administration to the EU.

I understand that Amendment 127 looks to allow data transfers to competent authorities—that is, law enforcement bodies in other countries—that may have a legitimate operating need. Is this not already the case? Are there existing provisions in the Bill to facilitate such transfers and, if so, does this not therefore duplicate them? I would very much welcome the thoughts of both the Minister and the noble Lord, Lord Clement-Jones, when he sums up at the end.

Amendment 156A would add to the definition of “unauthorised access” so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for the access, and the person is not empowered to access it by an enactment. Given the amount of valuable personal data held by controllers as our lives continue to move online, there is real merit to this idea from my noble friend Lord Holmes, and I look forward to hearing the views of the Minister.

Finally, I feel Amendment 210 from my noble friend Lady Owen—ably supported in her unfortunate absence by the noble Baroness, Lady Kidron—is an excellent amendment as it prevents a person convicted of a sexual offence from retaining the images that breached the law. This will prevent them from continuing to use the images for their own ends and from sharing them further. It would help the victims of these crimes regain control of these images which, I hope, would be of great value to those affected. I hope that the Minister will give this serious consideration, particularly in light of noble Lords’ very positive response to my noble friend’s Private Member’s Bill at the end of last week.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

I think the noble Viscount, Lord Camrose, referred to Amendment 156A from the noble Lord, Lord Holmes—I think he will find that is in a future group. I saw the Minister looking askance because I doubt whether she has a note on it at this stage.

Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones; let me consider it a marker for future discussion.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I thank the noble Lord, Lord Clement-Jones, for coming to my rescue there.

I turn to the Clause 81 stand part notice tabled by the noble Lord, Lord Clement-Jones, which would remove Clause 81 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record their processing activities, including their reasons for accessing and disclosing personal information. Entering a justification manually was intended to help detect unauthorised access. The noble Lord was right that the police do sometimes abuse their power; however, I agree with the noble Viscount, Lord Camrose, that the reality is that anyone accessing the system unlawfully is highly unlikely to record that, making this an ineffective safeguard.

Meanwhile, the position of the National Police Chiefs’ Council is that this change will not impede any investigation concerning the unlawful processing of personal data. Clause 81 does not remove the strong safeguards that ensure accountability for data use by law enforcement, including the requirement to record the time, the date and, where possible, who has accessed the data, which are far more effective in monitoring potential data misuse. We would argue that the requirement to manually record a justification every time case information is accessed places a considerable burden on policing. As the noble Lord himself said, we estimate that this clause may save approximately 1.5 million policing hours, equivalent to a saving in the region of £42.8 million a year.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

There were some raised eyebrows.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Yes, we could not see the noble Lord’s raised eyebrows.

Turning to Amendment 124, I thank the noble Baroness, Lady Morgan, for raising this important issue. While I obviously understand and welcome the intent, I do not think that the legislative change is what is required here. The Information Commissioner’s Office agrees that the Data Protection Act is not a barrier to the sharing of personal data between the police and the CPS. What is needed is a change in the operational processes in place between the police and the CPS that are causing this redaction burden that the noble Baroness spelled out so coherently.

We are very much aware that this is an issue and, as I think the noble Baroness knows, the Government are committed to reducing the burden on the police and the Home Office and to exploring with partners across the criminal justice system how this can best be achieved. We absolutely understand the point that the noble Baroness has raised, but I hope that she will agree to give space to the Home Office and the CPS to try to find a resolution so that we do not impose the burden of redaction where it is not necessary. This is an ongoing discussion, as I know the noble Baroness appreciates, and I hope that she will not pursue the amendment on that basis.

I will address Amendments 126 to 129 together. These amendments seek to remove parts of Schedule 8 to avoid divergence from EU legislation. The noble Lord, Lord Clement-Jones, proposes instead to remove existing parts of Section 73 of the Data Protection Act 2018. New Section 73(4)(aa), introduced by this Bill, with its bespoke path for personal data transfers from UK controllers to international processors, is crucial. In the modern age, where the use of such capabilities and the benefits they provide is increasing, we need to ensure that law enforcement can make effective use of them to tackle crime and keep citizens safe.

--- Later in debate ---
Considering all the explanations I have given, I hope that noble Lords will withdraw or not press their amendments.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I thank the Minister for her response on this group, which was, again, very detailed. There is a lot to consider in what she had to say, particularly about the clauses beyond Clause 81. I am rather surprised that the current Government are still going down the same track on Clause 81. It is as if, because the risk of abuse is so high, this Government, like the previous one, have decided that it is not necessary to have the safeguard of putting down the justification in the first place. Yet we have heard about the Sarah Everard police officers. It seems to me perverse not to require justification. I will read further what the Minister had to say but it seems quite extraordinary to be taking away a safeguard at this time, especially when the Minister says that, at the same time, they need to produce logs of the time of the data being shared and so on. I cannot see what is to be gained—I certainly cannot see £42 million being saved. It is a very precise figure: £42.8 million. I wonder where the £800,000 comes from. It seems almost too precise to be credible.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I emphasise that we believe the safeguards are there. This is not a watering down of provisions. We are just making sure that the safeguards are more appropriate for the sort of abuse that we think might happen in future from police misusing their records. I do not want it left on the record that we do not think that is important.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

No. As I was saying, it seems that the Minister is saying that there will still be the necessity to log the fact that data has been shared. However, it seems extraordinary that, at the same time, it is not possible to say what the justification is. The justification could be all kinds of things, but it makes somebody think before they simply share the data. It seems to me that, given the clear evidence of abuse of data by police officers—data of the deceased, for heaven’s sake—we need to keep all the safeguards we currently have. That is a clear bone of contention.

I will read what else the Minister had to say about the other clauses in the group, which are rather more sensitive from the point of view of national security, data sharing abroad and so on.

Clause 81 agreed.
--- Later in debate ---
Moved by
134: Clause 90, page 113, leave out lines 1 to 5 and insert—
“(a) to monitor the application of the GDPR, the applied GDPR and this Act, and ensure they are fully enforced with all due diligence;
(b) to act upon receiving a complaint, to investigate, to the extent appropriate, the subject matter of the complaint, and to take steps to clarify unsubstantiated issues before dismissing the complaint.”
Member’s explanatory statement
This amendment removes the secondary objectives introduced by the Data (Use and Access) Bill, which frame innovation, competition, crime prevention and national security as objectives competing with the enforcement of data protection law.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, in moving Amendment 134—it is the lead amendment in this group—I shall speak to the others in my name and my Clause 92 stand part notice. Many of the amendments in this group stem from concerns that the new structure for the ICO will diminish its independence. The ICO is abolished in favour of the commission.

--- Later in debate ---
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I thank noble Lords for their consideration of the issues before us in this group. I begin with Amendment 134 from the noble Lord, Lord Clement-Jones. I can confirm that the primary duty of the commissioner will be to uphold the principal objective: securing an appropriate level of data protection, carrying out the crucial balancing test between the interests of data subjects, controllers and wider public interests, and promoting public trust and confidence in the use of personal data.

The other duties sit below this objective and do not compete with it—they do not come at the expense of upholding data protection standards. The commissioner will have to consider these duties in his work but will have discretion as to their application. Moreover, the new objectives inserted by the amendment concerning monitoring, enforcement and complaints are already covered by legislation.

I thank the noble Lord, Lord Lucas, for Amendment 135A. A similar provision featured in the DPDI Bill, but the Government have decided that a statement of strategic priorities for the ICO is not necessary in this Bill. The Government will of course continue to set out their priorities in relation to data protection and other related areas and discuss them with the Information Commissioner as appropriate.

Amendment 142 from the noble Viscount, Lord Camrose, would remove the ICO’s ability to serve notices by email. We would argue that email is a fast, accessible and inexpensive method for issuing notices. I can reassure noble Lords that the ICO can serve a notice via email only if it is sent to an email address published by the recipient or where the ICO has reasonable grounds to believe that the notice will come to the attention of the person, significantly reducing the risk that emails may be missed or sent to the wrong address.

Regarding the noble Viscount’s Amendment 143, the assumption that an email notice will be received within 48 hours is reasonable and consistent with the equivalent legislation governing other regulators, such as the CMA and Ofcom.

I thank the noble Lord, Lord Clement-Jones, for Amendment 144 concerning the ICO’s use of reprimands. The regulator does not commonly issue multiple reprimands to the same organisation, but it is important that the ICO, as an independent regulator, has discretion and flexibility in instances where there may be a legitimate need to issue multiple reprimands within a particular period, without arbitrary limits being placed on that.

Turning to Amendment 144A, the new requirements in Clause 101 will already lead to the publication of an annual report, which will include the regulator’s investigation and enforcement activity. Reporting will be categorised to ensure that where the detail of cases is not public, commercially sensitive investigations are not inadvertently shared. Splitting out reporting by country or locality would make it more difficult to protect sensitive data.

Turning to Amendment 145, with thanks to the noble Baroness, Lady Kidron, I agree with the importance of ensuring that the regulator can be held to account on this issue effectively. The new annual report in Clause 101 will cover all the ICO’s regulatory activity, including that taken to uphold the rights of children. Clause 90 also requires the ICO to publish a strategy and report on how it has complied with its new statutory duties. Both of these will cover the new duty relating to children’s awareness and rights, and this should include the ICO’s activity to support and uphold its important age-appropriate design code.

I thank the noble Lord, Lord Clement-Jones, for Amendments 163 to 192 to Schedule 14, which establishes the governance structure of the information commission. The approach at the core of the amendments, including the responsibilities conferred on the Secretary of State, follows standard corporate governance best practice and reflects the Government’s commitment to safeguarding the independence of the regulator. This includes requiring the Secretary of State to consult the chair of the information commission before making appointments of non-executive members.

Amendments 165 and 167A would require members of the commission to be appointed to oversee specific tasks and to be from prescribed fields of expertise. Due to the commission’s broad regulatory remit, the Government consider that it would not be appropriate or helpful for the legislation to set out specific areas that should receive prominence over others. The Government are confident that the Bill will ensure that the commission has the right expertise on its board. Our approach safeguards the integrity and independence of the regulator, draws clearly on established precedent and provides appropriate oversight of its activities.

Finally, Clauses 91 and 92 were designed to ensure that the ICO’s statutory codes are consistent in their development, informed by relevant expertise and take account of their impact on those likely to be affected by them. They also ensure that codes required by the Secretary of State have the same legal effect as pre-existing codes published under the Data Protection Act.

Considering the explanations I have offered, I hope that the noble Lords, Lord Clement-Jones and Lord Lucas, the noble Viscount, Lord Camrose, and the noble Baroness, Lady Kidron, will agree not to press their amendments.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I thank the Minister for that response. If I speak for four minutes, that will just about fill the gap, but I hope to speak for less than that.

The Minister’s response was very helpful, particularly the way in which she put the clarification of objectives. Of course, this is shared with other regulators, where this new growth duty needs to be set in the context of the key priorities of the regulator. My earlier amendment reflected a nervousness about adding innovation and growth duties to a regulator, which may be seen to unbalance the key objectives of the regulator in the first place, but I will read carefully what the Minister said. I welcome the fact that, unlike in the DPDI Bill, there is no requirement for a statement of strategic priorities. That is why I did not support Amendment 135A.

It is somewhat ironic that, in discussing a digital Bill, the noble Viscount, Lord Camrose, decided to go completely analogue, but that is life. Maybe that is what happens to you after four and a half hours of the Committee.

I do not think the Minister covered the ground on the reprimands front. I will read carefully what she said about the annual report and the need for the ICO—or the commission, as it will be—to report on its actions. I hope, just by putting down these kinds of amendments on reprimands, that the ICO will take notice. I have been in correspondence with the ICO myself, as have a number of organisations. There is some dissatisfaction, particularly with companies such as Clearview, where it is felt that the ICO has not taken adequate action on scraping and building databases from the internet. We will see whether the ICO becomes more proactive in that respect. I was reassured, however, by what the Minister said about NED qualifications and the general objective on the independence of the regulator.

There is much to chew on in what the Minister said. In the meantime, I beg leave to withdraw my amendment.

Amendment 134 withdrawn.

Public Authority Algorithmic and Automated Decision-Making Systems Bill [HL]

Lord Clement-Jones Excerpts
Moved by
Lord Clement-Jones Portrait Lord Clement-Jones
- View Speech - Hansard - -

That the Bill be now read a second time.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I declare my AI interests as set out in the register. I thank Big Brother Watch, the Public Law Project and the Ada Lovelace Institute, which, each in their own way, have provided the evidence and underpinned my resolve to ensure that we regulate the adoption of algorithmic and AI tools in the public sector. These tools are increasingly being used across the public sector to make and support many of the highest-impact decisions affecting individuals, families and communities, across healthcare, welfare, education, policing, immigration and many other sensitive areas of an individual’s life. I also thank the Public Bill Office, the Library and other members of staff for all their assistance in bringing this Bill forward and communicating its intent and contents, and I thank all noble Lords who have taken the trouble to come to take part in this debate this afternoon.

The speed and volume of decision-making that new technologies will deliver are unprecedented. They have the potential to offer significant benefits, including improved efficiency and cost-effectiveness in government operations, enhanced service delivery and resource allocation, better prediction and support for vulnerable people, and increased transparency in public engagement. However, the rapid adoption of AI in the public sector also presents significant risks and challenges: the potential for unfairness, discrimination and misuse through algorithmic bias; the need for human oversight; a lack of transparency and accountability in automated decision-making processes; and privacy and data protection concerns.

Incidents such as the 2020 A-level and GCSE grading fiasco—where an algorithm used to estimate grades for exams cancelled because of Covid-19 saw students, particularly those from lower-income areas, unfairly miss out on university places—have starkly illustrated the dangers of unchecked algorithmic systems in public administration. That episode led to widespread public outcry and a loss of trust in government use of technology.

Big Brother Watch’s investigations have revealed that councils across the UK are conducting mass profiling and citizen scoring of welfare and social care recipients. Its report, Poverty Panopticon: The Hidden Algorithms Shaping Britain’s Welfare State, uncovered alarming statistics. Some 540,000 benefits applicants are secretly assigned fraud risk scores by councils’ algorithms before accessing housing benefit or council tax support. Personal data from 1.6 million people living in social housing is processed by commercial algorithms to predict rent non-payers. Over 250,000 people’s data is processed by secretive automated tools to predict the likelihood of abuse, homelessness or unemployment.

Big Brother Watch criticises the nature of these algorithms, stating that most are secretive, unevidenced, incredibly invasive and likely discriminatory. It argues that these tools are being used without residents’ knowledge, effectively creating tools of automated suspicion. The organisation rightly expressed deep concern that these risk-scoring algorithms could be disadvantaging and discriminating against Britain’s poor. It warns of potential violations of privacy and equality rights, drawing parallels to controversial systems like the Metropolitan Police’s gangs matrix database, which was found to be operating unlawfully. From a series of freedom of information requests last June, Big Brother Watch found that a flawed DWP algorithm wrongly flagged 200,000 housing benefit claimants for possible fraud and error, which meant that thousands of UK households every month had their housing benefit claims unnecessarily investigated.

In August 2020, the Home Office agreed to stop using an algorithm to help sort visa applications after it was discovered that the algorithm contained entrenched racism and bias, following a challenge from the Joint Council for the Welfare of Immigrants and the digital rights group Foxglove. The algorithm essentially created a three-tier system for immigration, with a speedy boarding lane for white people from the countries most favoured by the system. Privacy International has raised concerns about the Home Office’s use of a current tool called Identify and Prioritise Immigration Cases—IPIC—which uses personal data, including biometric and criminal records, to prioritise deportation cases, arguing that it lacks transparency and may encourage officials to accept recommended decisions without proper scrutiny.

Automated decision-making has been proven to lead to harms in privacy and equality contexts, such as in the Harm Assessment Risk Tool, which was used by Durham Police until 2021, and which predicted reoffending risks partly based on an individual’s postcode in order to inform charging decisions. All these cases illustrate how ADM can perpetuate discrimination. The Horizon saga illustrates how difficult it is to secure proper redress once the computer says no.

There is no doubt that our new Government are enthusiastic about the adoption of AI in the public sector. Both the DSIT Secretary of State and Feryal Clark, the AI Minister, are on the record about their enthusiasm for the adoption of AI in public services. They have ambitious plans to use AI and other technologies to transform public service delivery. Peter Kyle has said:

“We’re putting AI at the heart of the government’s agenda to boost growth and improve our public services”,


and

“bringing together digital, data and technology experts from across Government under one roof, my Department will drive forward the transformation of the state”.—[Official Report, Commons, 2/9/24; col. 89.]

Feryal Clark has emphasised the Administration’s desire to “completely transform digital Government” with DSIT. As the Government continue to adopt AI technologies, it is crucial to balance the potential benefits with the need for responsible and ethical implementation to ensure fairness, transparency and public trust.

The Ada Lovelace Institute warns of the unintended consequences of AI in the public sector, including the risk of entrenching existing practices instead of fostering innovation and systemic solutions. As it says, the safeguards around automated decision-making, which exist only in data protection law, are therefore more critical than ever in ensuring that people understand when a significant decision about them is being automated and why that decision is made, and that they have routes to challenge it or to ask for it to be decided by a human.

Our citizens need more, not less, protection. Yet rather than accepting the need for it, we see the Government following in the footsteps of their predecessor by watering down such rights as there are under GDPR Article 22 not to be subject to automated decision-making. We will, of course, be discussing these aspects of the Data (Use and Access) Bill in Committee next week.

ADM safeguards are critical to public trust in AI, but progress has been glacial. Take the Algorithmic Transparency Recording Standard, which was created in 2022 and is intended to offer a consistent framework for public bodies to publish details of the algorithms used in making these decisions. Six records were published at launch, and only three more seem to have been published since then. The previous Government announced earlier this year that implementation of the standard would be mandatory for departments. Minister Clark in the new Government has said,

“multiple records are expected to be published soon”,

but when will this be consistent across government departments? What teeth do the Central Digital and Data Office and the Responsible Technology Adoption Unit, now both within DSIT, have to ensure the adoption of the standard, especially in view of the planned watering down of the Article 22 GDPR safeguards? Where is the promised repository for ATRS records? And what about other public services, and local government too?

The Public Law Project, which maintains a register called Tracking Automated Government, believes that in October last year there were more than 55 examples of ADM systems in use across the public sector. Where is the transparency on those? The fact is that the Government’s Algorithmic Transparency Recording Standard, while a step in the right direction, remains voluntary and lacks comprehensive adoption, a compliance mechanism and any opportunity for redress. The current regulatory landscape is clearly inadequate to address these challenges. Despite the existing guidance and framework, there is no legally enforceable obligation on public authorities to be transparent about their use of ADM and algorithmic systems, or to rigorously assess their impact.

To address these challenges, several measures are needed. We need to see the creation of and adherence to ethical guidelines and accountability mechanisms for AI implementation; a clear regulatory framework and standards for use in the public sector; increased transparency and explainability of the adoption and use of AI systems; investment in AI education; and workforce development for public sector employees. We also need to see the right of redress, with a strengthened right for the individuals to challenge automated decisions.

My Bill aims to establish a clear mandatory framework for the responsible use of algorithmic and automated decision-making systems in the public sector. It will help to prevent the embedding of bias and discrimination in administrative decision-making, protect individual rights and foster public trust in government use of new technologies.

I will not adumbrate all the elements of the Bill. In an era when AI and algorithmic systems are becoming increasingly central to government ambitions for greater productivity and public service delivery, this Bill, I hope noble Lords agree, is crucial to ensuring that the benefits of these technologies are realised while safeguarding democratic values and individual rights. By ensuring that ADM systems are used responsibly and ethically, the Bill facilitates their role in improving public service delivery, making government operations more efficient and responsive.

The Bill is not merely a response to past failures but a proactive measure to guide the future use of technology within government and empower our citizens in the face of these powerful new technologies. I hope that the House and the Government will agree that this is the way forward. I beg to move.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - -

My Lords, I thank the Minister for her response and all noble Lords who have taken part in this debate, which I thought was perfectly formed and very expert. I was interested in the fact that the noble Baroness, Lady Lane-Fox, has a role in the digital centre for government and in what she had to say about what might be desirable going forward, particularly in the areas of skills and procurement. The noble Baroness, Lady Freeman, said much the same, which indicates something to me.

By the way, I think the Minister has given new meaning to the word “reservations”. That was the most tactful speech I have heard for a long time. It is a dangerous confidence if the Government really think that the ATRS, combined with the watered-down ADM provisions in the GDPR, is going to be enough. They are going to reap the whirlwind if they are not careful, with public trust being eroded. We have seen what has happened in the NHS: unless you are absolutely on the case, you will see 3.3 million people opt out of sharing their data, as they have there. This is something live; it erupts without due warning.

The examples I gave show a pretty dangerous use of ADM systems. Big Brother Watch has gone into some detail on the particular models that I illustrated. If the Government think that the ATRS is adequate, alongside their watered-down GDPR provisions, then, as I said, they are heading for considerable problems.

As the noble Lord, Lord Knight, can see, if the Government have reservations about my limited Bill, they will have even more reservations about anything broader.

I do not want to tread on the toes of the noble Lord, Lord Holmes, who I am sure will come back with another Bill at some stage, but I am very sympathetic to the need for algorithmic impact assessment, particularly in the workplace, as advocated by the Institute for the Future of Work. We may be inflicting more amendments on the Minister when the time comes in the ADM Bill.

This Bill is, as the noble Baroness, Lady Lane-Fox, mentioned, based on the Canadian experience. It is based on a Canadian directive that is now well under way and is perfectly practical.

The warning of the noble Lord, Lord Tarassenko, about the use of large language models—their unpredictability and their inability to produce the same result twice—was an object lesson in the need for proper understanding and training within the Civil Service in the future, and for the development of open source-type LLMs on the back of the existing large language models that are out there, to make sure that they are properly trained and tested as a sovereign capacity.

It is clear that I am not going to get a great deal further. I am worried that we are going to see a continuation, in the phrase used by my noble friend Lady Hamwee, of the culture of deference: the machine is going to continue saying no and our citizens will continue to be unable to challenge decisions in an effective way. That will lead to further trouble.

I thank the noble Viscount, Lord Camrose, for his in-principle support. If the Bill is to have a Committee stage, I look forward to debating some of the definitions. In the meantime, I commend the Bill to the House.

Bill read a second time and committed to a Committee of the Whole House.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I support the amendments from the noble Viscount, Lord Colville, which I have signed, and will put forward my Amendments 64, 68, 69, 130 and 132 and my Clause 85 stand part debate.

This part of the GDPR is a core component of how data protection law functions. It makes sure that organisations use personal data only for the reason that it was collected. One of the exceptional circumstances is scientific research. Focus on the definitions and uses of data in research increased in the wake of the Covid-19 pandemic, when some came to the view that legal uncertainty and related risk aversion were a barrier to clinical research.

There is a legitimate government desire to ensure that valuable research does not have to be discarded because of a lack of clarity around reuse or very narrow distinctions between the original and new purpose. The Government’s position seems to be that the Bill will only clarify the law, incorporating recitals to the original GDPR in the legislation. While this may be the policy intention, the Bill must be read in the context of recent developments in artificial intelligence and the practice of AI developers.

The Government need to provide reassurance that the intention and impact of the research provisions are not to enable the reuse of personal data, as the noble Viscount said, scraped from the internet or collected by tech companies under legitimate interest for training AI. Large tech companies could abuse the provisions to legitimise mass scraping of personal data from the internet, or the reuse of data collected under legitimate interest—for example, by a social media platform, about its users. Such data could be legally reused for training AI systems under the new provisions if developers can claim that this constitutes scientific research. That is why we very much support what the noble Viscount said.

In our view, the definition of scientific research adopted in the Bill is too broad and will permit abuse by commercial interests outside the policy intention. The Bill must recognise the reality that companies will likely position any AI development as “reasonably described as scientific”. Combined with the inclusion of commercial activities in the Bill, that opens the door to data reuse for any data-driven product development on the pretext that it represents scientific research, even where the relationship to real scientific progress is unclear or tenuous. That is not excluded in these provisions.

I turn to Amendments 64, 68, 69, 130 and 132 and the Clause 85 stand part debate. The definition of scientific research in proposed new paragraph 2 under Clause 67(1)(b) is drawn so broadly that most commercial development of digital products and services, particularly those involving machine learning, could ostensibly be claimed by controllers to be “reasonably described as scientific”. Amendment 64, taken together with those tabled by the noble Viscount that I have signed, would radically reduce the scope for misuse of data reuse provisions by ensuring that controllers cannot mix their commercial purposes with scientific research and that such research must be in the public interest and conducted in line with established academic practice for genuine scientific research, such as ethics approval.

Since the Data Protection Act was introduced in 2018, based on the 2016 GDPR, the education sector has seen an enormous expansion of state and commercial data collection, partly normalised in the pandemic, of increased volume, sensitivity, intrusiveness and risk. Children need particular care in view of the special environment of educational settings, where pupils and families are disempowered and have no choice over the products procured, which they are obliged to use for school administrative purposes, for learning in the classroom, for homework and for digital behavioural monitoring.

Broadening the definition of research activities conducted within the state education sector raises questions about the appropriateness of applying the same rules where children are in a compulsory environment, without agency or any routine practice of research ethics oversight, particularly if the definition is expanded to cover commercial activity.

Parental and family personal data is often inextricably linked to the data of a child in education, such as home address, heritable health conditions or young carer status. The Responsible Technology Adoption Unit within DSIT commissioned research with the Department for Education to understand how parents and pupils feel about the use of AI tools in education and found that, while parents and pupils did not expect to make specific decisions about AI optimisation, they did expect to be consulted on whether and by whom pupil work and data can be used. There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement.

--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

Many thanks to the noble Lords who have spoken in this debate and to the noble Lord, Lord Freyberg, for his Amendment 60. Before I start, let me endorse and add my name to the request for something of a briefing about the AI Bill. I am concerned that we will put a lot of weight of expectation on that Bill. When it comes, if I understand this right, it will focus on the very largest AI labs and may not necessarily get to all the risks that we are talking about here.

Amendment 60 seeks to ensure that the Bill does not allow privately funded or commercial activities to be considered scientific research in order

“to avert the possibility that such ventures might benefit from exemptions in copyright law relating to data mining”.

This is a sensible, proportionate measure to achieve an important end, but I have some concerns about the underlying assumption, as it strikes me. There is a filtering criterion of whether or not the research is taxpayer funded; that feels like a slightly crude means of predicting the propensity to infringe copyright. I do not know where to take that so I shall leave it there for the moment.

Amendment 61 in my name would ensure that data companies cannot justify data scraping for AI training as scientific research. As many of us said in our debate on the previous group, as well as in our debate on this group, the definition of “scientific research” in the Bill is extremely broad. I very much take on board the Minister’s helpful response on that but, I must say, I continue to have some concerns about the breadth of the definition. The development of AI programs, funded privately and as part of a commercial enterprise, could be considered scientific, so I believe that this definition is far too broad, given that Article 8A(3), to be inserted by Clause 71(5), states:

“Processing of personal data for a new purpose is to be treated as processing in a manner compatible with the original purpose where … the processing is carried out … for the purposes of scientific research”.


Tightening up the definition of “scientific research” to exclude activities that are primarily commercial would prevent companies from creating a scientific pretence for research that is wholly driven by commercial gain rather than furthering our collective knowledge. I would argue that, if we wish to allow these companies to build and train AI—we must, or others will—we must put in proper safeguards for people’s data. Data subjects should have the right to consent to their data being used in such a manner.

Amendment 65A in the name of my noble friend Lord Holmes would also take steps to remedy this concern. I believe that this amendment would work well in tandem with Amendment 61. It makes it absolutely clear that we expect AI developers to obtain consent from data subjects before they use or reuse their data for training purposes. For now, though, I shall not press my amendment.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I share the confusion of the noble Baroness, Lady Kidron, about the groupings. If we are not careful, we are going to keep returning to this issue again and again over four or five groups.

With the possible exception of the noble Lord, Lord Lucas, I think that we are all very much on the same page here. As for the suggestion from the noble Viscount, Lord Colville, that we meet to discuss the precise issue of the definition of “scientific research”, that would be extremely helpful; the noble Baroness and I do not need to repeat the concerns.

I should declare an interest in two respects: first, my interests as regards AI, which are set out on the register; and, secondly—I very much took account of what the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, had to say—I chair the council of a university that has a strong health faculty. It does a great deal of health research and a lot of that research relies on NHS datasets.

This is not some sort of Luddism we are displaying here. This is caution about the expansion of the definition of scientific research, so that it does not turn into something else: that it does not deprive copyright holders of compensation, and that it does not allow personal data to be scraped off the internet without consent. There are very legitimate issues being addressed here, despite the fact that many of us believe that this valuable data should of course be used for the public benefit.



One of the key themes—this is perhaps where we come back on to the same page as the noble Lord, Lord Lucas—may be public benefit, which we need to reintroduce so that we really understand that scientific research for public benefit is the purpose we want this data used for.

I do not think I need to say much more: this issue is already permeating our discussions. It is interesting that we did not get on to it in a major way during the DPDI Bill, yet this time we have focused much more heavily on it. Clearly, in opposition, the noble Viscount has seen the light. What is not to like about that? Further discussion, not least of the amendment of the noble Baroness, Lady Kidron, further down the track will be extremely useful.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, I feel we are getting slightly repetitive, but before I, too, repeat myself, I should like to say something that I did not get the chance to say to the noble Viscount, Lord Colville, the noble Baroness, Lady Kidron, and others: I will write, we will meet—all the things that you have asked for, you can take it for granted that they will happen, because we want to get this right.

I say briefly to the noble Baroness: we are in danger of thinking that the only good research is health research. If you go to any university up and down the country, you find that the most fantastic research is taking place in the most obscure subjects, be it physics, mechanical engineering, fabrics or, as I mentioned earlier, quantum. A lot of great research is going on. We are in danger of thinking that life sciences are the only thing that we do well. We need to open our minds a bit to create the space for those original thinkers in other sectors.

--- Later in debate ---
In response to the noble Lord, Lord Holmes, and to other points on AI legislation, as per the King’s Speech, the Government are seeking to establish the appropriate legislation to place requirements on those working to develop the most powerful artificial intelligence models. The next steps on that will be announced in the usual way—so maybe not this side of Santa, as I was asked.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

Can the Minister say whether this will be a Bill, a draft Bill or a consultation?

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

We will announce this in the usual way—in due course. I refer the noble Lord to the King’s Speech on that issue. I feel that noble Lords want more information, but they will just have to go with what I am able to say at the moment.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

Perhaps another aspect the Minister could speak to is whether this will be coming very shortly, shortly or imminently.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

Let me put it this way: other things may be coming before it. I think I promised at the last debate that we would have something on copyright in the very, very, very near future. This may not be as very, very, very near future as that. We will tie ourselves in knots if we carry on pursuing this discussion.

On that basis, I hope that this provides noble Lords with sufficient reassurance not to press their amendments.

--- Later in debate ---
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - - - Excerpts

My Lords, it seems very strange indeed that Amendment 66 is in a different group from group 1, which we have already discussed. Of course, I support Amendment 66 from the noble Viscount, Lord Camrose, but in response to my suggestion for a similar ethical threshold, the Minister said she was concerned that scientific research would find this too bureaucratic a hurdle. She and many of us here sat through debates on the Online Safety Bill, now an Act. I was also on the Communications Committee when it looked at digital regulation and came forward with one of the original reports on this. What drove us to worry about this was the lack of ethics within the tech companies and social media. Why on earth would we want to unleash some of the most powerful companies in the world on reusing people’s data for scientific purposes if we were not going to have an ethical threshold involved in such an Act? It is important that we consider that extremely seriously.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I welcome the noble Viscount to the sceptics’ club, because he has clearly had a Damascene conversion. It may be that this amendment goes too far. I am slightly concerned, like him, about the bureaucracy involved, which slightly gives the game away. It could be seen as a way of legitimising commercial research, whereas we want to make it absolutely certain that such research is for the public benefit, rather than imposing an ethics board on every single aspect of research that has any commercial content.

We keep coming back to this, but we seem to be degrouping all over the place. Even the Government Whips Office seems to have given up trying to give titles for each of the groups; they are just called “degrouped” nowadays, which I think is a sign of deep depression in that office. It does not tell us anything about what the different groups contain, for some reason. Anyway, it is good to see the noble Viscount, Lord Camrose, kicking the tyres on the definition of the research aspect.

Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

I am not quite sure about the groupings, either, but let us go with what we have. I thank noble Lords who have spoken, and the noble Viscount, Lord Camrose, for his amendments. I hope I am able to provide some reassurance for him on the points he raised.

As I said when considering the previous group, the Bill does not expand the definition of scientific research. The reasonableness test, along with clarifying the requirement for researchers to have a lawful basis, will significantly reduce the misuse of the existing definition. The amendment seeks to reduce the potential for misuse of the definition of scientific research by commercial companies using AI by requiring researchers working for a commercial company to submit their research to an ethics committee. As I said on the previous group, making this a mandatory requirement for all research may impede studies in areas that might have their own bespoke ethical procedures. This may well be the case in a whole range of different research areas, particularly in the university sector and in sectors more widely. Some of this research may be very small to begin with but might grow in size. The idea that a small piece of start-up research has to receive ethics clearance at an early stage is expecting too much and will put off a lot of the new innovations that might otherwise come forward.

Amendment 80 relates to Clause 71 and the reuse of personal data. It would put at risk valuable research that relies on data originally generated from diverse contexts, since the new and original purposes may not always be compatible.

Turning to Amendment 67, I can reassure noble Lords that the concept of broad consent is not new. Clause 68 reproduces the text from the current UK GDPR recitals because the precise purposes of scientific research may become clear only during later analysis of the data. Obtaining broad consent for an area of research from the outset allows scientists to focus on potentially life-saving research. Clause 68 has important limitations. It cannot be used if the researcher already knows the specific purpose—an important safeguard that should not be removed. It also includes a requirement to give the data subject the choice to consent to only part of the research processing, if possible. Most importantly, the data subject can revoke their consent at any point. I hope this reassures the noble Viscount, Lord Camrose, and that he feels content to withdraw his amendment on this basis.

--- Later in debate ---
Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- Hansard - - - Excerpts

My Lords, I rise to move the amendment standing in my name and to speak to my other amendments in this group. I am grateful to the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, for signing a number of those amendments, and I am also very grateful to Foxglove Legal and the other bodies that have briefed me in preparation for this debate.

My amendments are in a separate group, and I make no apology for that because although some of these points have indeed been covered in other amendments, my focus is entirely on NHS patient data, partly because it is the subject of a wider debate going on elsewhere about whether value can be obtained for it to help finance the National Health Service and our health in future years. This changes the nature of the relationship between research and the data it is using, and I think it is important that we focus hard on this and get some of the points that have already been made into a form where we can get reasonable answers to the questions that it leaves.

If my amendments are accepted or agreed—a faint hope—they would make it clear beyond peradventure that the consent protections in the Bill apply to the processing of data for scientific research, that a consistent definition of consent is applied and that that consistent definition is the one with which researchers and the public are already familiar and can trust going forward.

The Minister said at the end of Second Reading, in response to concerns I and others raised about research data in general and NHS data in particular, that the provisions in this Bill

“do not alter the legal obligations that apply in relation to decisions about whether to share data”.—[Official Report, 19/11/24; col. 196.]

I accept that that may be the intention, and I have discussed this with officials, who make the same point very strongly. However, Clause 68 introduces a novel and, I suggest, significantly watered-down definition of consent in the case of scientific research. Clause 71 deploys this watered-down definition of consent to winnow down the “purpose limitation” where the processing is for the purposes of scientific research in the public interest. Taken together, this means that there has been a change in the legal obligations that apply to the need to obtain consent before data is shared.

Clause 68 amends the pivotal definition of consent in Article 4(11). Instead of consent requiring something express—freely given, specific, informed, and unambiguous through clear affirmative action—consent can now be imputed. A data subject’s consent is deemed to meet these strict requirements even when it does not, as long as the consent is given to the processing of personal data for the purposes of an area of scientific research; at the time the consent is sought, it is not possible to identify fully the purposes for which the personal data is to be processed; seeking consent in relation to the area of scientific research is consistent with generally recognised ethical standards relevant to the area of research; and, so far as the intended purposes of the processing allow, the data subject is given the opportunity to consent to processing for only part of the research. These all sound very laudable, but I believe they cut down the very strict existing standards of consent.

Proposed new paragraph 7, in Clause 68, then extends the application of this definition across the regulation:

“References in this Regulation to consent given for a specific purpose (however expressed) include consent described in paragraph 6.”


Thus, wherever you read “consent” in the regulation you can also have imputed consent as set out in proposed new paragraph 6 of Article 4. This means that “consent” within the meaning of Article 6(1)(a)—that is, the basis for lawful processing—can be imputed consent in the new way introduced by the Bill, so there is a new type of lawful basis for processing.

The Minister is entitled to disagree, of course; I expect him to say that when he comes to respond. I hope that, when he does, he will agree that we share a concern on the importance of giving researchers a clear framework, as it is this uncertainty about the legal framework that could inadvertently act as a barrier to the good research we all need. So my first argument today is that, as drafted, the Bill leaves too much room for different interpretations, which will lead to exactly the kind of uncertainty that the Minister—indeed, all of us—wish to avoid.

As we have heard already, as well as the risk of uncertainty among researchers, there is also the risk of distrust among the general public. The public rightly want and expect to have a say in what uses their data is put to. Past efforts to modernise how the NHS uses data, such as care.data, have been expensive failures, in part because they have failed to win the public’s trust. More than 3.3 million people have already opted out of NHS data sharing under the national data opt-out; that is nearly 8% of the adults who could have taken part. We have talked about the value of our data and about being the gold standard or gold attractor for researchers but, if we do not have all the people who could contribute, we are definitely devaluing and debasing that research. Although we want to respect people’s choice as to whether to participate, of course, this enormous vote against research reflects a pretty spectacular failure to win public trust—one that undermines the value and quality of the data, as I said.

So my second point is that watering down the rights of those whose data is held by the NHS will not put that data for research purposes on a sustainable, long-term footing. Surely, we want a different outcome this time. We cannot afford more opt-outs; we want people opting back in. I argue that this requires a different approach—one that wins the public’s trust and gains public consent. The Secretary of State for Health is correct to say that most of the public want to see the better use of health data to help the NHS and to improve the health of the nation. I agree, but he must accept that the figures show that the general public also have concerns about privacy and about private companies exploiting their data without them having a say in the matter. The way forward must be to build trust by genuinely addressing those concerns. There must not be even a whiff of watering down legal protections, so that those concerns can instead be turned into support.

This is also important because NHS healthcare includes some of the most intimate personal data. It cannot make sense for that data to have a lower standard of consent protection going forward if it is being used for research. Having a different definition of consent and a lower standard of consent will inevitably lead to confusion, uncertainty and mistrust. Taken together, these amendments seek to avoid uncertainty and distrust, as well as the risk of backlash, by making it abundantly clear that Article 4 GDPR consent protections apply despite the new wording introduced by this Bill. Further, these are the same protections that apply to other uses of data; they are identical to the protections already understood by researchers and by the public.

I turn now to a couple of the amendments in this group. Amendment 71 seeks to address the question of consent, but in a rather narrow way. I have argued that Clause 68 introduces a novel and significantly watered-down definition of consent in the case of scientific research; proposed new paragraph 7 deploys this watered-down definition to winnow down the purpose limitation. There are broader questions about the wisdom of this, which Amendments 70, 79 and 81 seek to address, but Amendment 71 focuses on the important case of NHS health data.

If the public are worried that their health data might be shared with private companies without their consent, we need an answer to that. We see from the large number of opt-outs that there is already a problem; we have also seen it recently in NHS England’s research on public attitudes to health data. This amendment would ensure that the Bill does not increase uncertainty or fuel patient distrust of plans for NHS data. It would help to build the trust that data-enabled transformation of the NHS requires.

The Government may well retort that they are not planning to share NHS patient data with commercial bodies without patient consent. That is fine, but it would be helpful if, when he comes to respond, the Minister could say that clearly and unambiguously at the Dispatch Box. However, I put it to him that, if he could accept these amendments, the law would in fact reflect that assurance and ensure that any future Government would need to come back to Parliament if they wanted to take a different approach.

It is becoming obvious that whether research is in the public interest will be the key issue that we need to resolve in this Bill, and Amendment 72 provides a proposal. The Bill makes welcome references to health research being in the public interest, but it does not explain how on earth we decide or how that requirement would actually bite. Who makes the assessment? Do we trust a rogue operator to make its own assessment of whether its research is in the public interest? What would be examples of the kind of research that the Government expect this requirement to prevent? I look forward to hearing the answer to that, but perhaps it would be more helpful if the Minister responded in a letter. In the interim, this amendment seeks to introduce some procedural clarity about how research will be certified as being in the public interest. This would provide clarity and reassurance, and I commend it to the Minister.

Finally, Amendment 131 seeks to improve the appropriate safeguards that would apply to processing for research, archiving and scientific purposes, including a requirement that the data subject has given consent. This has already been touched on in another amendment, but it is a way of seeking to address the issues that Amendments 70, 79 and 81 are also trying to address. Perhaps the Government will continue to insist that this is addressing a non-existent problem because nothing in Clauses 69 or 71 waters down the consent or purpose limitation protections and therefore the safeguards themselves add nothing. However, as I have said, informed readers of the Bill are interpreting it differently, so spelling out this safeguard would add clarity and avoid uncertainty. Surely such clarity on such an important matter is worth a couple of lines of additional length in a 250-page Bill. If the Government are going to argue that our Amendment 131 adds something objectionable, let them explain what is objectionable about consent protections applying to data processing for these purposes. I beg to move.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I support Amendments 70 to 72, which I signed, in the name of the noble Lord, Lord Stevenson of Balmacara. I absolutely share his view about the impact of Clause 68 on the definition of consent and the potential and actual mistrust among the public about sharing of their data, particularly in the health service. It is highly significant that 3.3 million people have opted out of sharing their patient data.

I also very much share the noble Lord’s views about the need for a public interest requirement. In a sense, this takes us back to the discussion that we had on previous groups about whether we should add that in a broader sense—not purely for health data but for scientific research more broadly, as he specifies. I very much support what he had to say.

Broadly speaking, the common factor between my clause stand part notice and what he said is health data. Data subjects cannot make use of their data rights if they do not even know that their data is being processed. Clause 77 allows a controller reusing data under the auspices of scientific research not to notify a data subject in accordance with Articles 13 and 14 if doing so

“is impossible or would involve a disproportionate effort”.

We on these Benches believe that Clause 77 should be removed from the Bill. The safeguards are easily circumvented. The newly articulated compatibility test in new Article 8A, inserted by Clause 71, which specifies how related the new and existing purposes for data use need to be to permit reuse, is essentially automatically passed if the processing is conducted

“for the purposes of scientific research or historical research”.

This makes it even more necessary for the definition of scientific research to be tightened to prevent abuse.

Currently, data controllers must provide individuals with information about the collection and use of their personal data. These transparency obligations generally do not require the controller to contact each data subject. Such obligations can usually be satisfied by providing privacy information using different techniques that can reach large numbers of individuals, such as relevant websites, social media, local newspapers and so on.

--- Later in debate ---
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I rise briefly to support the amendments in the name of the noble Lord, Lord Stevenson of Balmacara. I must say that the noble Lord, Lord Clement-Jones, made a very persuasive speech; I shall be rereading it and thinking about it more carefully.

In many ways, purpose limitation is the jewel in the crown of GDPR. It does what it says on the tin: data should be used for the original purpose, and if the purpose is then extended, we should go back to the person and ask whether it can be used again. While I agree with and associate myself with the technical arguments made by the noble Lord, Lord Stevenson, that is the fundamental point.

The issue here is, what are the Government trying to do? What are we clearing a pathway for? In a later group, we will speak to a proposal to create a UK data sovereign fund to make sure that the value of UK publicly held data is realised. The value is not simply economic or financial, but societal. There are ways of arranging all this that would satisfy everyone.

I have been sitting here wondering whether to say it, but here I go: I am one of the 3.3 million.

So is the noble Lord, Lord Clement-Jones. I withdrew my consent because I did not trust the system. I think that what both noble Lords have said about trust could be spread across the Bill as a whole.

We want to use our data well. We want it to benefit our public services. We want it to benefit UK plc and we want to make the world a better place, but not at the cost of individual data subjects and not at too great a cost. I add my voice to that. On the whole, I prefer systems that offer protections by design and default, as consent is a somewhat difficult concept. But, in as much as consent is a fundamental part of the current regulatory system and nothing in the Bill gets rid of it wholesale for some better system, it must be applied meaningfully. Amendments 79, 81 and 131 make clear what we mean by the term, ensure that the definition is consistent and clarify that it is not the intention of the Government to lessen the opportunity for meaningful consent. I, too, ask the Minister to confirm that it is not the Government’s intention to downgrade the concept of meaningful consent in the way that the noble Lord, Lord Stevenson, has set out.

--- Later in debate ---
Moved by
73: Clause 70, page 77, leave out lines 34 to 38
Member's explanatory statement
This amendment and another amendment in Lord Clement-Jones’s name to Clause 70 omit paragraphs 70(2)(b)-(c), (4), (5) and (6), which make amendments to UK GDPR to define certain data processing activities as “recognised legitimate interests”.
--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones
- Hansard - -

My Lords, I start with an apology, because almost every amendment in this group is one of mine and I am afraid I have quite a long speech to make about the different amendments, which include Amendments 73, 75, 76, 77, 78, 78A, 83, 84, 85, 86, 89 and 90, and stand part debates on Schedules 4, 5 and 7 and Clause 74. But I know that the Members of this Committee are made of strong stuff.

Clause 70 and Schedule 4 introduce a new ground of recognised legitimate interest, which in essence counts as a lawful basis for processing if it meets any of the descriptions in the new Annexe 1 to the UK GDPR, which is at Schedule 4 to the Bill—for example, processing necessary for the purposes of responding to an emergency or detecting crime. These have been taken from the previous Government’s Data Protection and Digital Information Bill. This is supposed to reduce the burden on data controllers and the cost of legal advice when they have to assess whether it is acceptable to use or share data. Crucially, while the new ground shares its name with “legitimate interest”, it does not require the controller to carry out any balancing test taking the data subject’s interests into account. It just needs to meet the grounds in the list. The Bill gives the Secretary of State powers to define additional recognised legitimate interests beyond those in Annexe 1—a power heavily criticised by the Delegated Powers and Regulatory Reform Committee’s report on the Bill.

Currently where a private body shares personal data with a public body in reliance on Article 6(1)(e) of the GDPR, it can rely on the condition that the processing is

“necessary for the performance of a task carried out in the public interest”.

New conditions in Annexe 1, as inserted by Schedule 4, would enable data sharing between the private and public sectors to occur without any reference to a public interest test. In the list of recognised legitimate interests, the most important is the ability of any public body to ask another controller, usually in the private sector, for the disclosure of personal data it needs to deliver its functions. This applies to all public bodies. The new recognised legitimate interest legal basis in Clause 70 and Schedule 4 should be dropped.

Stephen Cragg KC, giving his legal opinion on the DPDI Bill, which, as I mentioned, has the same provision, stated that this list of recognised legitimate interests

“has been elevated to a position where the fundamental rights of data subjects (including children) can effectively be ignored where the processing of personal data is concerned”.

The ICO has also flagged concerns about recognised legitimate interests. In its technical drafting comments on the Bill, it said:

“We think it would be helpful if the explanatory notes could explicitly state that, in all the proposed new recognised legitimate interests, an assessment of necessity involves consideration of the proportionality of the processing activity”.


An assessment of proportionality is precisely what the balancing test is there to achieve. Recognised legitimate interests undermine the fundamental rights and interests of individuals, including children, in specific circumstances.

When companies are processing data without consent, it is essential that they do the work to balance the interests of the people who are affected by that processing against their own interests. Removing recognised legitimate interests from the Bill will not stop organisations from sharing data with the public sector or using data to advance national security, detect crime or safeguard children and vulnerable people. The existing legitimate interest lawful basis is more than flexible enough for these purposes. It just requires controllers to consider and respect people’s rights as they do so.

During the scrutiny of recognised legitimate interests in the DPDI Bill—I am afraid to have to mention this—the noble Baroness, Lady Jones of Whitchurch, who is now leading on this Bill as the Minister, raised concerns about the broad nature of the objectives. She rightly said:

“There is no strong reason for needing that extra power, so, to push back a little on the Minister, why, specifically, is it felt necessary? If it were a public safety interest, or one of the other examples he gave, it seems to me that that would come under the existing list of public interests”.—[Official Report, 25/3/24; col. GC 106.]


She never spoke a truer word.

However, this Government have reintroduced the same extra power with no new articulation of any strong reason for needing it. The constraints placed on the Secretary of State are slightly higher in this Bill than they were in the DPDI Bill, as new paragraph (9), inserted by Clause 70(4), means that they are able to add new recognised legitimate interests only if they consider the processing necessary to safeguard an objective listed in UK GDPR Article 23(1)(c) to (j). However, this list includes catch-alls, such as

“other important objectives of general public interest”.

To give an example of what this power would allow, the DPDI Bill included a recognised legitimate interest relating to the ability of political parties to use data about citizens during election campaigns on the basis that democratic participation is an objective of general public interest. I am glad to say that this is no longer included. Another example is that a future Secretary of State could designate workplace productivity as a recognised legitimate interest—which, without a balancing test, would open the floodgates to intrusive workplace surveillance and unsustainable data-driven work intensification. That does not seem to be in line with the Government’s objectives.

Amendment 74 is rather more limited. Alongside the BMA, we are unclear about the extent of the impact of Clause 70 on the processing of health data. It is noted that the recognised legitimate interest avenue appears to be available only to data controllers that are not public authorities. Therefore, NHS organisations appear to be excluded. We would welcome confirmation that health data held by an NHS data controller is excluded from the scope of Clause 70 now and in the future, regardless of the lawful basis that is being relied on to process health data.

--- Later in debate ---
Baroness Jones of Whitchurch Portrait Baroness Jones of Whitchurch (Lab)
- Hansard - - - Excerpts

My Lords, when the noble Lord, Lord Clement-Jones, opened his speech he said that he hoped that noble Lords would be made of strong stuff while he worked his way through it. I have a similar request regarding my response: please bear with me. I will address these amendments slightly out of order to ensure that related issues are grouped together.

The Schedule 4 stand part notice, and Amendments 73 and 75, tabled by the noble Lord, Lord Clement-Jones, and supported by the noble Baroness, Lady Kidron, would remove the new lawful ground of “recognised legitimate interests” created by Clause 70 and Schedule 4 to the Bill. The aim of these provisions is to give data controllers greater confidence about processing personal data for specified and limited public interest objectives. Processing that is necessary and proportionate to achieve one of these objectives can take place without a person’s consent and without undertaking the legitimate interests balancing test. However, controllers would still have to comply with the wider requirements of data protection legislation, where relevant, ensuring that the data is processed in compliance with the other data protection principles.

I say in response to the point raised by the noble Lord, Lord Cameron, that the new lawful ground of recognised legitimate interest will apply from the date of commencement and will not apply retrospectively.

The activities listed include processing of data where necessary to prevent crime, safeguarding national security, protecting children or responding to emergencies. They also include situations where a public body requests that a non-public body share personal data with it to help deliver a public task that is sanctioned by law. In these circumstances, it is very important that data is shared without delay, and removal of these provisions from the Bill, as proposed by the amendment, could make that harder.

Amendment 74, tabled by the noble Lord, Lord Scriven, would prevent health data being processed as part of this new lawful ground, but this could have some unwelcome effects. For example, the new lawful ground is designed to give controllers greater confidence about reporting safeguarding concerns, but if these concerns relate to a vulnerable person’s health, they would not be able to rely on the new lawful ground to process the data and would have to identify an alternative lawful ground.

On the point made by the noble Lord, Lord Clement-Jones, about which data controllers can rely on the new lawful ground, it would not be available to public bodies such as the NHS; it is aimed at non-public bodies.

I reassure noble Lords that there are still sufficient safeguards in the wider framework. Any processing that involves special category data, such as health data, would also need to comply with the conditions and safeguards in Article 9 of the UK GDPR and Schedule 1 to the Data Protection Act 2018.

Amendment 78A, tabled by the noble Lord, Lord Clement-Jones, would remove the new lawful ground for non-public bodies or individuals to disclose personal data at the request of public bodies, where necessary, to help those bodies deliver their public interest tasks without carrying out a legitimate interests balancing test. We would argue that, without it, controllers may lack certainty about the correct lawful ground to rely on when responding to such requests.

Amendment 76, also tabled by the noble Lord, Lord Clement-Jones, would remove the regulation-making powers in Clause 70 that allow the Secretary of State to keep the list of recognised legitimate interests up to date. Alternatively, the noble Lord’s Amendment 78 would require the Secretary of State to publish a statement every time he added a new processing activity to the list, setting out its purpose, which controllers it was aimed at and for how long they could use it. I reassure the noble Lord that the Government have already taken steps to tighten up these powers since the previous Bill was considered by this House.

Any new processing activities added would now also have to serve

“important objectives of … public interest”

as described in Article 23.1 of the UK GDPR and, as before, new activities could be added to the list only following consultation with the ICO and other interested parties. The Secretary of State would also have to consider the impact of any changes on people’s rights and have regard to the specific needs of children. Although these powers are likely to be used sparingly, the Government think it important that they be retained. I reassure the Committee that we will be responding to the report from the Delegated Powers Committee within the usual timeframes and we welcome its scrutiny of the Bill.

The noble Lord’s Amendment 77 seeks to make it clear that organisations should also be able to rely on Article 6.1(f) to make transfers between separate businesses affiliated by contract. The list of activities mentioned in Clause 70 is intended to be illustrative only and is drawn from the recitals to the UK GDPR. This avoids providing a very lengthy list that might be viewed as prescriptive. Article 6.1(f) of the UK GDPR is flexible. The transmission of personal data between businesses affiliated by contract may constitute a legitimate interest, like many other commercial interests. It is for the controller to determine this on a case-by-case basis.

I will now address the group of amendments tabled by the noble Lord, Lord Clement-Jones, concerning the purpose limitation principle, specifically Amendments 83 to 86. This principle limits the ways that personal data collected for one purpose can be used for another, but Clause 71 aims to provide more clarity and certainty around how it operates, including how certain exemptions apply.

Amendment 84 seeks to clarify whether the first exemption in proposed new Annexe 2 to the UK GDPR would allow personal data to be reused for commercial purposes. The conditions for using this exemption are that the requesting controller has a public task or official authority laid down in law that meets a public interest objective in Article 23.1 of the UK GDPR. As a result, I and the Government are satisfied that these situations would be for limited public interest objectives only, as set out in law.

Amendments 85 and 86 seek to introduce greater transparency around the use of safeguarding exemptions in paragraph 8 of new Annexe 2. These conditions are drawn from the Care Act 2014 and replicated in the existing condition for sensitive data processing for safeguarding purposes in the Data Protection Act 2018. I can reassure the Committee that processing cannot occur if it does not meet these conditions, including if the vulnerability of the individual no longer exists. In addition, requiring that an assessment be made and given to the data subject before the processing begins could result in safeguarding delays and would defeat the purpose of this exemption.

Amendment 83 would remove the regulation-making powers associated with this clause so that new exceptions could not be added in future. I remind noble Lords that there is already a power to create exemptions from the purpose limitation principle in the DPA 2018. This Bill simply moves the existing exemptions to a new annexe to the UK GDPR. The power is strictly limited to the public objectives listed in Article 23.1 of the UK GDPR.

I now turn to the noble Lord’s Amendment 89, which seeks to set conditions under which pseudonymised data should be treated as personal data. This is not necessary as pseudonymised data already falls within the definition of personal data under Article 4.1 of the UK GDPR. This amendment also seeks to ensure that a determination by the ICO that data is personal data applies

“at all points in that processing”.

However, the moment at which data is or becomes personal should be a determination of fact based on its identifiability to a living individual.

I turn now to Clause 74 stand part, together with Amendment 90. Noble Lords are aware that special categories of data require additional protection. Article 9 of the UK GDPR sets out an exhaustive list of what is sensitive data and outlines processing conditions. Currently, this list cannot be amended without primary legislation, which may not always be available. This leaves the Government unable to respond swiftly when new types of sensitive data are identified, including as a result of emerging technologies. The powers in Clause 74 enable the Government to respond more quickly and add new special categories of data, tailor the conditions applicable to their use and add new definitions if necessary.

Finally, I turn to the amendment tabled by the noble Lord, Lord Clement-Jones, that would remove Schedule 7 from the Bill. This schedule contains measures to create a clearer and more outcomes-focused UK international data transfers regime. As part of these reforms, this schedule includes a power for the Secretary of State to recognise new transfer mechanisms for protecting international personal data transfers. Without this, the UK would be unable to respond swiftly to emerging developments and global trends in personal data transfers. In addition, the ICO will be consulted on any new mechanisms, and they will be subject to debate in Parliament under the affirmative resolution procedure.

I hope this helps explain the Government’s intention with these clauses and that the noble Lord will feel able to withdraw his amendment.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I thank the Minister. She covered quite a lot of ground and all of us will have to read Hansard quite carefully. However, it is somewhat horrifying that, for a Bill of this size, we had about 30 seconds from the Minister on Schedule 7, which could have such a huge influence on our data adequacy when that is assessed next year. I do not think anybody has talked about international transfers at this point, least of all me in introducing these amendments. Even though it may appear that we are taking our time over this Bill, we are not fundamentally covering all its points. The importance of this Bill, which obviously escapes most Members of this House—there are just a few aficionados—is considerable and could have a far-reaching impact.

I still get Viscount Camrose vibes coming from the Minister.

None Portrait Noble Lords
- Hansard -

Oh!

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

Perhaps I should say that this kind of enthusiasm clearly conquers all. I should thank a former Minister, the noble Lord, Lord Kamall, and I thank the noble Baroness, Lady Kidron, for her thoughtful speech, particularly in questioning the whole recognised legitimate interest issue, especially in relation to vulnerable individuals.

It all seems to be a need for speed, whether it is the Secretary of State who has to make snappy decisions or a data controller. We are going to conquer uncertainty. We have to keep bustling along. In a way, to hell with individual data rights; needs must. I feel somewhat Canute-like, trying to hold back the tide of data that will flow across us. I feel quite uncomfortable with that. I think the DPRRC is likewise going to feel pretty cheesed off.

--- Later in debate ---
Moved by
82: Clause 71, page 81, line 14, at end insert—
“4A. Where the controller collected the personal data based on Article 6(1)(a) (data subject’s consent), processing for a new purpose is not compatible with the original purpose if—
(a) the data subject is a child,
(b) the processing is based on consent given or authorised by the holder of parental responsibility over the child,
(c) the data subject is an adult to whom either (a) or (b) applied at the time of the consent collection, or
(d) the data subject is a deceased child.”
Member’s explanatory statement
This amendment seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, I thought I had no speech; that would have been terrible. In moving my amendment, I thank the noble Baronesses, Lady Kidron and Lady Harding of Winscombe, and the noble Lord, Lord Russell of Liverpool, for their support. I shall speak also to Amendments 94, 135 and 196.

Additional safeguards are required for the protection of children’s data. This amendment

“seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A”.

The change to the purpose limitation in Clause 71 raises questions about the lifelong implications of the proposed change for children, given the expectation that they are less aware of the risks of data processing and may not have made their own preferences or choices known at the time of data collection.

For most children’s data processing, adults give permission on their behalf. The extension of this for additional purposes may be incompatible with what a data subject later wishes as an adult. The only protection they may have is purpose limitation, which ensures that they are asked to consent again or are informed of changes to processing. Data reuse and access must not mean abandoning the first principles of data protection. Purpose limitation rests on the essential principles of “specified” and “explicit” at the time of collection, which this change does away with.

There are some questions that I would like to put to the Minister. If further reuses, such as more research, are compatible, they are already permitted under current law. If further reuses are not permitted under current law, why should data subjects’ rights be undermined while they are children and, through this change, be incapable of being reclaimed at any time in the future? How does the new provision align with the principle of acting in the best interests of the child, as outlined in the UK GDPR, the UNCRC in Scotland and the Rights of Children and Young Persons (Wales) Measure 2011? What are the specific risks to children’s data privacy and security under the revised rules for purpose limitation that may have an unforeseeable lifelong effect? In summary, a blanket exclusion for children’s data processing conforms more with the status quo of data protection principles. Children should be asked again about data processing once they reach maturity and should not find that their data rights have been given away by their parents on their behalf.

Amendment 196 is more of a probing amendment. Ofcom has set out its approach to the categorisation of category 1 services under the Online Safety Act. Ofcom’s advice and research, submitted to the Secretary of State, outlines the criteria for determining whether a service falls into category 1. These services are characterised by having the highest reach and risk functionalities among user-to-user services. The categorisation is based on certain threshold conditions, which include user numbers and functionalities such as content recommender systems and the ability for users to forward or reshare content. Ofcom has recommended that category 1 services should meet either of two sets of conditions: having more than 34 million UK users with a content recommender system or having more than 7 million UK users with a content recommender system and the ability for users to forward or reshare user-generated content. The categorisation process is part of Ofcom’s phased approach to implementing codes and guidance for online safety, with additional obligations for category 1 services due to their potential as sources of harm.

The Secretary of State recently issued the Draft Statement of Strategic Priorities for Online Safety, under Section 172 of the Online Safety Act. It says:

“Large technology companies have a key role in helping the UK to achieve this potential, but any company afforded the privilege of access to the UK’s vibrant technology and skills ecosystem must also accept their responsibility to keep people safe on their platforms and foster a safer online world … The government appreciates that Ofcom has set out to government its approach to tackling small but risky services. The government would like to see Ofcom keep this approach under continual review and to keep abreast of new and emerging small but risky services, which are posing harm to users online.


As the online safety regulator, we expect Ofcom to continue focusing its efforts on safety improvements among services that pose the highest risk of harm to users, including small but risky services. All search services in scope of the Act have duties to minimise the presentation of search results which include or lead directly to illegal content or content that is harmful to children. This should lead to a significant reduction in these services being accessible via search results”.


During the parliamentary debates on the Online Safety Bill and in Joint Committee, there was significant concern about the categorisation of services, particularly about the emphasis on size over risk. Initially, the categorisation was based largely on user numbers and functionalities, which led to concerns that smaller platforms with high-risk content might not be adequately addressed. In the Commons, Labour’s Alex Davies-Jones MP, now a Minister in the Ministry of Justice, argued that focusing on size rather than risk could fail to address extreme harms present on smaller sites.

The debates also revealed a push for a more risk-based approach to categorisation. The then Government eventually accepted an amendment allowing the Secretary of State discretion in setting thresholds based on user numbers, functionalities or both. This change aimed to provide flexibility in addressing high-risk smaller platforms. However, concerns remain, despite the strategy statement and the amendment to the original Online Safety Bill, that smaller platforms with significant potential for harm might not be sufficiently covered under the category 1 designation. Overall, while the final approach allows some flexibility, there is considerable debate about whether Ofcom will place enough emphasis in its categorisation on the risks posed by smaller players. My colleagues on these Benches and in the Commons have emphasised to me that we should be rigorously addressing these issues. I beg to move.

Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I shall speak to all the amendments in this group, and I thank noble Lords who have added their names to Amendments 88 and 135 in my name.

Amendment 88 creates a duty for data controllers and processors to consider children’s needs and rights. Proposed new subsection (1) simply sets out children’s existing rights and acknowledges that children of different ages have different capacities and therefore may require different responses. Proposed new subsection (2) addresses the concern expressed during the passage of the Bill and its predecessor that children should be shielded from the reduction in privacy protections that adults will experience under the proposals. Proposed new subsection (3) simply confirms that a child is anyone under the age of 18.

This amendment leans on a bit of history. Section 123 of the Data Protection Act 2018 enshrined the age-appropriate design code into our data regime. The AADC’s journey from amendment to fully articulated code, since mirrored and copied around the world, has provided two useful lessons.

First, if the intent of Parliament is clear in the Bill, it is fixed. After Royal Assent to the Data Protection Act 2018, the tech lobby came calling to both the Government and the regulator arguing that the proposed age of adulthood in the AADC be reduced from 18 to 13, where it had been for more than two decades. Both the department and the regulator held up their hands and pointed at the text, which cited the UNCRC that defines a child as a person under 18. That age remains, not only in the UK but in all the other jurisdictions that have since copied the legislation.

In contrast, on several other issues both in the AADC and, more recently, in the Online Safety Act, the intentions of Parliament were not spelled out and have been reinterpreted. Happily, the promised coroner provisions are now enshrined in this Bill, but promises from the Dispatch Box about the scope and form of the coroner provisions were initially diluted and had to be refought for a second time by bereaved parents. Other examples, such as promises of a mixed economy, age-assurance requirements and a focus on contact harm and on features and functionalities as well as content, reflected Parliament’s intention but do not form part of the final regulatory standards, in large part because they were not sufficiently spelled out in the Bill. What is in the Bill really matters.

Secondly, our legislation over the past decade is guilty of solving the problems of yesterday. There is departmental resistance to having outcomes rather than processes enshrined in legislation. Overarching principles, such as a duty of care, or rights, such as children’s rights to privacy, are abandoned in favour of process measures, tools that even the tech companies admit are seldom used, and narrow definitions of what must and may not be taken down.

Tech is various, its contexts infinite and its rate of change giddy, and the skills of government and regulator are necessarily limited. At some point we are going to have to start saying what the outcome should be and what the principles are, not what the process is. My argument for this amendment is that we need to fix in the Bill our intention that children have an established set of needs according to their evolving capacity. Similarly, they have a right to a higher bar of privacy, so that both these principles become unavoidable.

--- Later in debate ---
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

I thank the Minister for her response. I should say at the outset that, although I may have led the group, it is clear that the noble Baroness, Lady Kidron, leads the pack as far as this is concerned. I know that she wants me to say that the noble Baroness, Lady Harding, wished to say that she was extremely sorry not to be able to attend as she wanted to associate herself wholeheartedly with these amendments. She said, “It’s so disappointing still to be fighting for children’s data to have higher protection but it seems that that’s our lot!” I think she anticipated the response, sadly. I very much thank the noble Baroness, Lady Kidron, the noble Lords, Lord Russell and Lord Stevenson, and the noble Viscount, Lord Camrose, in particular for his thoughtful response to Amendment 196.

I was very interested in the intervention from the noble Lord, Lord Stevenson, and wrote down “Not invented here” to sum up the Government’s response to some of these amendments, which has been consistently underwhelming throughout the debates on the DPDI Bill and this Bill. They have brought out such things as “the unintended effects” and said, “We don’t want to interfere with the ICO”, and so on. This campaign will continue; it is really important. Obviously, we will read carefully what the Minister said but, given the troops behind me, I think the campaign will only get stronger.

The Minister did not really deal with the substance of Amendment 196, which was not just a cunning ploy to connect the Bill with the Online Safety Act; it was about current intentions on categorisation. There is considerable concern that the current category 1 thresholds are over-conservative and that we are not covering the smaller, unsafe social media platforms. When we discussed the Online Safety Bill, both in the Joint Committee and in the debates on subsequent stages of the Bill, it was clear that this was about risk, not just size, and we wanted to cover those risky, smaller platforms as well. While I appreciate the Government’s strategic statement, which made its position pretty clear, and without wishing to overly terrorise Ofcom, we should make our view on categorisation clear, and the Government should do likewise.

This argument and debate will no doubt continue. In the meantime, I beg leave to withdraw my amendment.

Amendment 82 withdrawn.
--- Later in debate ---
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

I start by speaking to two amendments tabled in my name.

Amendment 91 seeks to change

“the definition of request by data subjects to data controllers”

that can be declined or

“for which a fee can be charged from ‘manifestly unfounded or excessive’ to ‘vexatious or excessive’”.

I am sure that many of us will remember, without a great deal of fondness, our debates on these terms in the DPDI Bill. When we debated this issue at that time, it was, rather to my regret, often presented as a way to reduce protections and make it easier to decline or charge a fee for a subject access request. In fact, the purpose was to try to filter out cynical or time-wasting requests, such as attempts to bypass legal due process or to bombard organisations with vast quantities of essentially meaningless access requests. Such requests are not unfounded but they are harmful; by reducing them, we would give organisations more time and capacity to respond to well-founded requests. I realise that I am probably on a loser on this one but let me encourage noble Lords one last time to reconsider their objections and take a walk on the vexatious side.

Amendment 97 would ensure that

“AI companies who process data not directly obtained from data subjects are required to provide information to data subjects where possible. Without this amendment, data subjects may not know their data is being held”.

If a subject does not even know that their data is being held, they cannot enforce their data rights.

Amendment 99 follows on from that point, seeking to ensure that AI companies using large datasets cannot avoid providing information to data subjects on the basis that their datasets are too large. Again, if a subject does not know that their data is being held, they cannot enforce their rights. Therefore, it is really important that companies cannot avoid telling individuals about their personal data and the way in which it is being used because of sheer weight of information. These organisations are specialists in such processing of huge volumes of data, of course, so I struggle to accept that this would be too technically demanding for them.

Let me make just a few comments on other amendments tabled by noble Lords. Under Amendment 107, the Secretary of State would have

“to publish guidance within six months of the Act’s passing to clarify what constitutes ‘reasonable and proportionate’ in protection of personal data”.

I feel that this information should be published at the same time as this Bill comes into effect. It serves no purpose to have six months of uncertainty.

I do not believe that Amendment 125 is necessary. The degree to which the Government wish to align—or not—with the EU is surely a matter for the Government and their priorities.

Finally, I was struck by the interesting point that the noble and learned Lord, Lord Thomas, made when he deplored the Bill’s incomprehensibility. I have extremely high levels of personal sympathy with that view. To me, the Bill is the source code. There is a challenge in making it comprehensible and communicating it in a much more accessible way once it goes live. Perhaps the Minister can give some thought to how that implementation phase could include strong elements of communication. While that does not make the Bill any easier to understand for us, it might help the public at large.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - -

My Lords, the problem is that I have a 10-minute speech and there are five minutes left before Hansard leaves us, so is it sensible to draw stumps at this point? I have not counted how many amendments I have, but I also wish to speak to the amendment by the noble and learned Lord, Lord Thomas. I would have thought it sensible to break at this point.

Lord Leong Portrait Lord Leong (Lab)
- Hansard - - - Excerpts

That is a sensible suggestion.