Report (1st Day)
Relevant documents: 3rd Report from the Constitution Committee, 9th and 12th Reports from the Delegated Powers Committee. Scottish, Welsh and Northern Ireland Legislative Consent sought.
16:14
Report received.
Clause 1: Customer data and business data
Amendment 1
Moved by
1: Clause 1, page 3, line 11, at end insert—
“(5A) In subsection (2), references to information includes inferred data.”
Member's explanatory statement
This amendment ensures that, when traders are required to provide information relating to goods, services and digital content supplied or provided to the customer, that information includes information that has been created using AI to build a profile about them.
Baroness Kidron (CB)

My Lords, last week the Government published the AI Opportunities Action Plan and confirmed that they have accepted or partially accepted all 50 of the recommendations from the report’s author, Matt Clifford. Reading the report, there can be no doubting the Government’s commitment to making the UK a welcoming environment for AI companies. What is less clear is how creating the infrastructure and skills pool needed for AI companies to thrive will lead to economic and social benefits for UK citizens.

I am aware that the Government have already said that they will provide further details to flesh out the top-level commitments, including policy and legislative changes, over the coming months. I reiterate the question asked by many noble Lords in Committee: if data is the ultimate fuel and infrastructure on which AI is built, why, given that we have a new Government, is the data Bill going through the House without all the strategic pieces in place? This is a Bill flying blind.

Amendment 1 is very modest and would ensure that information that traders were required to provide to customers on goods, services and digital content included information that had been created using AI to build a profile about them. This is necessary because the data that companies hold about us is already a combination of information proffered by us and information inferred, increasingly, by AI. This amendment would simply ensure that all customer data—our likes and dislikes, buying habits, product uses and so on—was disclosable, whether provided by us or a guesstimate by AI.

The Government’s recent statements have promised to “mainline AI into the veins” of the nation. If AI were a drug, its design and deployment would be subject to governance and oversight to ensure its safety and efficacy. Equally, they have said that they will “unleash” AI into our public services, communities and business. If the rhetoric also included commitments to understand and manage the well-established risks of AI, the public might feel more inclined to trust both AI and the Government.

The issue of how the data Bill fails to address AI—and how the AI Opportunities Action Plan, and the Government’s response to it, fail to protect UK citizens, children, the creative industries and so on—will be a theme throughout Report. For now, I hope that the Government can find their way to agreeing that AI-generated content that forms part of a customer’s profile should be considered personal data for the purposes of defining business and customer data. I beg to move.

Lord Clement-Jones (LD)

My Lords, this is clearly box-office material, as ever.

I support Amendment 1 tabled by the noble Baroness, Lady Kidron, on inferred data. Like her, I regret that we do not have this Bill flying in tandem with an AI Bill. As she said, data and AI go together, and we need to see the two together in context. However, inferred data has its own dangers: inaccuracy and what are called junk inferences; discrimination and unfair treatment; invasions of privacy; a lack of transparency; security risks; predatory targeting; and a loss of anonymity. These dangers highlight the need for strong data privacy protection for consumers in smart data schemes and more transparent data collection practices.

Noble Lords will remember that Cambridge Analytica dealt extensively with inferred data. That company used various data sources to create detailed psychological profiles of individuals going far beyond the information that users explicitly provided. I will not go into the complete history, but, frankly, we do not want to repeat that. Without safeguards, the development of AI technologies could lead to a lack of public trust, as the noble Baroness said, and indeed to a backlash against the use of AI, which could hinder the Government’s ambitions to make the UK an AI superpower. I do not like that kind of boosterish language—some of the Government’s statements perhaps could have been written by Boris Johnson—nevertheless the ambition to put the UK on the AI map, and to keep it there, is a worthy one. This kind of safeguard is therefore extremely important in that context.

Viscount Camrose (Con)

I start by thanking the noble Baroness, Lady Kidron, for introducing this group. I will speak particularly to the amendment in my name but before I do so, I want to say how much I agree with the noble Baroness and with the noble Lord, Lord Clement-Jones, that it is a matter of regret that we are not simultaneously looking at an AI Bill. I worry that this Bill has to take a lot of the weight that an AI Bill would otherwise take, but we will come to that in a great deal more detail in later groups.

I will address the two amendments in this group in reverse order. Amendment 5 in my name and that of my noble friend Lord Markham would remove Clause 13, which makes provision for the Secretary of State or the Treasury to give financial assistance to decision-makers and enforcers—that is, in essence, to act as a financial backstop. While I appreciate the necessity of guaranteeing the stability of enforcers who are public authorities and therefore branches of state, I am concerned that this has been extended to decision-makers. The Bill does not make the identity of a decision-maker clear. Therefore, I wonder who exactly we are protecting here. Unless those individuals or bodies or organisations can be clearly defined, how can we know whether we should extend financial assistance to them?

I raised these concerns in Committee and the Minister assured us at that time that smart data schemes should be self-financing through fees and levies as set out in Clauses 11 and 12 and that this provision is therefore a back-up plan. If that is indeed the case and we are assured of the self-funding nature of smart data schemes, then what exactly makes this necessary? Why must the statutory spending authority act as a backstop if we do not believe there is a risk it will be needed? If we do think there is such a risk, can the Minister elaborate on what it is?

I turn now to the amendment tabled by the noble Baroness, Lady Kidron, which would require data traders to supply customers with information that has been used by AI to build a profile on them. While transparency and explainability are hugely important, I worry that the mechanism proposed here will be too burdensome. The burden would grow linearly with the scale of the models used. Collating and supplying this information would, I fear, increase the cost of doing business for traders. Given AI’s potential to be an immense asset to business, helping generate billions of pounds for the UK economy—and, by the way, I rather approve of the boosterish tone and think we should strive for a great deal more growth in the economy—we should not seek to make its use more administratively burdensome for business. Furthermore, since the information is AI-generated, it is going to be a guess or an assumption or an inference. Therefore, should we require companies to disclose not just the input data but the intermediate and final outputs? Speaking as a consumer, I am not sure that I personally would welcome this. I look forward to hearing the Minister’s responses.

The Minister of State, Department for Science, Innovation and Technology (Lord Vallance of Balham) (Lab)

I thank the noble Baroness, Lady Kidron, and the noble Viscount, Lord Camrose, for their proposed amendments and continued interest in Part 1 of this Bill. I hope I can reassure the noble Baroness that the definition of customer data is purposefully broad. It encompasses information relating to a customer or a trader and the Government consider that this would indeed include inferred data. The specific data to be disclosed under a smart data scheme will be determined in the context of that scheme and I reassure the noble Baroness that there will be appropriate consultation before a smart data scheme is introduced.

I turn to Amendment 5. Clause 13 provides statutory authority for the Secretary of State or the Treasury to give financial assistance to decision-makers, enforcers and others for the purpose of meeting any expense in the exercise of their functions in the smart data schemes. Existing and trusted bodies such as sector regulators will likely lead the delivery of new schemes. These bodies will act as decision-makers and enforcers. It is intended that smart data schemes will be self-financing through the fees and levies produced by Clauses 11 and 12. However, because of the nature of the bodies that are involved, it is deemed appropriate for there to be a statutory spending authority as a backstop provision if that is necessary. Any commitment of resources will, of course, be subject to the usual estimates process and to existing public sector spending controls and transparency requirements.

I hope that with this brief explanation of the types of bodies involved, and the other explanations, the noble Baroness will be content to withdraw Amendment 1 and that noble Lords will not press Amendment 5.

Baroness Kidron (CB)

I thank the Minister for his reassurance, particularly that we will have an opportunity for a consultation on exactly how the smart data scheme works. I look forward to such agreement throughout the afternoon. With that, I beg leave to withdraw my amendment.

Amendment 1 withdrawn.
Clause 2: Power to make provision in connection with customer data
Amendment 2
Moved by
2: Clause 2, page 4, line 1, after “to” insert “the customer's data rights or”
Member's explanatory statement
This amendment adds enacting data rights to the list of actions that the Secretary of State or the Treasury can enable an “authorised person” to take on behalf of customers. This would make it possible for customers to assign their data rights to a third party to activate on their behalf.
Baroness Kidron (CB)

My Lords, in moving Amendment 2 I will speak to Amendments 3, 4, 25, 42 and 43, all of which are in my name and that of the noble Lord, Lord Clement-Jones. The very detailed arguments for Amendments 25, 42 and 43 were made during the DPDI Bill and can be found at col. GC 89 of vol. 837 of Hansard, and the subsequent arguments for their inclusion in this Bill were made in Committee at col. GC 454. For that reason, I do not propose to make them in full again. I simply say that these amendments for data communities represent a more ambitious and optimistic view of the Bill that would empower citizens to use data law to benefit those with common interests. The example I gave last time was of gig workers assigning their data rights to an expert third party to see whether they were being fairly compensated. That is not something that any individual data subject can easily do alone.

The new Amendments 2, 3 and 4 demonstrate how the concept of data communities might work in relation to the Government’s smart data scheme. Amendment 2 would add enacting data rights to the list of actions that the Secretary of State or the Treasury can enable an authorised person to take on behalf of customers. Amendment 3 requires the Secretary of State or the Treasury to include data communities in the list of those who would be able to activate rights, including data rights on a customer’s behalf. Amendment 4 provides a definition of “data communities”.

Data communities are a process by which one data holder can assign their rights for a given purpose to a community of people who agree with that purpose. I share the Government’s desire to empower consumers and to promote innovation, and these amendments would do just that. Allowing the sharing of data rights of individuals, as opposed to specific categories of data, would strengthen the existing proposal and provide economic and social benefit to the UK and its citizens, rather than imagining that the third party is always a commercial entity.

In response to these amendments in Committee, the then Minister said two things. The first was that the UK GDPR does not prevent data subjects authorising third parties to exercise certain rights on their behalf. She also warmly said that something of this kind was being planned by government and invited me and other noble Lords to discuss this area further. I made it clear that I would like such a meeting, but it has only just been scheduled and is planned for next week, which clearly does not meet the needs of the House, since we are discussing this today. I would be grateful if the current Minister could undertake to bring something on this subject back at Third Reading if we are not reassured by what we hear at the meeting.

While the UK GDPR does not prevent data subjects authorising third parties to exercise certain rights on their behalf, in the example I gave the Minister in Committee it took many years and a bespoke agreement between the ICO and Uber for the 300-plus drivers to combine their data. Under equivalent GDPR provisions in European law, it required a Court of Appeal judgment in Norway before Uber conceded that it was entitled to the data on the drivers’ behalf. A right that cannot be activated without legal action and years of effort is not a right fully given; the UK GDPR is not sufficient in these circumstances.

I want to stress that these amendments are not simply about contesting wrongs. Introducing the concept of data communities would facilitate innovation and promote fairness, which is surely an aim of the legislation.

16:30
Before I sit down, I wanted to acknowledge that the AI action plan recommends in many places making it easier for organisations, including commercial companies, to access datasets, but it is silent on how citizens might be able to access and share their data collectively. Instead, it appears to assume that data mining is something that will happen to them, rather than by them or on their behalf. Matt Clifford, its author, is an AI tech investor. While there is much on which to agree with him when it comes to skills or investment in infrastructure, the relentless tech sector viewpoint, rather than that of worker, creator, citizen or child, is a weakness in itself and a problem in its timing. Those of us who would most like to be supportive of the UK being a tech-enabled nation find the needs of our communities and fellow citizens unserved by this unbridled tech utopianism that both recent history and some of the sector’s greatest innovators would suggest is very unwise. I beg to move.
Lord Clement-Jones (LD)

My Lords, the noble Baroness, Lady Kidron, is setting a cracking pace this afternoon, and I am delighted to support her amendments and speak to them. Citizens should have the clear right to assign their data to data communities or trusts, which act as intermediaries between those who hold data and those who wish to use it, and are designed to ensure that data is shared in a fair, safe and equitable manner.

A great range of bodies have explored and support data communities and data trusts. There is considerable pedigree behind the proposals that the noble Baroness has put forward today, starting with a recommendation of the Hall-Pesenti review. We then had the Royal Society and the British Academy talking about data stewardship; the Ada Lovelace Institute has explored legal mechanisms for data stewardship, including data trusts; the Open Data Institute has been actively researching and piloting data trusts in the real world; the Alan Turing Institute has co-hosted a workshop exploring data trusts; and the Royal Society of Arts has conducted citizens’ juries on AI explainability and explored the use of data trusts for community engagement and outreach.

There are many reasons why data communities are so important. They can help empower individuals, give them more control over their data and ensure that it is used responsibly; they can increase bargaining power, reduce transaction costs, address data law complexity and protect individual rights; and they can promote innovation by facilitating data-sharing and the development of new products and services. We need to ensure responsible operation and build trust in data communities. As proposed by Amendment 43 in particular, we should establish a register of data communities overseen by the ICO, along with a code of conduct and complaint mechanisms, as proposed by Amendment 42.

It is high time we move forward on this; we need positive steps. In the words of the noble Baroness, Lady Kidron, we do not just seek assurance that there is nothing to prevent these data communities; we need to take positive steps and install mechanisms to make sure that we can set them up and benefit from that.

Viscount Camrose (Con)

I thank the noble Baroness, Lady Kidron, for leading on this group, and the noble Lord, Lord Clement-Jones, for his valuable comments on these important structures of data communities. Amendments 2, 3, 4 and 25 work in tandem and are designed to enable data communities, meaning associations of individuals who have come together and wish to designate a third party to act on the group’s behalf in their data use.

There is no doubt that the concept of a data community is a powerful idea that can drive innovation and a great deal of value. I thank the noble Lord, Lord Clement-Jones, for cataloguing the many groups that have driven powerful thinking in this area, the value of which is very clear. However—and I keep coming back to this when we discuss this idea—what prevents this being done already? I realise that this may be a comparatively trivial example, but if I wanted to organise a community today to oppose a local development, could I not do so with an existing lawful basis for data processing? It is still not clear in what way these amendments would improve my ability to do so, or would reduce my administrative burden or the risks of data misuse.

I look forward to hearing more about this from the Minister today and, ideally, as the noble Baroness, Lady Kidron, said, in a briefing on the Government’s plan to drive this forward. However, I remain concerned that we do not necessarily need to drive forward this mechanism by passing new legislation. I look forward to the Minister’s comments.

Amendment 42 would require the Information Commissioner to draw up a code of practice setting out how data communities must operate and how data controllers and processors should engage with these communities. Amendment 43 would create a register of data communities and additional responsibilities for the data community controller. I appreciate the intent of the noble Baroness, Lady Kidron, in trying to ensure data security and transparency in the operation of data communities. If we on these Benches supported the idea of their creation in this Bill, we would surely have to implement mechanisms of the type proposed in these amendments. However, this observation confirms us in our view that the administration required to operate these communities is starting to look rather burdensome. We should be looking to encourage the use of data to generate economic growth and to make people’s lives easier. I am concerned that the regulation of data communities, were it to proceed as envisaged by these amendments, might risk doing just the opposite. That said, I will listen with interest to the response of noble Lords and the Minister.

Lord in Waiting/Government Whip (Lord Leong) (Lab)

My Lords, I rise to speak to Amendments 2, 3, 4, 25, 42 and 43. I thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, for these amendments on data communities, which were previously tabled in Committee, and for the new clauses linking these with the Bill’s clauses on smart data.

As my noble friend Lady Jones noted in Committee, the Government support giving individuals greater agency over their data. The Government are strongly supportive of a robust regime of data subject rights and believe strongly in the opportunity presented by data for innovation and economic growth. UK GDPR does not prevent data subjects authorising third parties to exercise certain rights on their behalf. Stakeholders have, however, said that there may be barriers to this in practice.

I reassure noble Lords that the Government are actively exploring how we can support data intermediaries while maintaining the highest data protection standards. It is our intention to publish a call for evidence in the coming weeks on the activities of data intermediaries and the exercise of data subject rights by third parties. This will enable us to ensure that the policy settings on this topic are right.

In the context of smart data specifically, Part 1 of the Bill does not limit who the regulations may allow customers to authorise. Bearing in mind the IT and security-related requirements inherent in smart data schemes, provisions on who a customer may authorise are best determined in the context of a specific scheme, when the regulations are made following appropriate consultation. I hope to provide some additional reassurance that exercise of the smart data powers is subject to data protection legislation and does not displace data rights under that legislation.

There will be appropriate consultation, including with the Information Commissioner’s Office, before smart data schemes are introduced. This year, the Department for Business and Trade will be publishing a strategy on future uses of these powers.

The smart data schemes and digital verification services are initial examples of government action to facilitate data portability and innovative uses of data. My noble friend Lady Jones previously offered the noble Baroness, Lady Kidron, a meeting with officials to discuss these proposals, which I know my officials have arranged for next week—as the noble Baroness indicated earlier. I hope she is therefore content to withdraw her amendment.

Baroness Kidron (CB)

Before the Minister sits down, may I ask whether there is a definition of “customer” and whether that includes a user in the broader sense, or means worker or any citizen? Is it a customer relationship?

Lord Leong (Lab)

My understanding is that “customer” reflects an individual, but I am sure that the Minister will give a better explanation at the meeting with officials next week.

Lord Clement-Jones (LD)

Again before the Minister sits down—I am sure he will not be able to sit down for long—would he open that invitation to a slightly wider group?

Lord Leong (Lab)

I thank the noble Lord for that request, and I am sure my officials would be willing to do that.

Baroness Kidron (CB)

My Lords, I do not intend to detain the House on this for very long, but I want to say that holding meetings after the discussion on Report is not adequate. “Certain rights” and “customer” are exactly the sort of terms that I am trying to address here. To the noble Viscount—and my noble friend—Lord Camrose, I say that it is not adequate, and we have an academic history going back a long way. I hope that the meeting next week is fruitful and that the Government’s enthusiasm for this benefits workers, citizens and customers. I beg leave to withdraw the amendment.

Amendment 2 withdrawn.
Clause 3: Customer data: supplementary
Amendments 3 and 4 not moved.
Clause 13: Financial assistance
Amendment 5 not moved.
Clause 28: DVS trust framework
Amendment 6
Moved by
6: Clause 28, page 30, line 28, at end insert—
“(2A) In preparing the DVS trust framework the Secretary of State must assess whether the public authorities listed in subsection (2B) reliably ascertain the personal data attributes that they collect, record and share.
(2B) The public authorities are—
(a) HM Passport Office;
(b) Driver and Vehicle Licensing Agency;
(c) General Register Office;
(d) National Records Office;
(e) General Register Office for Northern Ireland;
(f) NHS Personal Demographics Service;
(g) NHS Scotland;
(h) NI Health Service Executive;
(i) Home Office Online immigration status (eVisa);
(j) Disclosure and Barring Service;
(k) Disclosure Scotland;
(l) Nidirect (AccessNI);
(m) HM Revenue and Customs;
(n) Welsh Revenue Authority;
(o) Revenue Scotland.”
Member's explanatory statement
This amendment is to ensure that there is oversight that the public authorities that provide core identity information via the information gateway provide accurate and reliable information.
Lord Lucas (Con)

My Lords, in moving Amendment 6 in my name I will also speak to Amendment 8. This section of the Bill deals with digital verification services, and the root word there is verify/veritas—truth. Digital verification input must be truthful for the digital system to work. It is fundamental.

One can find all sorts of workarounds for old analogue systems. They are very flexible. With digital, one has to be precise. Noble Lords may remember the case in November of baby Lilah from Sutton-in-Ashfield, who was registered at birth as male by accident, as she was clearly female. The family corrected this on the birth register by means of a marginal note. There is no provision in law to correct an error on a birth certificate other than a marginal note. That works in analogue—it is there on the certificate—but in digital these are separate fields. In the digital systems, her sex is recorded as male.

16:45
There will be another field called “marginal note”, which nobody will look at much and which you cannot rely on the AI systems to look at either. So it will become really difficult to handle systems where data is inexact and wrong. This particular area is one I hope the Government might find the space to clear up in the course of the rest of the Bill’s passage. We really need to be able to correct errors when they are there.
Amendment 6 is about verifying that the sources of information for verification are good. Amendment 8 is about giving institutions a duty to be accurate. Those things matter if you are dealing with a verification system that will become, for instance, a source of digital identity in pubs and clubs—most obviously for age, but also for sex in terms of using particular facilities. The systems will have drawn their data from the sort of institutions mentioned in Amendment 6 and will regard that data as accurate. It will be extremely difficult for someone in a club to argue with something that appears on a digital verification system. It will be important that that information is accurate.
The Passport Office has allowed people to replace their sex with self-identified gender on passports since the 1960s and, until recently, kept no central records of this. I declare an interest in that my late wife altered her date of birth on her passport in the days when you could do such things, but sex is rather more serious. Doing that officially—to have a passport that shows your sex as something that it is not and to then use that as the basis for a digital verification system—starts to corrupt the whole system. The system is becoming unreliable; it is really important that what gets in there is correct. If you are dealing with use of facilities in clubs and relying on the digital verification system to apply quite reasonably, say, a rule that you have to be female to use the female changing rooms, which are communal, then you need to have a system that is accurate. That means we must take care, as we go into this AI world, to make sure that the data sources we feed in are accurate.
We also have an aspect of this in nursing and domiciliary care, where many people will want intimate care to be provided by people of the same sex. That has always seemed a reasonable request and one at which offence should not be taken. It is quite properly part of a lot of people’s upbringing that they are careful about the way they are exposed in front of the opposite sex. This can apply to males and females. The base of this has to be that the data held on the sex of the workers involved is accurate. There have been several cases recently within the NHS where that has clearly not been the case. We are looking at a government transformation and moving to an AI world. That AI world will be intolerable if it is based on data that is not truthful.
Anyway, what are these organisations doing, knowingly recording untruthful data? How do they manage that under the GDPR? What rights do they have to hold data that they know to be wrong? It seems astonishing to me that this has grown up. In any event, given where we are going and given where this Government are taking us, although I share other noble Lords’ concerns and fears, I am none the less behind the wagon, pushing. There could be some interesting outcomes to AI and what it might offer but we have to get it right. If we allow it to become corrupted, it will not work; it will spread all sorts of inefficiencies and wrongs without us being able to correct it. To get it right at the beginning is important. Tech-enabled should mean truth-enabled. I beg to move.
Lord Arbuthnot of Edrom (Con)

My Lords, I support my noble friend. I have a confession to make. Before this Bill came up, I foolishly thought that sex and gender were the same thing. I have discovered that they are not. Gender is not a characteristic defined in UK law. I believe that you are born with a biological sex, as being male or female, and that some people will choose, or need, to have a gender reassignment or to identify as a different gender. I thank the charity Sex Matters, which works to provide clarity on this issue of sex in law.

As my noble friend Lord Lucas said, the digital verification system currently operates on the basis of chosen gender, not of sex at birth. You can change your records on request without even having a gender recognition certificate. That means that, over the last five years, at least 3,000 people have changed their passports to show the wrong sex. Over the last six years, at least 15,000 people have changed their driving licences. The NHS has no records of how many people now have different sexes recorded from those they had at birth. It is thought that perhaps 100,000 people have one sex indicated in one record and a different sex in another. We cannot go on like that.

The consequences of this are really concerning. It means people with mismatched identities risk being flagged up as a synthetic identity risk. It means authorities with statutory safeguarding responsibilities will not be able to assess the risk that they are trying to deal with. It means that illnesses may be misdiagnosed and treatments misprescribed if the wrong sex is stated in someone’s medical records. The police will be unable to identify people if they are looking in the wrong records. Disclosure and Barring Service checks may fail to match individuals with the wrong sex. I hope that the Government will look again at correcting this. It is a really important issue.

Lord Clement-Jones (LD)

My Lords, I will speak to Amendments 7 and 9. Amendment 7 would require the Secretary of State to lay the DVS trust framework before Parliament. Given the volume of sensitive data that digital ID providers will be handling, it is crucial for Parliament to oversee the framework rules governing digital verification service providers.

The amendment is essentially one that was tabled in Committee by the noble Viscount, Lord Camrose. I thought that he expressed this well in Committee, emphasising that such a fundamental framework demands parliamentary approval for transparency and accountability, regardless of the document’s complexity. This is an important framework with implications for data privacy and security, and should not be left solely to the discretion of the Secretary of State.

The DPRRC in its ninth report and the Constitution Committee in its third report of the Session also believed that the DVS trust framework should be subject to parliamentary scrutiny. The former, because the framework has legislative effect, recommended using the affirmative procedure, which would require Parliament to actively approve it, since the Secretary of State otherwise has significant power without adequate parliamentary involvement. The latter committee, the Constitution Committee, said:

“We reiterate our statement from our report on the Data Protection and Digital Information Bill that ‘[d]ata protection is a matter of great importance in maintaining a relationship of trust between the state and the individual. Access to personal data is beneficial to the provision of services by the state and assists in protecting national security. However, the processing of personal data affects individual rights, including the right to respect for private life and the right to freedom of expression. It is important that the power to process personal data does not become so broad as to unduly limit those rights’”.


Those views are entirely consistent with the committee’s earlier stance on a similar provision in the previous Data Protection and Digital Information Bill. That was why it was so splendid that the noble Viscount tabled that amendment in Committee. It was like a Damascene conversion.

The noble Baroness, Lady Jones, argued in Committee and in correspondence that the trust framework is a highly technical document that Parliament might find difficult to understand. That is a bit of a red rag to a bull. However, this argument fails to address the core concerns about democratic oversight. The framework aims to establish a trusted digital identity marketplace by setting requirements for providers to gain certification as trusted providers.

I am extremely grateful to the Minister, the Bill team and the department for allowing officials to give the noble Viscount, Lord Camrose, and me a tutorial on the trust framework. It depends heavily on being voluntary in nature, with the UK Accreditation Service essentially overseeing the certifiers, such as BSI, Kantara and the Age Check Certification Scheme, which certify the providers, with ISO 17065 as the governing standard.

Compliance is assured through the certification process, where services are assessed against the framework rules by independent conformity assessment bodies accredited by the UK Accreditation Service. The trust framework establishes rules and standards for digital identity verification but does not directly contain specific provision for regulatory oversight or for redress mechanisms such as a specific ombudsman service, industry-led dispute resolution or set contract terms for consumer redress or enforcement powers. The Government say, however, that they intend to monitor the types of complaints received. Ultimately, the scope of the framework is limited to the rules providers must follow in order to remain certificated, and it does not address governance matters.

Periodic certification alone is not enough to ensure ongoing compliance and highlights the lack of an independent mechanism to hold the Secretary of State accountable. The noble Baroness, Lady Jones, stated in Committee that the Government preferred a light-touch approach to regulating digital verification services. She believed that excessive parliamentary scrutiny would hinder innovation and flexibility in this rapidly evolving sector.

The Government have consistently emphasised that they have no plans to introduce mandatory digital IDs or ID cards. The focus is on creating a secure and trusted system that gives citizens more choice and control over their data. The attributes trust framework is a crucial step towards achieving the goal of a secure, trusted and innovative digital identity market—all the more reason to get the process for approval right.

These services will inevitably be high-profile. Digital ID is a sensitive area which potentially also involves age verification. These services could have a major impact on data privacy and security. Public debate on such a critical issue is crucial to build trust and confidence in these systems. Laying the DVS trust framework before Parliament would allow for a wider range of voices and perspectives to be heard, ensuring a more robust and democratic approval process.

17:00
The lack in the framework of specific redress mechanisms and a dedicated regulator further underscores the need for parliamentary oversight to protect individuals’ rights and interests in this rapidly evolving digital landscape. In his letters to the chairs of those two committees, the Minister, the noble Lord, Lord Vallance, made similar arguments to those made by the noble Baroness, Lady Jones. By the way, we wish the noble Baroness well and appreciate the baton being picked up for this Bill by the noble Lord.
I hope that I have made the case and that the final paragraph the Minister put in his letter does not counteract what I have to say about the benefits of parliamentary approval. The Minister writes that he hopes his letter will
“provide some helpful context … the Government remains of the view that it does not require parliamentary scrutiny, because its primary role is in the conformity assessment space which sits outside of the Bill”.
If anything, the letter makes the arguments of the Constitution Committee and the DPRRC stronger.
I turn to Amendment 9. The rapid growth of digital services and the potential for misuse emphasise the need for a review of a digital identity offence, as proposed in this amendment. As I pointed out to both this Government and the last one, there is currently no specific crime of digital identity theft, despite various laws that address related offences such as fraud, using a false identity and unauthorised computer access. This gap in legislation leaves the public vulnerable to the harms of digital identity theft. Creating a new specific offence of digital identity theft would better protect those who use digital identity online, ensuring that they had the same protection as they do in the physical world. Existing laws do not adequately cover the nuances of digital identity theft, and a clear criminal offence would serve as a deterrent to malicious actors. As I argued in Committee, the Government should follow the recommendations of the committee chaired by the noble Baroness, Lady Morgan, which produced the 2022 report Fighting Fraud: Breaking the Chain, which concluded that a specific criminal offence for identity theft was necessary, or that identity theft should be considered a serious aggravating factor in cases of fraud.
In Committee, the noble Baroness, Lady Jones, said that existing legislation, such as the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018, already addressed the behaviour targeted by the amendment. She said that new offences were unnecessary and could lead to overcriminalisation. Defining every instance of verification as an identity document under the Identity Documents Act 2010 could create
“an unclear … and duplicative process for … prosecution”—[Official Report, 3/12/24; col. GC 382.]
was another of her arguments. However, while existing legislation might touch on aspects of digital identity theft, the Fraud Act 2006 does not explicitly address the unique challenges posed by digital identity theft. The lack of a specific offence creates ambiguity and could allow perpetrators to exploit loopholes. Creating a specific offence would provide clarity and demonstrate a commitment to tackling this growing offence. By conducting a thorough review, the Government could ensure that the legal framework effectively combated digital identity theft while promoting a secure and trustworthy digital environment for individuals and businesses.
As for the approach of these Benches to the amendments tabled by the noble Lords, Lord Lucas and Lord Arbuthnot, I have some sympathy for the desire for accuracy in the records covered by digital identity services, and I hope the Government will be able to give assurances about that. However, we do not wish to turn this into a battle and a culture war opportunity, so we will not be supporting the noble Lords if they push them to a vote.
The Earl of Erroll (CB)

My Lords, I very much support the amendments from the noble Lords, Lord Lucas and Lord Arbuthnot, particularly Amendment 6, about accuracy. It has become apparent—and Committee stage was interesting—that there is a challenge with treating gender and sex as interchangeable. The problem becomes physical, because you cannot avoid the fact that you will react differently medically to certain things according to the sex you were born with and to your DNA.

That can be very dangerous in two cases. The first case is where drugs or cures are being administered by someone who thinks they are treating a patient of one sex but they are actually a different sex. That could kill someone, quite happily. The second case is if you are doing medical research and relying on something, but then find that half the research is invalid because a person is not actually that sex but has decided to choose another gender. Therefore, all the research on that person could be invalid. That could lead to cures being missed, other things being diagnosed as being all right, and a lot of dangers.

As a society, we have decided that it will be all right for people to change gender—let us say that, as I think it is probably the easiest way to describe it. I do not see any problem with that, but we need critical things to be kept on records that are clearly separate. Maybe we can make decisions in Parliament, or wherever, about what you are allowed to declare on identity documents such as a passport. We need to have two things: one is sex, which is immutable, and therefore can help with all the other things behind the scenes, including research and treatments; the other is gender, which can be what you wish to declare, and society accepts that you can declare yourself as being of another gender. I cannot see any way round that. I have had discussions with people about this, and as one who would have said that this is quite wrong and unnecessary, I was convinced by the end of those discussions that it was right. Keeping the two separate in our minds would solve a lot of problems. These two amendments are vital for that.

I agree in many ways with the points from the noble Lord, Lord Clement-Jones. Just allowing some of these changes to be made by the stroke of a pen—a bit like someone is doing across the Atlantic—without coming to Parliament, is perhaps unwise sometimes. The combined wisdom of Parliament, looking at things from a different point of view, and possibly with a more societal point of view than the people who are trying to make systems work on a governmental basis, can be sensible and would avoid other mistakes being made. I certainly support his amendments, but I disagree entirely with his last statement where he did not support the noble Lords, Lord Lucas and Lord Arbuthnot.

Viscount Camrose (Con)

I thank my noble friend Lord Lucas for introducing this group and for bringing these important and sometimes very difficult matters to the attention of the House. I will address the amendments slightly out of order, if I may.

For digital verification services to work, the information they have access to and use to verify documents must be accurate; this is, needless to say, critical to the success of the entire scheme. Therefore, it is highly sensible for Amendment 8 to require public authorities, when they disclose information via the information gateway, to ensure that it is accurate and reliable and that they can prove it. By the same measure, Amendment 6, which requires the Secretary of State to assess whether the public authorities listed are collecting accurate information, is equally sensible. These amendments as a pair will ensure the reliability of DVS services and encourage the industry to flourish.

I would like to consider the nature of accurate information, especially regarding an individual’s biological sex. It is possible for an individual to change their recorded sex on their driving licence or passport, for example, without going through the process of obtaining a gender recognition certificate. Indeed, a person can change the sex on their birth certificate if they obtain a GRC, but many would argue that changing some words on a document does not change the reality of a person’s genome, physical presentation and, in some cases, medical needs, meaning that the information recorded does not accurately relate to their sex. I urge the Minister to consider how best to navigate this situation, and to acknowledge that it is crucially important, as we have heard so persuasively from the noble Earl, Lord Erroll, and my noble friends Lord Arbuthnot and Lord Lucas, that a person’s sex is recorded accurately to facilitate a fully functioning DVS system.

The DVS trust framework has the potential to rapidly transform the way identities and information are verified. It should standardise digital verification services, ensure reliability and build trust in the concept of a digital verification service. It could seriously improve existing, cumbersome methods of verifying information, saving companies, employers, employees, landlords and tenants time and money. Personally, I have high hopes of its potential to revolutionise the practices of recruitment. I certainly do not know many people who would say no to less admin. If noble Lords are minded to test the opinion of the House, we will certainly support them with respect to Amendments 6 and 8.

With the greatest respect to the noble Lord, Lord Clement-Jones, I think it is a mistake to regard this as part of some culture war struggle. As I understand it, this is about accuracy of data and the importance, for medical and other reasons, of maintaining accurate data.

All the benefits of DVS cannot be to the detriment of data privacy and data minimisation. Parliament is well-practised at balancing multiple competing concepts and doing so with due regard to public opinion. Therefore, Amendment 7 is indeed a sensible idea.

Finally, Amendment 9 would require the Secretary of State to review whether an offence of false use of identity documents created or verified by a DVS provider is needed. This is certainly worth consideration. I have no doubt that the Secretary of State will require DVS providers to take care that their services are not being used with criminal intent, and I am quite sure that DVS service providers do not want to facilitate crimes. However, the history of technology is surely one of high-minded purposes corrupted by cynical practices. Therefore, it seems prudent for the Secretary of State to conduct a review into whether creating this offence is necessary and, if it is, the best way that it can be laid out in law. I look forward to hearing the Minister’s comments on this and other matters.

Lord Vallance of Balham (Lab)

I thank the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Arbuthnot, for their amendments and interest in the important area of digital verification services. I thank the noble Viscount, Lord Camrose, for his support for this being such an important thing to make life easier for people.

I will go in reverse order and start with Amendment 9. I thank the noble Lord, Lord Clement-Jones, for reconsidering his stance since Committee on the outright creation of these offences. Amendment 9 would create an obligation for the Secretary of State to review the need for digital identity theft offences. We believe this would be unnecessary, as existing legislation—for example, the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018—already addresses the behaviour targeted by this amendment.

However, we note the concerns raised and confirm that the Government are taking steps to tackle the issue. First, the Action Fraud service, which allows individuals to report fraud enabled by identity theft, is being upgraded with improved reporting tools, increased intelligence flows to police forces and better support services for victims. Secondly, the Home Office is reviewing the training offered to police officers who have to respond to fraud incidents, and identifying the improvements needed.

Lord Clement-Jones (LD)

I am sorry to interrupt the Minister. He is equating digital identity theft to fraud, and that is not always the case. Is that the advice that he has received?

Lord Vallance of Balham (Lab)

The advice is that digital identity theft would be captured by those Acts. Therefore, there is no need for a specific offence. However, as I said, the Government are taking steps to tackle this and will support the Action Fraud service as a way to deal with it, even though I agree that not everything falls under that classification as fraud.

17:15
Amendment 7 would require the DVS trust framework to be laid before Parliament. The trust framework is a very technical document that sets the rules that digital verification services can be certified against and requires the Secretary of State to consult when preparing, publishing or revising the trust framework following an annual review. These rules, now in their fourth non-statutory version on GOV.UK, draw on existing technical requirements, standards, best practice, guidance and legislation. Compliance with the rules is ensured by third-party independent auditors—the conformity assessment bodies—which certify digital verification services when they are compliant with the trust framework. Similar certification schemes exist across numerous industries for providing quality assurance.
Although the Secretary of State has powers in the Bill relating to the trust framework, the primary role of the framework in practice is to provide baseline rules against which digital verification services can be assessed by the conformity assessment bodies. That process takes place outside the Bill and relies on tried and trusted accreditation processes, overseen by the United Kingdom Accreditation Service. For these reasons, and for the reason that this is indeed a process that exists and works, the Government remain of the view that the trust framework does not require parliamentary scrutiny.
The rules in the framework are likely to act as a robust baseline for the independent conformity assessment process. Schemes such as this exist in many sectors, as I have said, and draw heavily on existing standards. The Secretary of State will have to undertake an annual review and consult the Information Commissioner and other appropriate stakeholders as part of that process. The trust framework’s development will be informed by industry and regulatory knowledge as the market evolves.
Lord Clement-Jones (LD)

I am sorry to interrupt the Minister again, but could he therefore confirm that, by reiterating his previous view that the Secretary of State should not have to bring the framework to Parliament, he disagrees with both the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, both of which made the same point on this occasion and on the previous Bill—that Parliament should look at the trust framework?

Lord Vallance of Balham (Lab)

For the reasons that I have given, I think that the trust framework is a technical document and one best dealt with in this technical form. It is built on other assurance processes, with the United Kingdom Accreditation Service overseeing the conformity accreditation bodies that will test the digital verification services. In this case, our view is that it does not need to come under parliamentary scrutiny.

On Amendments 6 and 8 from the noble Lord, Lord Lucas, I am absolutely behind the notion that the validity of the data is critical. We have to get this right. Of course, the Bill itself takes the data from other sources, and those sources have authority to get the information correct, but it is important, for a digital service in particular, that this is dealt with very carefully and that we have good assurance processes.

On the specific point about gender identity, the Bill does not create or prescribe new ways in which to determine that, but work is ongoing to try to ensure that there is consistency and accuracy. The Central Digital and Data Office has started to progress work on developing data standards on key entities and their attributes to ensure that the way data is organised, stored and shared is consistent between public authorities. Work has also been commenced via the domain expert group on the person entity, which has representatives from the Home Office, HMRC, the Office for National Statistics—importantly—NHS England, the Department for Education, the Ministry of Justice, the Local Government Association and the Police Digital Service. The group has been established as a pilot under the Data Standards Authority to help to ensure consistency across organisations, and specific pieces of work are going on relating to gender in that area.

The measures in Part 2 are intended to help secure the reliability of the process through which citizens can verify their identity digitally. They do not intervene in how government departments record and store identity data. In clarifying this important distinction, and with reference to the further information I will set out, I cannot support the amendments.

Lord Arbuthnot of Edrom (Con)

I would be grateful if the Minister could confirm whether he accepts that, on some occasions, passports and drivers’ licences inaccurately reflect the sex of their holders.

Lord Vallance of Balham (Lab)

I can be absolutely clear that we must have a single version of the truth on this. There needs to be a way to verify it consistently and there need to be rules. That is why the ongoing work is so important. I know from my background in scientific research that, to know what you are dealing with, data is the most important thing to get. Making sure that we have a system to get this clear will be part of what we are doing.

Amendment 6 would require the Secretary of State to assess which public authorities can reliably verify related facts about a person in the preparation of the trust framework. This exercise is out of scope of the trust framework, as the Good Practice Guide 45—a standard signposted in the trust framework—already provides guidance for assessing the reliability of authoritative information across a wide range of use cases covered by the trust framework. Furthermore, the public authorities mentioned are already subject to data protection legislation which requires personal data processed to be accurate and, where relevant, kept up to date.

Amendment 8 would require any information shared by public authorities to be clearly defined, accompanied by metadata and accurate. The Government already support and prioritise the accuracy of the data they store, and I indicated the ongoing work to make sure that this continues to be looked at and improved. This amendment could duplicate or potentially conflict with existing protections under data protection legislation and/or other legal obligations. I reassure noble Lords that the Government believe that ensuring the data they process is accurate is essential to deliver services that meet citizens’ needs and ensure accurate evaluation and research. The Central Digital and Data Office has already started work on developing data standards on key entities and their attributes to ensure that the way data is organised, stored and shared is consistent.

It is our belief that these matters are more appropriately considered together holistically, rather than by a piecemeal approach through diverse legislation such as this data Bill. As such, I would be grateful if noble Lords would consider withdrawing their amendments.

Lord Lucas Portrait Lord Lucas (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I am very grateful to all noble Lords who have spoken on this. I actually rather liked the amendments of the noble Lord, Lord Clement-Jones—if I am allowed to reach across to him—but I think he is wrong to describe Amendments 6 and 8 as “culture war”. They are very much about AI and the fundamentals of digital. Self-ID is an attractive thought; I would very much like to self-identify as a life Peer at the moment.

None Portrait Noble Lords
- Hansard -

Oh!

Lord Lucas Portrait Lord Lucas (Con)
- View Speech - Hansard - - - Excerpts

However, the truth should come before personal feelings, particularly when looking at data and the fundamentals of society. I hope that the noble Lord will take parliamentary opportunities to bring the framework in front of Parliament when it appears. I agree with him that Parliament should take an interest in and look at this, and I hope we will be able to do that through a short debate at some stage—or that he will be able to, because I suspect that I shall not be here to do so. It is important that, where such fundamental rights and the need for understanding are involved, there is a high degree of openness. However expert the consideration the Government may give this through the mechanisms the Minister has described, I do not think they go far enough.

So far as my own amendments are concerned, I appreciate very much what the Minister has said. We are clearly coming from the same place, but we should not let the opportunity of this Bill drift. We should put down the marker here that this is an absolutely key part of getting data and government right. I therefore beg leave to test the opinion of the House.

17:25

Division 1

Ayes: 205

Noes: 159

Amendment 7
Moved by
7: Clause 28, page 31, line 22, at end insert—
“(11) The Secretary of State must lay the DVS trust framework before Parliament.”
Member's explanatory statement
This amendment will ensure Parliamentary oversight of the rules with which digital verification service providers must comply.
Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, I support the conclusions of the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, and I beg leave to seek the opinion of the House.

17:40

Division 2

Ayes: 87

Noes: 157

17:50
Clause 45: Power of public authority to disclose information to registered person
Amendment 8
Moved by
8: Clause 45, page 42, line 23, at end insert—
“(5A) A public authority must not disclose information about an individual under this section unless the information—
(a) is clearly defined and accompanied by metadata, and
(b) the public authority is able to attest that it—
(i) was accurate at the time it was recorded, and
(ii) has not been changed or tampered, or
(c) the public authority is able to attest that it—
(i) has been corrected through a lawfully made correction, and
(ii) was accurate at the time of the correction.”
Member’s explanatory statement
This amendment is to ensure that public authorities that disclose information via the information gateway provide accurate and reliable information and that if the information has been corrected it is the correct information that is provided.
Amendment 8 agreed.
Amendment 9 not moved.
Clause 56: National Underground Asset Register: England and Wales
Amendment 10
Moved by
10: Clause 56, page 52, line 13, leave out “undertaker’s” and insert “contractor’s”
Member’s explanatory statement
New section 106B(6) of the New Roads and Street Works Act 1991 (defence where certain people have taken reasonable care) refers to “the undertaker’s employees” twice. This amendment corrects that by replacing one of those references with a reference to “the contractor’s employees”.
Lord Vallance of Balham Portrait Lord Vallance of Balham (Lab)
- Hansard - - - Excerpts

My Lords, Amendments 10 and 12 seek to amend Clauses 56 and 58, which form part of the national underground asset register provisions. These two minor, technical amendments address a duplicate reference to “the undertaker’s employees” and replace it with the correct reference to “the contractor’s employees”. I reassure noble Lords that the amendments do not have a material policy effect and are intended to correct the drafting. I beg to move.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, I thank the Minister for these two technical amendments. I take this opportunity to thank him also for responding to correspondence about LinesearchbeforeUdig and its wish to meet government and work with existing services to deliver what it describes as the safe digging elements of the NUAR. The Minister has confirmed that the heavy lifting on this—not heavy digging—will be carried out by the noble Baroness, Lady Jones, on her return, which I am sure she will look forward to. As I understand it, officials will meet LinesearchbeforeUdig this week, and they will look at the survey carried out by the service. We have made some progress since Committee, and I am grateful to the Minister for that.

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

My Lords, given that these are technical amendments, correcting wording errors, I have little to add to the remarks already made. We have no concerns about these amendments and will not seek to oppose the Government in making these changes.

Amendment 10 agreed.
Amendment 11
Moved by
11: Clause 56, page 53, line 17, at end insert—
“(2A) The Secretary of State must provide guidance to relevant stakeholders on cyber-security measures before they may receive information from NUAR.”
Member's explanatory statement
This amendment will require the Secretary of State to provide guidance to relevant stakeholders on security measures before they receive information from NUAR.
Viscount Camrose Portrait Viscount Camrose (Con)
- Hansard - - - Excerpts

My Lords, I will speak to Amendments 11 and 13 in my name and that of my noble friend Lord Markham. The national underground asset register contains the details of all underground assets and apparatus in England, Wales and Northern Ireland, or at any rate it will do as it goes forward. This includes water pipes, electricity cables, internet cables and fibres—details of the critical infrastructure necessary to sustain the UK as we know it.

Needless to say, there are many hostile actors who, if they got their hands on this information, would or could use it to commit appalling acts of terror. I am mindful of and grateful for the Government’s assurances given in Committee that it is and will be subject to rigorous security measures. However, the weakest link in cyber defence is often third-party suppliers and other partners who do not recognise the same level of risk. We should take every possible measure to ensure that the vital data in NUAR is kept safe and shared only with stakeholders who have the necessary security provisions in place.

For this reason, I have tabled Amendment 11, which would require the Secretary of State to provide guidance to relevant stakeholders on the cybersecurity measures which should be in place before they receive information from NUAR. I do not believe this would place a great burden on government departments, as appropriate cybersecurity standards already exist. The key is to ensure that they are duly observed.

I cannot overstate the importance of keeping this information secure, but I doubt noble Lords need much convincing on that score. Given how frighteningly high the stakes are, I strongly urge the most proactive possible approach to cybersecurity, advising stakeholders and taking every possible step to keep us all safe.

Amendment 13, also tabled in my name, requires the Registrar-General to make provisions to ensure the cybersecurity of the newly digitised registers of births, still-births, and deaths. There are a great many benefits in moving from a paper-based register of births and deaths to a digitised version. People no longer have to make the trip to sign the register in person, saving time and simplifying the necessary admin at very busy or very difficult points in people’s lives. It also reduces the number of physical documents that need to be maintained and kept secure. However, in digitising vast quantities of personal, valuable information, we are making a larger attack surface which will appeal to malign actors looking to steal personal data.

I know we discussed this matter in Committee, when the noble Baroness the Minister made the point that this legislation is more about a digitisation drive, in that all records will now be digital rather than paper and digital. While I appreciate her summary, I am not sure it addresses my concerns about the security risks of shifting to a purely digital model. We present a large and tempting attack surface, and the absence of paper back-ups increases the value of digital information even more, as it is the only register. Of course, there are already security measures in place for the digital copies of these registers. I have no doubt we have back-ups and a range of other fallback opportunities. But the same argument applies.

Proactive cybersecurity provisions are required, taking into account the added value of these registers and the ever-evolving threat we face from cybercriminals. I will listen with great interest to the thoughts of other noble Lords and the Minister.

Lord Leong Portrait Lord Leong (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I thank the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, for these amendments. Clause 56 forms part of the NUAR provisions. The security of NUAR remains of the utmost importance. Because of this, the Government have closely involved a wide range of security stakeholders in the development of NUAR, including the National Protective Security Authority and security teams from the asset owners themselves. Providing clear acceptable use and usage policies for any digital service is important. As such, we intend to establish clear guidance on the appropriate usage of NUAR, including what conditions end users must fulfil before gaining access to the service. This may include cybersecurity arrangements, as well as personal vetting. However, we do not feel it appropriate to include this in the Bill.

Care must be taken when disclosing platform-specific cybersecurity information, as this could provide bad actors with greater information to enable them to counter these measures, ultimately making NUAR less secure. Furthermore, regulations made in relation to access to information from NUAR would be subject to the affirmative procedure. As such, there will be future opportunities for relevant committees to consider in full these access arrangements, including, on an individual basis, any security impacts. I therefore reassure noble Lords that these measures will ensure that access to NUAR data is subject to appropriate safeguards.

18:00
Turning to Amendment 13, also tabled by the noble Viscount, the registration online system has been in place for births, stillbirths and deaths since 2009. The system is protected to Home Office security standards and employs a range of anti-cyberattack best practices through the deployment of advanced, fully managed firewalls and intrusion detection systems. The data is replicated to a secure cloud platform every 30 minutes and robust measures are in place to protect it. Articles 25 and 32 of the UK general data protection regulation impose duties on controllers of personal data to implement appropriate technical and organisational measures, including security measures. Therefore, legislation is already in place to ensure the security of the electronic registers. The robust security measures the Home Office has in place ensure that we are complying with these statutory obligations.
With those explanations, I hope that the noble Viscount will be content to withdraw Amendment 11.
Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

I thank the Minister for his considered reply. It is clear that the Government and the department are taking the issue of security with all due seriousness. However, I remain concerned, particularly about the move to NUAR as a highly tempting attack surface for malign actors. In light of this, I am minded to test the opinion of the House.

18:02

Division 3

Ayes: 186

Noes: 162

18:13
Clause 58: National Underground Asset Register: Northern Ireland
Amendment 12
Moved by
12: Clause 58, page 62, line 34, leave out “undertaker’s” and insert “contractor’s”
Member’s explanatory statement
New Article 45B(6) of the Street Works (Northern Ireland) Order 1995 (defence where certain people have taken reasonable care) refers to “the undertaker’s employees” twice. This amendment corrects that by replacing one of those references with a reference to “the contractor’s employees”.
Amendment 12 agreed.
Clause 61: Form in which registers of births and deaths are to be kept
Amendment 13 not moved.
Clause 67: Meaning of research and statistical purposes
Amendment 14
Moved by
14: Clause 67, page 75, line 10, after “scientific” insert “and that is conducted in the public interest”
Member’s explanatory statement
This amendment ensures that to qualify for the scientific research exception for data reuse, that research must be in the public interest. This requirement already exists for medical research, but this amendment would apply it to all scientific research wishing to take advantage of the exception.
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- Hansard - - - Excerpts

My Lords, I thank my noble friend Lady Kidron and the noble Viscount, Lord Camrose, for adding their signatures to my Amendment 14. I withdrew this amendment in Committee, but I am now asking the Minister to consider once again the definition of “scientific research” in the Bill. If he cannot satisfy me in his speech this evening, I will seek the opinion of the House.

I have been worried about the safeguards for defining scientific research since the Bill was published. This amendment will require that the research should be in “the public interest”, which I am sure most noble Lords will agree is a laudable aim and an important safeguard. This amendment has been looked at in the context of the Government’s recent announcements on turning this country into an AI superpower. I am very much a supporter of this endeavour, but across the country there are many people who are worried about the need to set up safeguards for their data. They fear data safety is threatened by this explosion of AI and its inexorable development by the big tech companies. This amendment will go some way to building public trust in the AI revolution.

The vision of Donald Trump surrounded at his inauguration yesterday by tech billionaires, most of whom have until recently been Democrats, puts the fear of God into me. I fear their companies are coming for our data. We have some of the best data in the world, and it needs to be safeguarded. The AI companies are spending billions of dollars developing their foundation models, and they are beholden to their shareholders to minimise the cost of developing these models.

Clause 67 gives a huge fillip to the scientific research community. It exempts research which falls within the definition of scientific research as laid out in the Bill from having to gain new consent from data subjects to reuse millions of points of data.

It costs time and money for the tech companies to get renewed consent from data holders before reusing their data. This is an issue we will discuss further when we debate amendments on scraping data from creatives without copyright licensing. It is clear from our debates in Committee that many noble Lords fear that AI companies will do what they can to avoid either getting consent or licensing data for use in scraping data. Defining their research as scientific will allow them to escape these constraints. I could not be a greater supporter of the wonderful scientific research that is carried out in this country, but I want the Bill to ensure that it really is scientific research and not AI development camouflaged as scientific research.

The line between product development and scientific research is often blurred. Many developers posit efforts to increase model capabilities, efficiency, or indeed the study of their risks, as scientific research. The balance has to be struck between allowing this country to become an AI superpower and exploiting its data subjects. I contend that this amendment will go far to allay public fears of the abuse and use of their data to further the profits and goals of huge AI companies, most of which are based in the United States.

Noble Lords have only to look at the outrage last year at Meta’s use of Instagram users’ data without their consent to train the datasets for its new Llama AI model to understand the levels of concern. There were complaints to regulators, and the ICO posted that Meta

“responded to our request to pause and review plans to use Facebook and Instagram user data to train generative AI”.

However, so far, there has been no official change to Meta’s privacy policy that would legally bind it to stop processing data without consent for the development of its AI technologies, and the ICO has not issued a binding order to stop Meta’s plans to scrape users’ data to train its AI systems. Meanwhile, Meta has resumed reusing subjects’ data without their consent.

I thank the Minister for meeting me and talking through Amendment 14. I understand his concerns that adding a public interest threshold to the definition of scientific research will create a heavy burden on researchers, but I think it is worth the risk in the name of safety. Some noble Lords are concerned about the difficulty of defining “public interest”. However, the ICO has very clear guidelines about what public interest consists of. It states that

“you should broadly interpret public interest in the research context to include any clear and positive public benefit likely to arise from that research”.

It continues:

“The public interest covers a wide range of values and principles about the public good, or what is in society’s best interests. In making the case that your research is in the public interest, it is not enough to point to your own private interests”.


The guidance even includes further examples of research in the public interest, such as

“the advancement of academic knowledge in a given field … the preservation of art, culture and knowledge for the enrichment of society … or … the provision of more efficient or more effective products and services for the public”.

This guidance is already being applied in the Bill to sensitive data and public health data. I contend that if these carefully thought-through guidelines are good enough for health data, they should be good enough for all scientific data.

This view is supported in the EU, where

“the special data protection regime for scientific research is understood to apply where … the research is carried out with the aim of growing society’s collective knowledge and wellbeing, as opposed to serving primarily one or several private interests.”

The Minister will tell the House that the data exempted to be used for scientific research is well protected—that it has both the lawfulness test, as set out in the UK GDPR, and a reasonableness test. I am concerned that the reasonableness test in this Bill references

“processing for the purposes of any research that can reasonably be described as scientific, whether publicly or privately funded and whether carried out as a commercial or non-commercial activity”.

Normally, a reasonableness test requires an expert in the context of that research to decide whether it is reasonable to consider it scientific. However, in this Bill, “reasonable” just means that an ordinary person in the street can decide whether it is reasonable to consider the research scientific. This must be a broadening of the threshold of the definition.

It seems “reasonable” in the current climate to ask the Government to include a public interest test before giving the AI companies extensive scope to reuse our data, without getting renewed consent, on the pretext that the work is for scientific research. In the light of possible deregulation of the sector by the new regime in America, it is incumbent on this country to ensure that our scientific research is dynamic, but safe. If the Government can bring this reassurance, then for millions of people in this country they will increase trust in Britain’s AI revolution. I beg to move.

Baroness Kidron Portrait Baroness Kidron (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I support my noble friend Lord Colville. He has made an excellent argument, and I ask noble Lords on the Government Benches to think about it very carefully. If it is good enough for health data, it is good enough for the rest of science. In the interest of time, I will give an example of one of the issues, rather than repeat the excellent argument made by my noble friend.

In Committee, I asked the Government three times whether the cover of scientific research could be used, for example, to market-test ways to hack human responses to dopamine in order to keep children online. In the Minister’s letter, written during Committee, she could not say that the A/B testing of millions of children to make services more sticky—that is, more addictive—would not be considered scientific, but rather that the regulator, the ICO, could decide on a case-by-case basis. That is not good enough.

There is no greater argument for my noble friend Lord Colville’s amendment than the fact that the Government are unable to say if hacking children’s attention for commercial gain is scientific or not. We will come to children and child protection in the Bill in the next group, but it is alarming that the Government feel able to put in writing that this is an open question. That is not what Labour believed in opposition, and it is beyond disappointing that, now in government, Labour has forgotten what it then believed. I will be following my noble friend through the Lobby.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, it is almost impossible to better the arguments put forward by the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, so I am not even going to try.

The inclusion of a public interest requirement would ensure that the use of data for scientific research would serve a genuine societal benefit, rather than primarily benefiting private interests. This would help safeguard against the misuse of data for purely commercial purposes under the guise of research. The debate in Committee highlighted the need for further clarity and stronger safeguards in the Bill, to ensure that data for scientific research genuinely serves the public interest, particularly concerning the sensitive data of children. The call for a public interest requirement reflects the desire to ensure a balance between promoting research and innovation and upholding the rights and interests of data subjects. I very much hope that the House will support this amendment.

Lord Sentamu Portrait Lord Sentamu (CB)
- View Speech - Hansard - - - Excerpts

My Lords, we are playing a bit of Jack-in-the-box. When I was being taught law by a wonderful person from Gray’s Inn, who was responsible for drafting the constitution of Uganda’s independence, Sir Dingle Foot, he said a phrase which struck me, and which has always stayed with me: law is a statement of public policy. The noble Viscount, Lord Colville, seeks that if there is to be scientific work, it must be conducted “in the public interest”. Law simply does not express itself for itself; it does it for the public, as a public policy. It would be a wonderful phrase to include, and I hope the Minister will accept it so that we do not have to vote on it.

Lord Lucas Portrait Lord Lucas (Con)
- View Speech - Hansard - - - Excerpts

My Lords, the regulator quite clearly needs a standard against which to judge. Public interest is the established one in FOI, medicine and elsewhere. It is the standard that is used when I apply for data under the national pupil database—and quite right too. It works well, it is flexible, it is well understood and it is a decent test to meet. We really ought to insist on it today.

Earl of Erroll Portrait The Earl of Erroll (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I want to add very quickly that we have got a problem here. If someone did take all this private data because we did not put this block on them, and they then had it, it would probably become their copyright and their stuff, which they could then sit on and block other people getting at. This amendment is fairly essential.

Lord Markham Portrait Lord Markham (Con)
- View Speech - Hansard - - - Excerpts

Like the noble Lord, Lord Clement-Jones, I am not going to try to better the excellent speech made by the noble Viscount, Lord Colville.

We debated at much length in Committee the definition of scientific research, as it will dictate the breadth of the consent exemption for data reuse. If it is too broad, it could allow data companies—I am thinking specifically of AI programs—to justify data scraping without obtaining consent, should they successfully argue that it constitutes scientific research. However, should we create too narrow a definition, we could stifle commercial research and innovation. This would be disastrous for economic growth and the UK science and technology sector, which is one of our most dynamic sectors and has the potential to become one of the most profitable. We should be looking to support and grow it, not hinder it. Finding the happy medium here is no small feat, but the amendment tabled by the noble Viscount, Lord Colville of Culross, goes a long way towards achieving this by threading the needle.

By requiring the research to be in the public interest to qualify for the consent exemption for data reuse, we will prevent companies cloaking purely commercial activities for their own ends in the guise of scientific research, while allowing commercial research which will benefit the general public.

This particularly chimes with my time as Health Minister, when we tried to ensure that we could bring the public with us on the use of their health data. We did a lot of focus groups on all of this, and we found that we could have very widespread—70%-plus—public support if we could demonstrate that there really was a medical research benefit from all of this. This amendment is very much in keeping with that. As I say, it threads the needle. That is why we will be strongly supporting the amendment tabled by the noble Viscount, Lord Colville, and we hope he is minded to put the matter to a Division.

Lord Vallance of Balham Portrait Lord Vallance of Balham (Lab)
- View Speech - Hansard - - - Excerpts

I am grateful to the noble Viscount, Lord Colville, for his amendment and his engagement on this matter. I fully agree with the importance of ensuring that the term “scientific research” is not abused. Clause 67 will help avoid the misuse of the term by introducing a test of whether the research could reasonably be described as scientific. By explicitly requiring a reasonableness test, which is a well-known part of law, the provision is narrowing not broadening the current position.

18:30
The Government believe the test is sufficiently robust to limit misuse of the term “scientific research”. For example, many activities related to marketing or direct product development would not meet the test to be reasonably described as scientific. However, it is important not to disqualify entire fields of activity, because there may be a minority that constitutes genuine scientific research. This is often the case in the development of new medicines, for example, which is why the test needs to be case by case.
The test will not operate alone. There is currently extensive guidance by the ICO on the meaning of “scientific research”. This includes a list of indicators of genuine scientific research and outlines the globally accepted Frascati definition of research. This ICO guidance should be considered when assessing the reasonableness of describing an activity as scientific research; it has to be in the context of the Frascati definition and the ICO’s guidance.
However, the Government’s view is that a requirement for all scientific researchers to undergo an additional formal process to demonstrate that their specific research project is in the public interest would, at best, be a significant and unnecessary burden on our world-class research community. At worst, it would have a chilling effect on research that would ultimately damage public benefit. Much research is driven by curiosity and understanding; defining the precise public benefit at the outset may not be easy or even possible. The public benefit arises many years later. That is the case with the scientific research that we see coming to fruition now: nobody could have known what it would be useful for.
A public interest test is currently a requirement for scientific researchers only in limited circumstances, where there is extra risk that justifies the burden, such as when processing sensitive data under the research condition in the DPA 2018 or undertaking specific public health research. Not all health research is covered, but a specific aspect is. There will be further constraints on researchers through the specific safeguards set out in Clause 85 and the wider requirements of the UK GDPR, such as fairness.
Several people have spoken on this quite passionately and I completely understand why we need to get it right. It is important that companies cannot get hold of data and use it for things that we do not want them to use it for, including marketing and other approaches that will potentially cause harm. In looking after that, we must be mindful not to damage one of our great success stories in this country—scientific research—for which we have unique datasets that are important to improve all sorts of aspects of life.
The Bill will also clear up existing misunderstandings by clarifying, in Clause 71, that a lawful ground is required for all reuse of personal data. That includes scientific research, so it would not be possible to reuse things for a different purpose, in any sphere.
I hope the noble Viscount is content to withdraw this amendment, given these reassurances and the concerns about a significant unintended consequence from going down this route.
Viscount Colville of Culross Portrait Viscount Colville of Culross (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I am grateful and impressed that the Minister has stepped into this controversial sphere of data management at such short notice. I wish his colleague, the noble Baroness, Lady Jones, a swift recovery.

I hope that noble Lords listened to the persuasive speeches that were given across the Benches, particularly from my noble friend Lady Kidron, with her warning about blurring the definition of scientific research. I am also grateful to the Opposition Benches for their support. I am glad that the noble Lord, Lord Markham, thinks that I am threading the needle between research and public trust.

I listened very carefully to the Minister’s response and understand that he is concerned by the heavy burden that this amendment would put on scientific research. I have listened to his explanation of the OECD Frascati principles, which define scientific research. I understand his concern that the rigorous task of demanding that new researchers have to pass a public interest test will stop many from going ahead with research. However, I repeat what I said in my opening speech: there has to be a balance between generating an AI revolution in this country and bringing the trust of the British people along with it. The public interest test is already available for restricted research in this field; I am simply asking for it to be extended to all scientific research.

I am glad that the reasonableness and lawfulness tests are built into Clause 67, but I ask for a test that I am sure most people would support—that the research should have a positive public benefit. On that note, I would like to seek the opinion of the House.

18:35

Division 4

Ayes: 258

Noes: 138

18:48
Clause 68: Consent to processing for the purposes of scientific research
Amendment 15
Moved by
15: Clause 68, page 76, line 16, at end insert—
“(e) the data subject is not a child.”
Member's explanatory statement
This amendment ensures the Bill maintains the high level of legal protection for children's data even when the protections offered to adults are lowered.
Baroness Kidron Portrait Baroness Kidron (CB)
- Hansard - - - Excerpts

My Lords, I rise to move Amendment 15 and to speak to Amendments 16, 20, 22, 27, 39, 45 and, briefly, government Amendment 40. Together, these amendments offer protections that children were afforded in the Data Protection Act 2018, which passed through this House, and they seek to fix some of the underperformance of the ICO in relation to children’s data.

Before we debate these amendments, it is perhaps worth the Government reflecting on the fact that survey after survey shows that the vast majority—indeed, almost all—of the UK population support stronger digital regulation in respect of children. In refusing to accept these amendments, or indeed to replace them with their own amendments to the same effect, the Government are throwing away one of the successes of the UK Parliament with their newfound enthusiasm for tech with fewer safeguards.

I repeat my belief that lowering data protections for adults is a regressive step for all of us, but for children it is a tragedy that puts them at greater risk of harm—a harm that we in this House have a proud record of seeking to mitigate. The amendments in my name and variously in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones, my noble friend Lord Russell and the noble Baroness, Lady Harding, are essential to preserving the UK’s commitment to child protection and privacy. As the House is well aware, there is cross-party support for child protection. While I will listen very carefully to the Minister, I too am prepared to test the opinion of the House if he has nothing to offer, and I will ask Labour colleagues to consider their responsibility to the nation’s children before they walk through the Lobby.

I will take the amendments out of numerical order, for the benefit of those who have not been following our proceedings. Amendment 22 creates a direct, unambiguous obligation on data processors and controllers to consider the central principles of the age-appropriate design code when processing children’s data. It acknowledges that children of different ages have different capacities and therefore may require different responses. Subsection (2) of the new clause it would insert addresses the concern expressed during the passage of the Bill and its predecessor that children should be shielded from the reduction in privacy protections that adults would experience under the Act when passed.

In the last few weeks, Meta has removed its moderators, and the once-lauded Twitter has become flooded with disinformation and abuse as a result of Elon Musk’s determined deregulation and support of untruth. We have seen the dial move in Romania’s presidential election via TikTok, a rise in scams and the horror of sexually explicit deepfakes, which we will discuss in a later group.

Public trust in both tech and politics is catastrophically low. While we may disagree on the extent to which adults deserve privacy and protection, there are few in this House or the other place who do not believe it is a duty of government to protect children. Amendment 22 simply makes it a requirement that those who control and process children’s data are directly accountable for considering and prioritising their needs. Amendment 39 does the same job in relation to the ICO, highlighting the need to consider that high bar of privacy to which children are entitled, which should be a focus of the commissioner when exercising its regulatory functions, with a particular emphasis on their age and development stage.

Despite Dame Elizabeth Denham’s early success in drafting the age-appropriate design code, the ICO’s track record on enforcement is poor and its leadership has not championed children by robustly enforcing the age-appropriate design code, or when faced with proposals that watered down child protections in this Bill and its predecessor. We will get to the question of the ICO next week, but I have been surprised by the amount of incoming mail expressing dissatisfaction with the regulator and calling on Parliament to demand more robust action. This amendment does exactly that in relation to children.

Government Amendment 40 would require the ICO, when exercising its functions, to consider the fact that children merit specific protections. I am grateful for and welcome this addition as far as it goes; but in light of the ICO’s disappointing track record, clearer and more robust guidance on its obligations is needed.

Moreover, the Government’s proposal is also insufficient because it creates a duty on the ICO only. It does nothing for the controllers and processors, as I have already set out in Amendment 22. It is essential that those who control and process children’s data are directly accountable for prioritising their needs. The consequences when they do not are visible in the anxiety, body dysmorphia and other developmental issues that children experience as a result of their time online.

The Government have usefully introduced an annual report of ICO activities and action. Amendment 45 simply requires the ICO to report the action it has taken specifically in relation to children, as a separate item. Creating better reporting is one of the advances the Government have made; making it possible to see what the ICO has done in regard to children is little more than housekeeping.

This group also includes clause-specific amendments, which are more targeted than Amendment 22. Amendment 15 excludes children from the impact of the proposal to widen the definition of scientific research in Clause 68. Given that we have just discussed this, I may reconsider that amendment. However, Amendment 16 excludes children from the “recognised legitimate interest” provisions in Clause 70. This means that data controllers would still be required to consider and protect children, as currently required under the legitimate interest basis for processing their data.

Amendment 20 excludes children from the new provisions in Clause 71 on purpose limitation. Purpose limitation is at the heart of GDPR. If you ask for a particular purpose and consent to it, extending that purpose is problematic. Amendment 21 ensures that, for children at least, the status quo of data protection law stays the same: that is to say, their personal data can be used only for the purpose for which it was originally collected. If the controller wants to use it in a different way, it must go back to the child—or, if they are under 13, their parent—to ask for further permission.

Finally, Amendment 27 ensures that significant decisions that impact children cannot be made during automated processes unless they are in a child’s best interest. This is a reasonable check and balance on the proposals in Clause 80.

In full, these amendments uphold our collective responsibility to support, protect and make allowances for children as they journey from infancy to adulthood. I met with the Minister and the Bill team, and I thank them for their time. They rightly made the point that children should be participants in the digital world, and I should not seek to exempt them. I suggest to the House that it is the other way round: I will not seek to exempt children if the Government do not seek to put them at risk.

Our responsibility to children is woven into the fabric of our laws, our culture and our behaviour. It has taken two decades to begin to weave childhood into the digital environment, and I am asking the House to make sure we do not take a single retrograde step. The Government have a decision to make. They can choose to please the CEOs of Silicon Valley in the hope that capitulation on regulatory standards will get us a data centre or two; or they can prioritise the best interests of UK children and agree to these amendments, which put children’s needs first. I beg to move.

Baroness Harding of Winscombe Portrait Baroness Harding of Winscombe (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I rise to support all the amendments in this group. I have added my name to Amendments 15, 22, 27 and 45. The only reason my name is not on the other amendments is that others got there before me. As is always the case in our debates on this topic, I do not need to repeat the arguments of the noble Baroness, Lady Kidron. I would just like to make a very high-level point.

19:00
In her last paragraph, the noble Baroness referenced the Government’s concern that we should not seek to exempt children from the digital world. It is really not difficult to encourage children to use the digital world. Any of us who have young children, or young grandchildren, know that in the blink of an eye children pick up a device and access anything they want. We have not got a problem with children accessing digital.
What we do have a very big problem with is how to protect them in that world. Those of us who have worked and campaigned on child internet safety for the last 15 years know how very hard it is to protect our children. I respect the Minister enormously, and I send my good wishes to the noble Baroness, Lady Jones, for a speedy recovery; I know they both care about the issue. However, those of us who have spent a lot of time working in this area have learned that you need to have the detail in the Bill. Many of us worked a decade ago on the age-appropriate design code, which, even though it was in a Bill, was incredibly hard to get implemented. We are already learning to our cost, in relation to the Online Safety Act, about issues where we were told in this Chamber that it did not matter whether they were on the face of the Bill because Ofcom would be able to sort them. We are now told that Ofcom does not have the powers because they are not on the face of the Act.
I urge the Minister to take on board the concern that I know he will hear from all sides of the House that we need substantially to strengthen this Bill’s protection for children, otherwise I fear that, in a year or two, the same group of us will be saying the same thing about another Bill and millions of children will still be unprotected.
Lord Russell of Liverpool Portrait Lord Russell of Liverpool (CB)
- View Speech - Hansard - - - Excerpts

My Lords, I have also put my name to most of the amendments. As with the noble Baroness, Lady Harding, the reason some of them do not have my name on them is that I arrived too late. Between them, she and my noble friend Lady Kidron have said everything that needs to be said, very powerfully. As one who has more recently become involved in a variety of Bills—the Policing and Crime Bill, the Online Safety Bill, and the Victims and Prisoners Bill—in every case trying to fight for and clarify children’s rights, I can say that it has been an uphill battle. But the reason we have been fighting is that we have lamentably failed to protect the interests of children for the past two decades as the world has changed around us. All of us who have children or grandchildren, nephews or nieces, or, like me, take part in the Learn with the Lords programme and go into schools, or who deal with mental health charities, are aware of the failure of government and regulators to take account of the effect that changing world would have on children.

In our attempts to codify and clarify in law what the dangers are and what needs to be put in place to try to prevent them, we have had an uphill struggle, regardless of the colour of government. In principle, everyone agrees. In practice, there is always a reason why it is too difficult—or, the easy way out is to say, “We will tell the regulator what our intent is, but we will leave it up to the regulator to decide”.

Our experience to date is that the clear will of Parliament when a Bill becomes an Act is not always taken on board by the regulator and made flesh when it comes to setting out the regulation. Unless it is in an Act, and it is made manifestly clear what the desired outcomes are in terms of the safety of children, the regulator—because it is difficult to do this well—will, not unreasonably, decide that if it is too difficult to do, it will settle for something that is not as good as it could be.

What we are trying to do with this set of amendments is to say to the Government up front, “We want this to be as effective as it possibly could be now”. We do not want to come back in two or three years’ time and rue the consequences of not being completely clear and of not putting a clear onus of responsibility on the regulators, because in another two or three years children will have seen important parts of their childhood deteriorate quite rapidly, with consequences that will stay with them for the rest of their lives.

Lord Stevenson of Balmacara Portrait Lord Stevenson of Balmacara (Lab)
- View Speech - Hansard - - - Excerpts

My Lords, I was one of those who was up even earlier than the noble Baroness, Lady Harding, and managed to get my name down on these amendments. It puts me in a rather difficult position to be part of the government party but to seek to change what the Government have arrived at as their sticking position in relation to this issue in particular—and indeed one or two others, but I have learned to live with those.

This one caught my eye in Committee. I felt suddenly, almost exactly as the noble Lord, Lord Russell, said, a sense of discontinuity in relation to what we thought was in the Government’s DNA—that is, bringing forward the right solution to the problems that we have been seeking to address in other Bills. With the then Online Safety Bill, we seemed to have an agreement around the House about what we wanted, but every time we put it back to the officials and people went away with it and came back with other versions, it got worse and not better. How children are dealt with, and how important it is to make sure that they are prioritised, appears to be one of those problems.

The amendments before us—and I have signed many of them, because I felt that we wanted to have a good and open debate about what we wanted here—do not need to be passed today. It seems to me that the two sides are, again, very close in what we want to achieve. I sensed from the excellent speech of the noble Baroness, Lady Kidron, that she has a very clear idea of what needs to go into this Bill to ensure that, at the very least, we do not diminish the sensible way in which we drafted the 2018 Bill. I was part of that process as well; I remember those debates very well. We got there because we hammered away at it until we found the right words that bridged the two sides. We got closer and closer together, but sometimes we had to go even beyond what the clerks would feel comfortable with in terms of government procedure to do that. We may be here again.

When he comes to respond, can the Minister commit to us today in this House that he will bring back at Third Reading a version of what he has put forward—which I think we all would say does not quite go far enough; it needs a bit more, but not that much more—to make it meet with where we currently are and where, guided by the noble Baroness, Lady Kidron, we should be in relation to the changing circumstances in both the external world and indeed in our regulator, which of course is going to go through a huge change as it reformulates itself? We have an opportunity, but there is also a danger that we do not take it. If we weaken ourselves now, we will not be in the right position in a few years’ time. I appeal to my noble friend to think carefully about how he might manage this process for the best benefit of all of us. The House, I am sure, is united about where we want to get to. The Bill does not get us there. Government Amendment 40 is too modest in its approach, but it does not need a lot to get it there. I think there is a way forward that we do not need to divide on. I hope the Minister will take the advice that has been given.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- View Speech - Hansard - - - Excerpts

My Lords, we have heard some of the really consistent advocates for children’s online protection today. I must say that I had not realised that the opportunity of signing the amendments of the noble Baroness, Lady Kidron, was rather like getting hold of Taylor Swift tickets—clearly, there was massive competition and rightly so. I pay tribute not only to the speakers today but in particular to the noble Baroness for all her campaigning, particularly with 5Rights, on online child protection.

All these amendments are important for protecting children’s data, because they address concerns about data misuse and the need for heightened protection for children in the digital environment, with enhanced oversight and accountability in the processing of children’s data. I shall not say very much. If the noble Baroness pushes Amendment 20 to a vote, I want to make sure that we have time before the dinner hour to do so, which means going through the next group very quickly. I very much hope that we will get a satisfactory answer from the Minister. The sage advice from the noble Lord, Lord Stevenson, hit the button exactly.

Amendment 20 is particularly important in this context. It seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A. As the noble Baroness explains, that means that personal data originally collected from a child with consent for a specific purpose could not be reused for a different, incompatible purpose without obtaining fresh consent, even if the child is now an adult. In my view, that is core. I hope the Minister will come back in the way that has been requested by the noble Lord, Lord Stevenson, so we do not have to have a vote. However, we will support the noble Baroness if she wishes to test the opinion of the House.

Viscount Camrose Portrait Viscount Camrose (Con)
- View Speech - Hansard - - - Excerpts

My Lords, I too thank the noble Baroness, Lady Kidron, for all her amendments in this group, and I thank the Minister for his amendment.

Amendment 15 seeks to maintain the high level of legal protection for children’s data even where protections for adults may be eased in the context of scientific research. I acknowledge the concerns raised about the potential implications that this amendment could have for medical research and safeguarding work. It is important to recognise that young people aged 16 and over are entitled to control their medical information under existing legal frameworks, reflecting their ability to understand and consent in specific contexts.

There is a legitimate concern that by excluding all children categorically, including those aged 16 and 17, we risk impeding critical medical research that could benefit young people themselves. Research into safeguarding may also be impacted by such an amendment. Studies that aim to improve systems for identifying and preventing abuse or neglect rely on the careful processing of children’s data. If this amendment were to inadvertently create a barrier to such vital work, we could find ourselves undermining some of the protections that it seeks to reinforce.

That said, the amendment highlights an important issue: the need to ensure that ethical safeguards for children remain robust and proportionate. There is no question that the rights and welfare of children should remain paramount in research contexts, but we must find the right balance—one that allows valuable, ethically conducted research to continue without eroding the legal protections that exist for children’s data. So I welcome the intent of the amendment in seeking to protect children, of course, and I urge us, as the noble Lord, Lord Stevenson, put it, to continue working collaboratively to achieve a framework that upholds their rights without hindering progress in areas that ultimately serve their best interests.

As with the previous amendment, I recognise the intent of Amendment 16, which seeks to protect children’s data by excluding them from the scope of recognised legitimate interests. Ensuring that children continue to benefit from the highest level of legal protection is a goal that, needless to say, we all share. However, I remain concerned that this could have less desirable consequences too, particularly in cases requiring urgent safeguarding action. There are scenarios where swift and proportionate data processing is critical to protecting a child at risk, and it is vital that the framework that we establish does not inadvertently create barriers to such essential work.

I am absolutely in support of Amendment 20. It provides an important safeguard by ensuring that children’s data is not used for purposes beyond those for which it was originally collected, unless it is fully compatible with the original purpose. Children are particularly vulnerable when it comes to data processing and their understanding of consent is limited. The amendment would strengthen protection for children by preventing the use of their data in ways that were not made clear to them or their guardians at the time of collection. It would ensure that children’s data remained secure and was not exploited for unrelated purposes.

On Amendment 22, the overarching duty proposed in this new clause—to prioritise children’s best interests and ensure that their data is handled with due care and attention—aligns with the objective that we all share of safeguarding children in the digital age. We also agree with the principle that the protections afforded to children’s data should not be undermined or reduced, and that those protections should remain consistent with existing standards under the UK GDPR.

However, although we support the intent of the amendment, we have concerns about the reference to the UN Convention on the Rights of the Child and general comment 25. Although these international frameworks are important, we do not believe they should be explicitly tied into this legislation. Our preference would be for a redraft of this provision that focused more directly on UK law and principles, ensuring that the protections for children’s data were robust and tailored to our legal context, rather than linking it to international standards in a way that could create potential ambiguities.

19:15
I support Amendment 27. It is an important amendment that would ensure that significant decisions impacting a child could not be made solely using automated decision-making unless those decisions were in the child’s best interests. This is a rather ingenious safeguard to ensure that children’s rights and welfare are fully considered in decisions that could affect them. The amendment would ensure that decisions made using automated processes could not be taken unless it was clear that they served the best interests of the child, taking into account their rights and development stage. The amendment would build on the principles already set out in the Data Protection Act 2018, reinforcing the need for extra protections for children.
I support the intent behind Amendment 39, which rightly recognises that children are entitled to a higher standard of protection regarding their personal data. We agree that children’s data requires special consideration at different stages of their development, as they may not fully understand the risks or consequences associated with the processing of their data. This principle is fundamental to safeguarding their rights. Again, though, while we support the overall intent of the amendment, we have concerns about the explicit reference to the UN Convention on the Rights of the Child and general comment 25, as per my comments on the previous amendment.
Amendment 40 rightly emphasises that children merit specific protection when it comes to their personal data, given their vulnerability and the fact that they may be less aware of the risks and consequences associated with such processing. I am reassured to see the Government taking steps to ensure the highest level of protection for children’s data, as that is essential to safeguarding their rights in an increasingly digital world. I support the spirit of the amendment but would characterise it as a minor technical adjustment to ensure clarity. It is certainly important that the Information Commissioner’s duties are clearly set out, and the amendment would help to reinforce the specific protections that children should receive in relation to their personal data.
We on these Benches support Amendment 45, which seeks to ensure that the Information Commissioner’s annual report clearly records activities and actions taken in relation to children’s data protection. That is an important step in enhancing transparency, accountability and understanding of how children’s data is being safeguarded under the regulatory framework. The inclusion of those specific details in the annual report would not only be beneficial for ensuring accountability but would also reinforce the commitment to prioritising children’s best interests. It would provide clarity on the actions taken by the ICO, fostering greater trust in the oversight and enforcement of data protection laws, particularly with respect to children.
Lord Vallance of Balham (Lab)

I will speak first to government Amendment 40, tabled in my name, concerning the ICO’s duty relating to children’s personal data. Before that, though, I thank the noble Lords, Lord Stevenson and Lord Russell, the noble Baroness, Lady Harding, and in particular the noble Baroness, Lady Kidron, for such considered debates on this incredibly important issue, both in today’s discussion in the House and in the meetings we have had together. Everyone here wants this to be effective and recognises that we must protect children.

The Government are firmly committed to maintaining high standards of protection for children, which is why they decided not to proceed with measures in the previous Data Protection and Digital Information Bill that would have reduced requirements for data protection impact assessments, prior consultation with the ICO and the designation of data protection officers. The ICO guidance is clear that organisations must complete an impact assessment in relation to any processing activity that uses children’s or other vulnerable people’s data for marketing purposes, profiling or other automated decision-making, or for offering online services directly to children.

The Government also expect organisations which provide online services likely to be accessed by children to continue to follow the standards on age-appropriate design set out in the children’s code. The noble Baroness, Lady Kidron, worked tirelessly to include those provisions in the Data Protection Act 2018 and the code continues to provide essential guidance for relevant online services on how to comply with the data protection principles in respect of children’s data. In addition to these existing provisions, Clause 90 already includes a requirement for the ICO to consider the rights and interests of children when carrying out its functions.

I appreciate the point that the noble Baroness made in Committee about the omission of the first 10 words of recital 38 from these provisions, and I am very happy to rectify this through government Amendment 40. The changes we are making to Clause 90 will require the Information Commissioner, when carrying out its regulatory functions, to consider, where relevant, the fact that children merit special protection with regard to their personal data. I hope noble Lords will support this government amendment.

Turning to Amendment 15 from the noble Baroness, Lady Kidron, which would exclude children’s data from Clause 68, I reassure her that neither the protections for adults nor those for children are being lowered. Clause 68 faithfully transposes the existing concept of giving consent to processing for an area of scientific research from the current recital. Consent must be freely given and be fully revocable at any point. While the research purpose initially identified may become more specific as the research progresses, this clause does not permit researchers to use the data for research that lies outside the original consent. As the noble Viscount, Lord Camrose, has highlighted, excluding children from Clause 68 could have a detrimental effect on health research in children and could unfairly disadvantage them. This is already an area of research that is difficult and underrepresented.

I know that the noble Baroness, Lady Kidron, cares deeply about this, but the fact is that making research in children more difficult cannot be right for them. If, for example, research on children with a particular type of cancer found something in those children that was relevant to another cancer, the amendment would preclude the use of that data. It is a risk to exempt children from this part of the Bill.

Amendment 16 would prevent data controllers from processing children’s data under the new recognised legitimate interests lawful ground. However, one of the main reasons this ground was introduced was to encourage organisations to process personal data speedily when there is a pressing need to do so for important purposes. This could be where there is a need to report a safeguarding concern or to prevent a crime being committed against a child. Excluding children’s data from the scope of the provision could therefore delay action being taken to protect some children—a point also made in the debate.

Amendment 20 aims to prohibit further processing of children’s personal data when it was collected under the consent lawful basis. The Government believe an individual’s consent should not be undermined, whether they are an adult or a child. This is why the Bill sets out that personal data should be used only for the purpose a person has consented to, apart from in situations that are in the public interest and authorised by law, or where further processing is needed to comply with the UK GDPR principles. Safeguarding children or vulnerable individuals is one of these situations. There may be cases where a child’s data is processed under consent by a social media company and information provided by the child raises serious safeguarding concerns. The social media company must be able to further process the child’s data to make safeguarding referrals when necessary. It is also important to note that these public interest exceptions apply only when the controller cannot reasonably be expected to obtain consent.

I know the noble Baroness, Lady Kidron, hoped that the Government might also introduce amendments to require data controllers to apply a higher standard of protection to children’s data than to adults’. The Government have considered Amendment 22 carefully, but requiring all data controllers to identify whether any of the personal data they hold relates to children, and to apply a higher standard to it, would place disproportionate burdens on small businesses and other organisations that currently have no way of differentiating age groups.

Although we cannot pursue this amendment as drafted, my understanding from the very helpful conversations that I have had with the noble Baroness, Lady Kidron, is that she intended this amendment to be aimed at online services directed at or likely to be accessed by children, not at every public body, business or third sector organisation that might process children’s data from time to time.

I reassure noble Lords that the Government are open to exploring a more targeted approach that focuses on those services that the noble Baroness is most concerned about. The age-appropriate design code already applies to such services, and we are very open to exploring what further measures could be beneficial to strengthen protection for children’s data. This point was eloquently raised by the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Stevenson, and it is a conversation that we would like to continue. Combined with the steps we are taking in relation to the new ICO duty, which will influence the support and guidance it provides for organisations, we believe this could drive better rates of compliance. I would be very pleased to work with all noble Lords who have spoken on this to try to get it into the right place.

I turn to Amendment 27, tabled by the noble Baroness, Lady Kidron. I agree with her on the importance of protecting children’s rights and interests when undertaking solely automated decision-making. However, we think this amendment, as currently drafted, would cause operational confusion as to when solely automated decision-making can be carried out. Compliance with the reformed Article 22 and the wider data protection legislation will ensure high standards of protection for adults and children alike, and that is what we should pursue.

I now turn to Amendment 39, which would replace the ICO’s children’s duty, and for which I again thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Russell. As a public body, the ICO must adhere to the UK’s commitment to the UN Convention on the Rights of the Child, and we respectfully submit that it is unnecessary to add further wording of this nature to the ICO’s duty. We believe that government Amendment 40, coupled with the ICO’s principal objective to secure an appropriate level of protection, takes account of the fact that the needs of children might not always look the same.

Finally, to address Amendment 45, the Government believe that the Bill already delivers on this aim. While the new annual regulatory action report in Clause 101 will not break down the activity that relates to children, it does cover all the ICO’s regulatory activity, including that taken to uphold the rights of children. This will deliver greater transparency and accountability on the ICO’s actions. Furthermore, Clause 90 requires the ICO to set out in its annual report how it has complied with its statutory duties. This includes the new duty relating to children.

To conclude, I hope that the amendment we tabled today and the responses I have set out reassure noble Lords of our commitment to protect children’s data. I ask noble Lords to support the amendment tabled in my name, and hope that the noble Baroness, Lady Kidron, feels content to withdraw her own.

Baroness Kidron (CB)

Before the Minister sits down, I have some things to say about his words. I did not hear: “agree to bring forward a government amendment at Third Reading”. Those are the magic words that would help us get out of this situation. I have tried to suggest several times that the Government bring forward their own amendment at Third Reading, drafted in a manner that would satisfy the whole House, with the words of the noble Viscount, Lord Camrose, incorporated, along with the things that are fundamental.

I very much admire the Minister and enjoy seeing him in his place but I say to him that we have been round this a few times now and a lot of those amendments, while rather nerdy in their obsession, are based on lived experience of trying to hold the regulator and the companies to account for the law that we have already passed. I am seeking those magic words before the Minister sits down.

Lord Vallance of Balham (Lab)

I have likewise enjoyed working with the noble Baroness. As has been said several times, we are all working towards the same thing, which is to protect children. The age-appropriate design code has been a success in that regard. That is why we are open to exploring what further measures can be put in place in relation to the ICO duty, which can help influence and support the guidance to get that into the right place. That is what I would be more than happy to work on with the noble Baroness and others to make sure that we get it right.

19:30
Lord Stevenson of Balmacara (Lab)

I am presuming a little here that the Minister’s lack of experience in the procedures of the House is holding him back, but I know he is getting some advice from his left. The key thing is that we will not be able to discuss this again in this House unless he agrees that he will bring forward an amendment. We do not have to specify today what that amendment will be. It might not be satisfactory, and we might have to vote against it anyway. But the key is that he has to say this now, and the clerk has to nod in agreement that he has covered the ground properly.

We have done this before on a number of other Bills, so we know the rules. If the Minister can do that, we can have the conversations he is talking about. We have just heard the noble Baroness, Lady Kidron, explain in a very graceful way that this will be from a blank sheet of paper so that we can build something that will command the consensus of the House. We did it on the Online Safety Bill; we can do it here. Please will he say those words?

Lord Vallance of Balham (Lab)

I am advised that I should say that I am happy for the amendment to be brought forward, but not as a government amendment. We are happy to hear an amendment from the noble Baroness at Third Reading.

Lord Stevenson of Balmacara (Lab)

Let us be quite clear about this. It does not have to be a government amendment, but the Government Minister has to agree that it can be brought forward.

Lord Clement-Jones (LD)

We take that as a yes.

Lord Stevenson of Balmacara (Lab)

This is a self-governing House.

Baroness Kidron (CB)

I thank the Minister for that very generous offer. I also thank the noble Lord, Lord Stevenson, for his incredible support. I note that, coming from the Government Benches, that is a very difficult thing to do, and I really appreciate it. On the basis that we are to have an amendment at Third Reading, whether written by me with government and opposition help or by the Government, which will address these fundamental concerns set out by noble Lords, I will not press this amendment today.

These are not small matters. The implementation of the age-appropriate design code depends on some of the things being resolved in the Bill. There is no equality of arms here. A child, whether five or 15, is no match for the billions of dollars spent hijacking their attention, their self-esteem and their body. We have to, in these moments as a House, choose David over Goliath. I thank the Minister and all the supporters in this House—the “Lords tech team”, as we have been called in the press. With that, I beg leave to withdraw the amendment.

Amendment 15 withdrawn.
Clause 70: Lawfulness of processing
Amendment 16 not moved.
19:34
Consideration on Report adjourned until not before 8.14 pm.