Data (Use and Access) Bill [HL] Debate
Lords Chamber
My Lords, last week the Government published the AI Opportunities Action Plan and confirmed that they have accepted or partially accepted all 50 of the recommendations from the report’s author, Matt Clifford. Reading the report, there can be no doubting the Government’s commitment to making the UK a welcoming environment for AI companies. What is less clear is how creating the infrastructure and skills pool needed for AI companies to thrive will lead to economic and social benefits for UK citizens.
I am aware that the Government have already said that they will provide further details to flesh out the top-level commitments, including policy and legislative changes, over the coming months. I reiterate the point made by many noble Lords in Committee: if data is the ultimate fuel and infrastructure on which AI is built, why, given that we have a new Government, is the data Bill going through the House without all the strategic pieces in place? This is a Bill flying blind.
Amendment 1 is very modest and would ensure that information that traders were required to provide to customers on goods, services and digital content included information that had been created using AI to build a profile about them. This is necessary because the data that companies hold about us is already a combination of information proffered by us and information inferred, increasingly, by AI. This amendment would simply ensure that all customer data—our likes and dislikes, buying habits, product uses and so on—was disclosable, whether provided by us or a guesstimate by AI.
The Government’s recent statements have promised to “mainline AI into the veins” of the nation. If AI were a drug, its design and deployment would be subject to governance and oversight to ensure its safety and efficacy. Equally, they have said that they will “unleash” AI into our public services, communities and business. If the rhetoric also included commitments to understand and manage the well-established risks of AI, the public might feel more inclined to trust both AI and the Government.
The issue of how the data Bill fails to address AI—and how the AI Opportunities Action Plan, and the Government’s response to it, fail to protect UK citizens, children, the creative industries and so on—will be a theme throughout Report. For now, I hope that the Government can find their way to agreeing that AI-generated content that forms part of a customer’s profile should be considered personal data for the purposes of defining business and customer data. I beg to move.
My Lords, this is clearly box-office material, as ever.
I support Amendment 1 tabled by the noble Baroness, Lady Kidron, on inferred data. Like her, I regret that we do not have this Bill flying in tandem with an AI Bill. As she said, data and AI go together, and we need to see the two together in context. However, inferred data has its own dangers: inaccuracy and what are called junk inferences; discrimination and unfair treatment; invasions of privacy; a lack of transparency; security risks; predatory targeting; and a loss of anonymity. These dangers highlight the need for strong data privacy protection for consumers in smart data schemes and more transparent data collection practices.
Noble Lords will remember that Cambridge Analytica dealt extensively with inferred data. That company used various data sources to create detailed psychological profiles of individuals, going far beyond the information that users explicitly provided. I will not go into the complete history, but, frankly, we do not want to repeat that. Without safeguards, the development of AI technologies could lead to a lack of public trust, as the noble Baroness said, and indeed to a backlash against the use of AI, which could hinder the Government’s ambitions to make the UK an AI superpower. I do not like that kind of boosterish language—some of the Government’s statements perhaps could have been written by Boris Johnson—but nevertheless the ambition to put the UK on the AI map, and to keep it there, is a worthy one. This kind of safeguard is therefore extremely important in that context.
I start by thanking the noble Baroness, Lady Kidron, for introducing this group. I will speak particularly to the amendment in my name but before I do so, I want to say how much I agree with the noble Baroness and with the noble Lord, Lord Clement-Jones, that it is a matter of regret that we are not simultaneously looking at an AI Bill. I worry that this Bill has to take a lot of the weight that an AI Bill would otherwise take, but we will come to that in a great deal more detail in later groups.
I will address the two amendments in this group in reverse order. Amendment 5 in my name and that of my noble friend Lord Markham would remove Clause 13, which makes provision for the Secretary of State or the Treasury to give financial assistance to decision-makers and enforcers—that is, in essence, to act as a financial backstop. While I appreciate the necessity of guaranteeing the stability of enforcers that are public authorities and therefore branches of the state, I am concerned that this has been extended to decision-makers. The Bill does not make the identity of a decision-maker clear. Therefore, I wonder who exactly we are protecting here. Unless those individuals, bodies or organisations can be clearly defined, how can we know whether we should extend financial assistance to them?
I raised these concerns in Committee, and the Minister assured us at that time that smart data schemes should be self-financing through fees and levies, as set out in Clauses 11 and 12, and that this provision is therefore a back-up plan. If that is indeed the case and we are assured of the self-funding nature of smart data schemes, then what exactly makes this necessary? Why must the statutory spending authority act as a backstop if we do not believe there is a risk it will be needed? If we do think there is such a risk, can the Minister elaborate on what it is?
I turn now to the amendment tabled by the noble Baroness, Lady Kidron, which would require data traders to supply customers with information that has been used by AI to build a profile on them. While transparency and explainability are hugely important, I worry that the mechanism proposed here will be too burdensome. The burden would grow linearly with the scale of the models used. Collating and supplying this information would, I fear, increase the cost of doing business for traders. Given AI’s potential to be an immense asset to business, helping generate billions of pounds for the UK economy—and, by the way, I rather approve of the boosterish tone and think we should strive for a great deal more growth in the economy—we should not seek to make its use more administratively burdensome for business. Furthermore, since the information is AI-generated, it is going to be a guess, an assumption or an inference. Therefore, should we require companies to disclose not just the input data but the intermediate and final outputs? Speaking as a consumer, I am not sure that I personally would welcome this. I look forward to hearing the Minister’s responses.
My Lords, the noble Baroness, Lady Kidron, is setting a cracking pace this afternoon, and I am delighted to support her amendments and speak to them. Citizens should have the clear right to assign their data to data communities or trusts, which act as intermediaries between those who hold data and those who wish to use it, and are designed to ensure that data is shared in a fair, safe and equitable manner.
A great range of bodies have explored and support data communities and data trusts. There is considerable pedigree behind the proposals that the noble Baroness has put forward today, starting with a recommendation of the Hall-Pesenti review. We then had the Royal Society and the British Academy talking about data stewardship; the Ada Lovelace Institute has explored legal mechanisms for data stewardship, including data trusts; the Open Data Institute has been actively researching and piloting data trusts in the real world; the Alan Turing Institute has co-hosted a workshop exploring data trusts; and the Royal Society of Arts has conducted citizens’ juries on AI explainability and explored the use of data trusts for community engagement and outreach.
There are many reasons why data communities are so important. They can help empower individuals, give them more control over their data and ensure that it is used responsibly; they can increase bargaining power, reduce transaction costs, address the complexity of data law and protect individual rights; and they can promote innovation by facilitating data-sharing and the development of new products and services. We need to ensure responsible operation and build trust in data communities. As proposed by Amendment 43 in particular, we should establish a register of data communities overseen by the ICO, along with a code of conduct and complaint mechanisms, as proposed by Amendment 42.
It is high time we moved forward on this; we need positive steps. In the words of the noble Baroness, Lady Kidron, we do not just seek assurance that there is nothing to prevent these data communities; we need to take positive steps and put mechanisms in place to make sure that we can set them up and benefit from them.
I thank the noble Baroness, Lady Kidron, for leading on this group, and the noble Lord, Lord Clement-Jones, for his valuable comments on these important structures of data communities. Amendments 2, 3, 4 and 25 work in tandem and are designed to enable data communities, meaning associations of individuals who have come together and wish to designate a third party to act on the group’s behalf in their data use.
There is no doubt that the concept of a data community is a powerful idea that can drive innovation and a great deal of value. I thank the noble Lord, Lord Clement-Jones, for cataloguing the many groups that have driven powerful thinking in this area, the value of which is very clear. However—and I keep coming back to this when we discuss this idea—what prevents this being done already? I realise that this may be a comparatively trivial example, but if I wanted to organise a community today to oppose a local development, could I not do so with an existing lawful basis for data processing? It is still not clear in what way these amendments would improve my ability to do so, or would reduce my administrative burden or the risks of data misuse.
I look forward to hearing more about this from the Minister today and, ideally, as the noble Baroness, Lady Kidron, said, in a briefing on the Government’s plan to drive this forward. However, I remain concerned that we do not necessarily need to drive forward this mechanism by passing new legislation. I look forward to the Minister’s comments.
Amendment 42 would require the Information Commissioner to draw up a code of practice setting out how data communities must operate and how data controllers and processors should engage with these communities. Amendment 43 would create a register of data communities and additional responsibilities for the data community controller. I appreciate the intent of the noble Baroness, Lady Kidron, in trying to ensure data security and transparency in the operation of data communities. If we on these Benches supported the idea of their creation in this Bill, we would surely have to implement mechanisms of the type proposed in these amendments. However, this observation confirms us in our view that the administration required to operate these communities is starting to look rather burdensome. We should be looking to encourage the use of data to generate economic growth and to make people’s lives easier. I am concerned that the regulation of data communities, were it to proceed as envisaged by these amendments, might risk doing just the opposite. That said, I will listen with interest to the response of noble Lords and the Minister.
My understanding is that “customer” reflects an individual, but I am sure that the Minister will give a better explanation at the meeting with officials next week.
Again, before the Minister sits down—I am sure he will not be able to sit down for long—would he open that invitation to a slightly wider group?
I thank the noble Lord for that request, and I am sure my officials would be willing to do that.
My Lords, I support my noble friend. I have a confession to make. Before this Bill came up, I foolishly thought that sex and gender were the same thing. I have discovered that they are not. Gender is not a characteristic defined in UK law. I believe that you are born with a biological sex, male or female, and that some people will choose, or need, to have a gender reassignment or to identify as a different gender. I thank the charity Sex Matters, which works to provide clarity on this issue of sex in law.
As my noble friend Lord Lucas said, the digital verification system currently operates on the basis of chosen gender, not of sex at birth. You can change your records on request without even having a gender recognition certificate. That means that, over the last five years, at least 3,000 people have changed their passports to show the wrong sex. Over the last six years, at least 15,000 people have changed their driving licences. The NHS has no record of how many people now have a different sex recorded from the one they had at birth. It is thought that perhaps 100,000 people have one sex indicated in one record and a different sex in another. We cannot go on like that.
The consequences of this are really concerning. It means people with mismatched identities risk being flagged up as a synthetic identity risk. It means authorities with statutory safeguarding responsibilities will not be able to assess the risk that they are trying to deal with. It means that illnesses may be misdiagnosed and treatments misprescribed if the wrong sex is stated in someone’s medical records. The police will be unable to identify people if they are looking in the wrong records. Disclosure and Barring Service checks may fail to match individuals with the wrong sex. I hope that the Government will look again at correcting this. It is a really important issue.
My Lords, I will speak to Amendments 7 and 9. Amendment 7 would require the Secretary of State to lay the DVS trust framework before Parliament. Given the volume of sensitive data that digital ID providers will be handling, it is crucial for Parliament to oversee the framework rules governing digital verification service providers.
The amendment is essentially one that was tabled in Committee by the noble Viscount, Lord Camrose. I thought that he expressed this well in Committee, emphasising that such a fundamental framework demands parliamentary approval for transparency and accountability, regardless of the document’s complexity. This is an important framework with implications for data privacy and security, and should not be left solely to the discretion of the Secretary of State.
The DPRRC, in its ninth report, and the Constitution Committee, in its third report of the Session, also believed that the DVS trust framework should be subject to parliamentary scrutiny. The former did so because the framework has legislative effect, and it recommended using the affirmative procedure, which would require Parliament to actively approve the framework, since the Secretary of State otherwise has significant power without adequate parliamentary involvement. The latter committee, the Constitution Committee, said:
“We reiterate our statement from our report on the Data Protection and Digital Information Bill that ‘[d]ata protection is a matter of great importance in maintaining a relationship of trust between the state and the individual. Access to personal data is beneficial to the provision of services by the state and assists in protecting national security. However, the processing of personal data affects individual rights, including the right to respect for private life and the right to freedom of expression. It is important that the power to process personal data does not become so broad as to unduly limit those rights’”.
Those views are entirely consistent with the committee’s earlier stance on a similar provision in the previous Data Protection and Digital Information Bill. That was why it was so splendid that the noble Viscount tabled that amendment in Committee. It was like a Damascene conversion.
The noble Baroness, Lady Jones, argued in Committee and in correspondence that the trust framework is a highly technical document that Parliament might find difficult to understand. That is a bit of a red rag to a bull. However, this argument fails to address the core concerns about democratic oversight. The framework aims to establish a trusted digital identity marketplace by setting requirements for providers to gain certification as trusted providers.
I am extremely grateful to the Minister, the Bill team and the department for allowing officials to give the noble Viscount, Lord Camrose, and me a tutorial on the trust framework. It depends heavily on being voluntary in nature, with the UK Accreditation Service essentially overseeing the certifiers, such as BSI, Kantara and the Age Check Certification Scheme, which certify the providers, with ISO 17065 as the governing standard.
Compliance is assured through the certification process, where services are assessed against the framework rules by independent conformity assessment bodies accredited by the UK Accreditation Service. The trust framework establishes rules and standards for digital identity verification, but it does not directly contain specific provision for regulatory oversight, or for redress mechanisms such as a specific ombudsman service, industry-led dispute resolution, set contract terms for consumer redress or enforcement powers. The Government say, however, that they intend to monitor the types of complaints received. Ultimately, the scope of the framework is limited to the rules providers must follow in order to remain certificated, and it does not address governance matters.
Periodic certification alone is not enough to ensure ongoing compliance, and this highlights the lack of an independent mechanism to hold the Secretary of State accountable. The noble Baroness, Lady Jones, stated in Committee that the Government preferred a light-touch approach to regulating digital verification services. She believed that excessive parliamentary scrutiny would hinder innovation and flexibility in this rapidly evolving sector.
The Government have consistently emphasised that they have no plans to introduce mandatory digital IDs or ID cards. The focus is on creating a secure and trusted system that gives citizens more choice and control over their data. The attributes trust framework is a crucial step towards achieving the goal of a secure, trusted and innovative digital identity market—all the more reason to get the approval process right.
These services will inevitably be high-profile. Digital ID is a sensitive area which potentially also involves age verification. These services could have a major impact on data privacy and security. Public debate on such a critical issue is crucial to build trust and confidence in these systems. Laying the DVS trust framework before Parliament would allow for a wider range of voices and perspectives to be heard, ensuring a more robust and democratic approval process.
I thank the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Arbuthnot, for their amendments and their interest in the important area of digital verification services. I thank the noble Viscount, Lord Camrose, for his support and for recognising how important these services are in making life easier for people.
I will go in reverse order and start with Amendment 9. I thank the noble Lord, Lord Clement-Jones, for reconsidering his stance since Committee on the outright creation of these offences. Amendment 9 would create an obligation for the Secretary of State to review the need for digital identity theft offences. We believe this would be unnecessary, as existing legislation—for example, the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018—already addresses the behaviour targeted by this amendment.
However, we note the concerns raised and confirm that the Government are taking steps to tackle the issue. First, the Action Fraud service, which allows individuals to report fraud enabled by identity theft, is being upgraded with improved reporting tools, increased intelligence flows to police forces and better support services for victims. Secondly, the Home Office is reviewing the training offered to police officers who have to respond to fraud incidents, and identifying the improvements needed.
I am sorry to interrupt the Minister. He is equating digital identity theft to fraud, and that is not always the case. Is that the advice that he has received?
The advice is that digital identity theft would be captured by those Acts. Therefore, there is no need for a specific offence. However, as I said, the Government are taking steps to tackle this and will support the Action Fraud service as a way to deal with it, even though I agree that not everything falls under that classification as fraud.
I am sorry to interrupt the Minister again, but could he therefore confirm that, by reiterating his previous view that the Secretary of State should not have to bring the framework to Parliament, he disagrees with both the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, both of which made the same point on this occasion and on the previous Bill—that Parliament should look at the trust framework?
For the reasons that I have given, I think that the trust framework is a technical document and one best dealt with in this technical form. It is built on other assurance processes, with the United Kingdom Accreditation Service overseeing the conformity accreditation bodies that will test the digital verification services. In this case, our view is that it does not need to come under parliamentary scrutiny.
On Amendments 6 and 8 from the noble Lord, Lord Lucas, I am absolutely behind the notion that the validity of the data is critical. We have to get this right. Of course, the Bill itself takes the data from other sources, and those sources have authority to get the information correct, but it is important, for a digital service in particular, that this is dealt with very carefully and that we have good assurance processes.
On the specific point about gender identity, the Bill does not create or prescribe new ways in which to determine that, but work is ongoing to try to ensure that there is consistency and accuracy. The Central Digital and Data Office has started to progress work on developing data standards for key entities and their attributes, to ensure that the way data is organised, stored and shared is consistent between public authorities. Work has also commenced via the domain expert group on the person entity, which has representation from the Home Office, HMRC, the Office for National Statistics and—importantly—NHS England, as well as the Department for Education, the Ministry of Justice, the Local Government Association and the Police Digital Service. The group has been established as a pilot under the Data Standards Authority to help to ensure consistency across organisations, and specific pieces of work are going on relating to gender in that area.
The measures in Part 2 are intended to help secure the reliability of the process through which citizens can verify their identity digitally. They do not intervene in how government departments record and store identity data. In clarifying this important distinction, and with reference to the further information I will set out, I cannot support the amendments.
My Lords, I support the conclusions of the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, and I beg leave to seek the opinion of the House.
My Lords, Amendments 10 and 12 seek to amend Clauses 56 and 58, which form part of the national underground asset register provisions. These two minor, technical amendments address a duplicate reference to “the undertaker’s employees” and replace it with the correct reference to “the contractor’s employees”. I reassure noble Lords that the amendments do not have a material policy effect and are intended to correct the drafting. I beg to move.
My Lords, I thank the Minister for these two technical amendments. I take this opportunity to thank him also for responding to correspondence about LinesearchbeforeUdig and its wish to meet the Government and work with existing services to deliver what it describes as the safe digging elements of the NUAR. The Minister has confirmed that the heavy lifting on this—not heavy digging—will be carried out by the noble Baroness, Lady Jones, on her return, which I am sure she will look forward to. As I understand it, officials will meet LinesearchbeforeUdig this week, and they will look at the survey carried out by the service. We have made some progress since Committee, and I am grateful to the Minister for that.
My Lords, given that these are technical amendments, correcting wording errors, I have little to add to the remarks already made. We have no concerns about these amendments and will not seek to oppose the Government in making these changes.
My Lords, I support my noble friend Lord Colville. He has made an excellent argument, and I ask noble Lords on the Government Benches to think about it very carefully. If it is good enough for health data, it is good enough for the rest of science. In the interest of time, I will give an example of one of the issues, rather than repeat the excellent argument made by my noble friend.
In Committee, I asked the Government three times whether the cover of scientific research could be used, for example, to market-test ways to hack human responses to dopamine in order to keep children online. In the Minister’s letter, written during Committee, she could not say that the A/B testing of millions of children to make services more sticky—that is, more addictive—would not be considered scientific, but rather that the regulator, the ICO, could decide on a case-by-case basis. That is not good enough.
There is no greater argument for my noble friend Lord Colville’s amendment than the fact that the Government are unable to say if hacking children’s attention for commercial gain is scientific or not. We will come to children and child protection in the Bill in the next group, but it is alarming that the Government feel able to put in writing that this is an open question. That is not what Labour believed in opposition, and it is beyond disappointing that, now in government, Labour has forgotten what it then believed. I will be following my noble friend through the Lobby.
My Lords, it is almost impossible to better the arguments put forward by the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, so I am not even going to try.
The inclusion of a public interest requirement would ensure that the use of data for scientific research would serve a genuine societal benefit, rather than primarily benefiting private interests. This would help safeguard against the misuse of data for purely commercial purposes under the guise of research. The debate in Committee highlighted the need for further clarity and stronger safeguards in the Bill, to ensure that data for scientific research genuinely serves the public interest, particularly concerning the sensitive data of children. The call for a public interest requirement reflects the desire to ensure a balance between promoting research and innovation and upholding the rights and interests of data subjects. I very much hope that the House will support this amendment.
My Lords, we are playing a bit of Jack-in-the-box. When I was being taught law by a wonderful person from Gray’s Inn who was responsible for drafting the constitution for Uganda’s independence, Sir Dingle Foot, he used a phrase which struck me, and which has always stayed with me: law is a statement of public policy. The noble Viscount, Lord Colville, seeks to ensure that, if there is to be scientific work, it is conducted “in the public interest”. Law does not express itself for its own sake; it speaks for the public, as public policy. It would be a wonderful phrase to include, and I hope the Minister will accept it so that we do not have to vote on it.
My Lords, I was one of those who was up even earlier than the noble Baroness, Lady Harding, and managed to get my name down on these amendments. It puts me in a rather difficult position to be part of the government party but to seek to change what the Government have arrived at as their sticking position in relation to this issue in particular—and indeed one or two others, but I have learned to live with those.
This one caught my eye in Committee. I felt suddenly, almost exactly as the noble Lord, Lord Russell, said, a sense of discontinuity in relation to what we thought was in the Government’s DNA—that is, to bring forward the right solution to the problems that we have been seeking to address in other Bills. With the then Online Safety Bill, we seemed to have an agreement around the House about what we wanted, but every time we put it back to the officials, and people went away with it and came back with other versions, it got worse and not better. How children are dealt with, and how important it is to make sure that they are prioritised, appears to be one of those problems.
The amendments before us—and I have signed many of them, because I felt that we wanted to have a good and open debate about what we wanted here—do not need to be passed today. It seems to me that the two sides are, again, very close in what we want to achieve. I sensed from the excellent speech of the noble Baroness, Lady Kidron, that she has a very clear idea of what needs to go into this Bill to ensure that, at the very least, we do not diminish the sensible approach we took in drafting the 2018 Bill. I was part of that process as well; I remember those debates very well. We got there because we hammered away at it until we found the right words to bridge the two sides. We got closer and closer together, but sometimes we had to go even beyond what the clerks would feel comfortable with in terms of government procedure to do that. We may be here again.
When he comes to respond, can the Minister commit to us today in this House that he will bring back at Third Reading a version of what he has put forward—which I think we would all say does not quite go far enough; it needs a bit more, but not that much more—to bring it into line with where we currently are and where, guided by the noble Baroness, Lady Kidron, we should be in relation to the changing circumstances in both the external world and indeed in our regulator, which of course is going to go through a huge change as it reformulates itself? We have an opportunity, but there is also a danger that we do not take it. If we weaken ourselves now, we will not be in the right position in a few years’ time. I appeal to my noble friend to think carefully about how he might manage this process for the best benefit of all of us. The House, I am sure, is united about where we want to get to. The Bill does not get us there. Government Amendment 18 is too modest in its approach, but it does not need a lot to get it there. I think there is a way forward that we do not need to divide on. I hope the Minister will take the advice that has been given.
My Lords, we have heard some of the really consistent advocates for children’s online protection today. I must say that I had not realised that the opportunity of signing the amendments of the noble Baroness, Lady Kidron, was rather like getting hold of Taylor Swift tickets—clearly, there was massive competition and rightly so. I pay tribute not only to the speakers today but in particular to the noble Baroness for all her campaigning, particularly with 5Rights, on online child protection.
All these amendments are important for protecting children’s data, because they address concerns about data misuse and the need for heightened protection for children in the digital environment, with enhanced oversight and accountability in the processing of children’s data. I shall not say very much. If the noble Baroness pushes Amendment 20 to a vote, I want to make sure that we have time before the dinner hour to do so, which means going through the next group very quickly. I very much hope that we will get a satisfactory answer from the Minister. The sage advice from the noble Lord, Lord Stevenson, hit the button exactly.
Amendment 20 is particularly important in this context. It seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A. As the noble Baroness explained, that means that personal data originally collected from a child with consent for a specific purpose could not be reused for a different, incompatible purpose without obtaining fresh consent, even if the child is now an adult. In my view, that is core. I hope the Minister will come back in the way that has been requested by the noble Lord, Lord Stevenson, so that we do not have to have a vote. However, we will support the noble Baroness if she wishes to test the opinion of the House.
My Lords, I too thank the noble Baroness, Lady Kidron, for all her amendments in this group, and I thank the Minister for his amendment.
Amendment 15 seeks to maintain the high level of legal protection for children’s data even where protections for adults may be eased in the context of scientific research. I acknowledge the concerns raised about the potential implications that this amendment could have for medical research and safeguarding work. It is important to recognise that young people aged 16 and over are entitled to control their medical information under existing legal frameworks, reflecting their ability to understand and consent in specific contexts.
There is a legitimate concern that by excluding all children categorically, including those aged 16 and 17, we risk impeding critical medical research that could benefit young people themselves. Research into safeguarding may also be impacted by such an amendment. Studies that aim to improve systems for identifying and preventing abuse or neglect rely on the careful processing of children’s data. If this amendment were to inadvertently create a barrier to such vital work, we could find ourselves undermining some of the protections that it seeks to reinforce.
That said, the amendment highlights an important issue: the need to ensure that ethical safeguards for children remain robust and proportionate. There is no question that the rights and welfare of children should remain paramount in research contexts, but we must find the right balance—one that allows valuable, ethically conducted research to continue without eroding the legal protections that exist for children’s data. So I welcome the intent of the amendment in seeking to protect children, of course, and I urge us, as the noble Lord, Lord Stevenson, put it, to continue working collaboratively to achieve a framework that upholds their rights without hindering progress in areas that ultimately serve their best interests.
As with the previous amendment, I recognise the intent of Amendment 16, which seeks to protect children’s data by excluding them from the scope of recognised legitimate interests. Ensuring that children continue to benefit from the highest level of legal protection is a goal that, needless to say, we all share. However, I remain concerned that this could have less desirable consequences too, particularly in cases requiring urgent safeguarding action. There are scenarios where swift and proportionate data processing is critical to protecting a child at risk, and it is vital that the framework that we establish does not inadvertently create barriers to such essential work.
I am absolutely in support of Amendment 20. It provides an important safeguard by ensuring that children’s data is not used for purposes beyond those for which it was originally collected, unless it is fully compatible with the original purpose. Children are particularly vulnerable when it comes to data processing and their understanding of consent is limited. The amendment would strengthen protection for children by preventing the use of their data in ways that were not made clear to them or their guardians at the time of collection. It would ensure that children’s data remained secure and was not exploited for unrelated purposes.
On Amendment 22, the overarching duty proposed in this new clause—to prioritise children’s best interests and ensure that their data is handled with due care and attention—aligns with the objective that we all share of safeguarding children in the digital age. We also agree with the principle that the protections afforded to children’s data should not be undermined or reduced, and that those protections should remain consistent with existing standards under the UK GDPR.
However, although we support the intent of the amendment, we have concerns about the reference to the UN Convention on the Rights of the Child and general comment 25. Although these international frameworks are important, we do not believe they should be explicitly tied into this legislation. Our preference would be for a redraft of this provision that focused more directly on UK law and principles, ensuring that the protections for children’s data were robust and tailored to our legal context, rather than linking it to international standards in a way that could create potential ambiguities.