My Lords, it is a pleasure to open the second day on Report on the Data (Use and Access) Bill. In doing so, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business. In moving Amendment 38 in my name, I will not speak to any other amendments in this group.
Amendment 38 goes to the heart of the issue du jour: regulators have seldom been so much in the press and in the public eye. As the press would have it, they were hauled into No. 11 just a few days ago, but this speaks to what we want from our regulators across our economy and society. At their best, our regulators are the envy of the world. Just consider the FCA’s fintech regulatory sandbox: as a measure of its success, it was replicated in well over 50 jurisdictions around the world.
We know how to do right-sized regulation and how to set up our regulators to succeed at that most difficult of tasks—balancing innovation, economic growth, and consumers’ and citizens’ rights. That is what all regulators should be about. It is not straightforward; it is complex but entirely doable.
Amendment 38 simply proposes wording to assist the Information Commissioner’s Office. When it comes to the economic growth duty—“#innovation”—it simply refers back to Section 108 of the Deregulation Act 2015. I believe that bringing this clarity into the Bill will assist the regulator and enable all the conversations that are rightly going on right now, and all the plans that are being produced and reported on, such as those around AI, to be properly discussed and given proper context, with an Information Commissioner’s Office that is supported through clarity as to its responsibilities and obligations when it comes to economic growth. In simple terms, this would mean that these responsibilities are restricted and clearly set out according to Section 108 of the 2015 Act. It is critical that this should be the case if we are to have clarity around the commissioner’s independence as a supervisory authority on data protection—an absolutely essential condition for EU adequacy decisions.
I look forward to the Minister’s response. I hope that he likes my drafting. I hope that he will accept and incorporate my amendment into the Bill. I look forward to the debate. I beg to move.
My Lords, I rise to support Amendment 38 in the name of the noble Lord, Lord Holmes. More than ever before, the commissioner, alongside other regulators, is being pressured to support the Government’s growth and innovation agenda. In Clause 90, the Bill places unprecedented obligations on the ICO to support innovation. The question, in respect of both the existing growth duty and Clause 90, is whether they are in any sense treated as overriding the ICO’s primary responsibilities in data protection and information rights. How does the ICO aim to balance those duties, ensuring that its regulatory actions support economic growth while maintaining necessary protections?
We need to be vigilant. As it is, there are criticisms regarding the way the Information Commissioner’s Office carries out its existing duties. Those criticisms can be broadly categorised into issues with enforcement, independence and the balancing of competing interests. The ICO has a poor record on enforcement; it has been reluctant to issue fines, particularly to public sector organisations, and, as I described in Committee, it has relied heavily on reprimands rather than stronger enforcement actions. It has also been accused of being too slow with its investigations.
There are concerns about these new duties, which could pose threats to the ability of the Information Commissioner’s Office to effectively carry out its primary functions. For that reason, we support the amendment from the noble Lord, Lord Holmes.
My Lords, I support the amendment in the name of the noble Baroness, Lady Kidron, to which I have added my name. I will speak briefly because I wish to associate myself with everything that she has said, as is normal on these topics.
Those of us who worked long and hard on the Online Safety Act had our fingers burnt quite badly when things were not written into the Bill. While I am pleased—and expect to be even more pleased in a few minutes—that the Government are in favour of some form of code of conduct for edtech, whether through the age-appropriate design code or not, I am nervous. As the noble Baroness, Lady Kidron said, every day with Ofcom we are seeing the risk-aversion of our regulators in this digital space. Who can blame them when it appears to be the flavour of the month to say that, if only the regulators change the way they behave, growth will magically come? We have to be really mindful that, if we ask the ICO to do this vaguely, we will not get what we need.
The noble Baroness, Lady Kidron, as ever, makes a very clear case for why it is needed. I would ask the Minister to be absolutely explicit about the Government’s intention, so that we are giving very clear directions from this House to the regulator.
My Lords, it is a pleasure to follow the noble Baroness, Lady Harding. I have added a few further words to my speech in response, because she made an extremely good point. I pay tribute to the noble Baroness, Lady Kidron, and her tenacity in trying to make sure that we secure a code for children’s data and education, which is so needed. The education sector presents unique challenges for protecting children’s data.
Like the noble Baronesses, Lady Kidron and Lady Harding, I look forward to what the Minister has to say. I hope that whatever is agreed is explicit; I entirely agree with the noble Baroness, Lady Harding. I had my own conversation with the Minister about Ofcom’s approach to categorisation which, quite frankly, does not follow what we thought the Online Safety Act was going to imply. It is really important that we absolutely tie down what the Minister has to say.
The education sector is a complex environment. The existing regulatory environment does not adequately address the unique challenges posed by edtech, as we call it, and the increasing use of children’s data in education. I very much echo what the noble Baroness, Lady Kidron, said: children attend school for education, not to be exploited for data mining. Like her, I cross over into considering the issues related to the AI and IP consultation.
The worst-case scenario is an opt-in system that might incentivise learners or parents to consent, whether that is to state educational institutions, to companies such as Pearson, to exam boards or to any other entity. I hope that, in the other part of the forest, so to speak, that will not take place to the detriment of children. In the meantime, I very much look forward to what the Minister has to say on Amendment 44.
My Lords, I thank the noble Baroness, Lady Kidron, for moving her amendment. Before I begin, let me declare my interest as a recently appointed director of Lumi, an edtech provider—but for graduates, not for schools.
AI has the potential to revolutionise educational tools, helping teachers spend less time on marking and more time on face-to-face teaching with children, creating more innovative teaching tools and exercises and facilitating more detailed feedback for students. AI presents a real opportunity to improve education outcomes for children, opening more opportunities throughout their lives. There are deeply compelling promises in edtech.
However—there is always a however when we talk about edtech—creating and using AI education tools will require the collection and processing of children’s personal data. This potentially includes special category data—for instance, medical information pertaining to special educational needs such as dyslexia. Therefore, care must be taken in regulating how this data is collected, stored, processed and used. Without this, AI poses a major safeguarding risk. We share the concerns of the noble Baroness, Lady Kidron, and wholeheartedly support the spirit of her amendment.
We agree that it is prudent to require the ICO to make a code of practice on children’s data and education, and I particularly welcome a requirement on the ICO to consult with and involve parents. Parents know their children best, needless to say, and have their best interests at heart; their input will be critical in building trust in AI-assisted educational tools and facilitating their rollout and benefits for children throughout the UK.
However, as I said earlier on Report—and I shall not repeat the arguments now—we have concerns about the incorporation of international law into our law and, specifically in this instance, the UN Convention on the Rights of the Child. We cannot therefore support the amendment as drafted. That said, we hope very much that the Government will listen carefully to the arguments raised here and take steps to introduce appropriate safeguards for children and young people in our data legislation regime. I suspect that most parents will greatly welcome more reassurance about the use of their children’s data.
My Lords, I can be pretty brief. We have had some fantastic speeches, started by the noble Baroness, Lady Kidron, with her superb rallying cry for these amendments, which we 100% support on these Benches. As she said, there is cross-party support. We have heard support from all over the House and, as the noble and learned Baroness, Lady Butler-Sloss, has just said, there has not been a dissenting voice.
I have a long association with the creative industries and with AI policy and yield to no one in my enthusiasm for AI—but, as the noble Baroness said, it should not come at the expense of the creative industries. It should not just be for the benefit of DeepSeek or Silicon Valley. We are very clear where we stand on this.
I pay tribute to the Creative Rights in AI Coalition and its campaign, which has been so powerful in garnering support, and to all those in the creative industries and creators themselves who briefed noble Lords for this debate.
These amendments respond to deep concerns that AI companies are using copyright material without permission or compensation. Given the new government consultation, I do not believe that their preferred option—a text and data mining exemption with an opt-out, a question we thought was settled under the previous Government—is a straw man. It starts from the false premise of legal uncertainty, as we have heard from a number of noble Lords. As the News Media Association has said, the Government’s consultation is based on a mistaken idea, promoted by tech lobbyists and echoed in the consultation, that there is a lack of clarity in existing copyright law. This is completely untrue. The use of copyrighted content without a licence by gen AI firms is theft on a mass scale and there is no objective case for a new text and data mining exception.
No effective opt-out system for the use of content by gen AI models has been proposed or implemented anywhere in the world, making the Government’s proposals entirely speculative. It is vital going forward that we ensure that AI companies cannot use copyrighted material without permission or compensation; that AI development does not exploit loopholes to bypass copyright laws; that AI developers disclose the sources of the data they use for training their models, allowing for accountability and addressing infringement; and that we reinforce the existing copyright framework, rather than creating new exceptions that disadvantage creators.
These amendments would provide a mechanism for copyright holders to contest the use of their work and ensure a route for payment. They seek to ensure that AI innovation does not come at the expense of the rights and livelihoods of creators. There is no market failure. We have a well-established licensing system as an alternative to the Government’s proposed opt-out scheme for AI developers using copyrighted works. A licensing system is the only sustainable solution that benefits both the creative industries and the AI sector. We have some of the most effective collective rights organisations in the world. Licensing is their bread and butter. Merely because AI platforms are resisting claims does not mean that the law in the UK is uncertain.
Amending UK law to address the challenges posed by AI development, particularly in relation to copyright and transparency, is essential to protect the rights of creators, foster responsible innovation and ensure a sustainable future for the creative industries. If developers market their product in the UK, this should apply regardless of the country in which the scraping of copyright material or the training of models takes place. It would also ensure that AI start-ups based in the UK are not put at a competitive disadvantage due to the ability of international firms to conduct training in a different jurisdiction.
As we have heard throughout this debate, it is clear that the options proposed by the Government have no proper economic assessment underpinning them, no technology for an opt-out underpinning them and no enforcement mechanism proposed. It baffles me why the Conservative Opposition is not supporting these amendments, and I very much hope that the voices we have heard on the Conservative Benches will make sure that these amendments pass with acclamation.
I thank the noble Baroness, Lady Kidron, for moving this incredibly important group and all those speakers who have made the arguments so clearly and powerfully. I pay tribute to the noble Baroness’s work on copyright and AI, which is so important for our arts and culture sector. As noble Lords have rightly said, our cultural industries make an enormous contribution to our country, not just in cultural terms but in economic ones, and we must ensure that our laws do not put that future at risk.
In the build-up to this debate I engaged with great pleasure with the noble Baroness, Lady Kidron, and on these Benches we are sympathetic to her arguments. Her Amendment 61 would require the Government to make regulations in this area. We accept the Government’s assurance that this is something they will seek to address, and I note the Minister’s confirmation that their consultation will form the basis of the Government’s approach to this issue. Given the importance of getting this right, our view is that the Government’s consultation is in mid-flight, and we have to allow it to do its work. Whatever view we take of the design and the timing of the consultation, it offers for now a way forward that will evidence some of the serious concerns expressed here. That said, we will take a great interest in the progress and outcomes of the consultation and will come back to this in future should the Government’s approach prove unsatisfactory.
Amendment 75 in my name also seeks to address the challenge that the growth in AI poses to our cultural industries. One of the key challenges in copyright and AI is enforceability. Copyright can be enforced only when we know it has been infringed. The size and the international distribution of AI training models render it extremely challenging to answer two fundamental questions today: first, was a given piece of content used in a training model; and secondly, if so, in what jurisdiction did that use take place? If we cannot answer these questions, enforcement can become extremely hard, so a necessary, if not sufficient, part of the solution will be a digital watermark—a means of putting some red dye in the water where copyrighted material is used to train AIs. It could also potentially provide an automated means for content creators to opt out, with a vastly more manageable administrative burden.
I thank the Minister for his constructive engagement on digital watermarking and look to him to give the House an assurance that the Government will bring forward a plan to develop a technological standard for a machine-readable digital watermark. I hope that, if and when he does so, he is able to indicate both a timeline and an intention to engage internationally. Subject to receiving such reassurances when he rises, I shall not move my amendment.
My Lords, Amendment 46 seeks a review of court jurisdiction. As I said in Committee, the current system’s complexity leads to confusion regarding where to bring data protection claims—tribunals or courts? This is exacerbated by contradictory legal precedents from different levels of the judiciary, and it creates barriers for individuals seeking to enforce their rights.
Transferring jurisdiction to tribunals would simplify the process and reduce costs for individuals, and it would align with the approach for statutory appeals against public bodies, which are typically handled by tribunals. In the Killock v Information Commissioner case, Mrs Justice Farbey explicitly called for a “comprehensive strategic review” of the appeal mechanisms for data protection rights. That is effectively what we seek to do with this amendment.
In Committee, the noble Baroness, Lady Jones, raised concerns about transferring jurisdiction and introducing a new appeals regime. She argued that the tribunals lacked the capacity to handle complex data protection cases, but tribunals are, in fact, better suited to handle such matters due to their expertise and lower costs for individuals. Additionally, the volume of applications under Section 166—“Orders to progress complaints”—suggests significant demand for tribunal resolution, despite its current limitations.
The noble Baroness, Lady Jones, also expressed concern about the potential for a new appeal right to encourage “vexatious challenges”, but introducing a tribunal appeal system similar to the Freedom of Information Act could actually help filter out unfounded claims. This is because the tribunal would have the authority to scrutinise cases and potentially dismiss those deemed frivolous.
The noble Baroness, Lady Jones, emphasised the existing judicial review process as a sufficient safeguard against errors by the Information Commissioner. However, judicial review is costly and complex, presenting a significant barrier for individuals. A tribunal system would offer a much more accessible and less expensive avenue for redress.
I very much hope that, in view of the fact that this is a rather different amendment—it calls for a review—the Government will look at this. It is certainly called for by the judiciary, and I very much hope that the Government will take this on board at this stage.
I thank the noble Lord, Lord Clement-Jones, for moving his amendment, which would require the Secretary of State to review the potential impact of transferring the courts’ jurisdiction over all data protection provisions to tribunals. As I argued in Committee, courts have a long-standing authority and expertise in resolving complex legal disputes, including data protection cases, and removing the jurisdiction of the courts could risk undermining the depth and breadth of legal oversight required in such critical areas.
That said, as the noble Baroness, Lady Jones of Whitchurch, said in Committee, we have a mixed system of jurisdiction for legal issues relating to data, and tribunals have an important role to play. So, although we agree with the intentions behind the amendment from the noble Lord, Lord Clement-Jones, we do not support the push to transfer all data protection provisions from the courts to tribunals, as we believe that there is still an important role for courts to play. Given the importance of the role of the courts in resolving complex cases, we do not feel that this review is necessary.
My Lords, before the noble Viscount sits down, I wonder whether he has actually read the amendment; it calls for a review, not for transfer. I think that his speech is a carryover from Committee.
I thank the noble Lord, Lord Clement-Jones, for Amendment 46. It would require a review of the impact of transferring all data protection-related cases to the relevant tribunals. Currently there is a mixture of jurisdictions for tribunals and courts for data protection cases, depending on the nature of the proceedings. This is on the basis that certain claims are deemed appropriate for tribunal, while others are appropriate for courts, where stricter rules of evidence and procedure apply—for example, in dealing with claims by data subjects against controllers for compensation due to breaches of data protection legislation. As such, the current system already provides clear and appropriate administrative and judicial redress routes for data subjects seeking to exercise their rights.
Tribunals are in many cases the appropriate venue for data protection proceedings, including appeals by controllers against enforcement action or applications by data subjects for an order that the ICO should progress a complaint. Claims by individuals against businesses or other organisations for damages arising from breach of data protection law fall under the jurisdiction of courts rather than tribunals. This is appropriate, given the likely disparity between the resources of the respective parties, because courts apply stricter rules of evidence and procedures than tribunals. While court proceedings can, of course, be more costly, successful parties can usually recover their costs, which would not always be the case in tribunals.
I hope that the noble Lord agrees that there is a rationale for these different routes and that a review to consider transfer of jurisdictions to tribunals is therefore not necessary at this time.
My Lords, I thank the Minister for that dusty reply. I wonder whether he has been briefed about particular legal cases, such as Killock or Delo, where the judiciary themselves were confused about the nature of the different jurisdictions of tribunal and court. The Minister and, indeed, the noble Viscount, Lord Camrose, seemed to make speeches on the basis that all is wonderful and the jurisdiction of the courts and tribunals is so clearly defined that we do not need a review. That is not the case and, if the Minister were better briefed about the obiter, if not the judgments, in Delo and Killock, he might appreciate that there is considerable confusion about jurisdiction, as several judges have commented.
I am very disappointed by the Minister’s reply. I think that there will be several judges jumping up and down, considering that he has not really looked at the evidence. The Minister always says that he is very evidence-based. I very much hope that he will take another look at this—or, if he does not, that the MoJ will—as there is considerably greater merit in the amendment than he accords. However, I shall not press this to a vote and I beg leave to withdraw the amendment.
My Lords, I too support this. I well remember the passage of the Computer Misuse Act, and we were deeply unhappy about some of its provisions defining hacker tools et cetera, because they said nothing about intention. The Government simply said, “Yes, they will be committing an offence, but we will just ignore it if they are good people”. Leaving it to faceless people in some Civil Service department to decide who is good or bad, with nothing in the Bill, is not very wise. We were always deeply unhappy about it but had to go along with it because we had to have something; otherwise, we could not do anything about hacking tools being freely available. We ended up with a rather odd situation where there is no defence for being a good guy. This is a very sensible amendment to clean up an anomaly that has been sitting in our law for a long time and should probably have been cleaned up a long time ago.
My Lords, I support Amendments 47 and 48, which I was delighted to see tabled by the noble Lords, Lord Holmes and Lord Arbuthnot. I have long argued for changes to the Computer Misuse Act. I pay tribute to the CyberUp campaign, which has been extremely persistent in advocating these changes.
The CMA was drafted some 35 years ago—an age ago in computer technology—when internet usage was much lower and cybersecurity practices much less developed. This makes the Act in its current form unfit for the modern digital landscape and inhibits security professionals from conducting legitimate research. I will not repeat the arguments made by the two noble Lords. I know that the Minister, because of his digital regulation review, is absolutely apprised of this issue, and if he were able to make a decision this evening, I think he would take them on board. I very much hope that he will express sympathy for the amendments, however he wishes to do so—whether by giving an undertaking to bring something back at Third Reading or by doing something in the Commons. Clearly, he knows what the problem is. This issue has been under consideration for a long time, in the bowels of the Home Office—what worse place is there to be?—so I very much hope that the Minister will extract the issue and deal with it as expeditiously as he can.
I thank my noble friend Lord Holmes for tabling the amendments in this group. I, too, believe these amendments would improve the Bill. The nature of computing and data processing has fundamentally changed since the Computer Misuse Act 1990. Third parties hold and process immense quantities of data, and the means of accessing and interacting with that data have become unrecognisably more sophisticated. Updating the definition of unauthorised computer access through Amendment 48 is a sensible reform, as this new definition takes into account that data controllers and processors now hold substantial quantities of personal data. These entities are responsible for the security of the data they hold, so their provisions on access become legally relevant, and this amendment reflects that.
When updating an offence, it is equally necessary to consider the legal defences, as my noble friend has rightly done in Amendment 47 by protecting individuals accessing information to detect or prevent a crime or whose actions are in the public interest. We on these Benches feel these amendments are wholly sensible. I urge the Minister to listen to the persuasive argument that my noble friend Lord Holmes has made and consider how we can deliver these improvements to our data legislation.
My Lords, I will speak to Amendment 48B. In our view, cookie paywalls create an unfair choice for users, essentially forcing them to pay for privacy. We tabled an amendment in Committee to ban cookie paywalls, but in the meantime, as the noble Baroness, Lady Jones, heralded at the time, the Information Commissioner’s Office has provided updated guidance on the “consent or pay” model for cookie compliance. It is now available for review. This guidance clarifies how organisations can offer users a choice between accepting personalised ads for free access or paying for an ad-free experience while ensuring compliance with data protection laws. It has confirmed that the “consent or pay” model is acceptable for UK publishers, provided certain conditions are met. Key requirements for valid consent under this model include: users must have genuine free choice; the alternative to consent—that is, payment—must be reasonably priced; and users must be fully informed about their options.
The guidance is, however, contradictory. On the one hand, it says that cookie paywalls
“can be compliant with data protection law”
and that providers must document their assessments of how it is compliant with data protection law. On the other, it says that, to be compliant with data protection law, cookie paywalls must allow users to choose freely without detriment. However, users who do not wish to pay the fee to access a website will be subject to detriment, because with a cookie paywall they must pay a fee if they wish to refuse consent. This is what is described as the “power imbalance”. It is also worth noting that this guidance does not constitute legal advice; it leaves significant latitude for legal interpretation and argument as to the compatibility of cookie paywalls with data protection law.
The core argument against “consent or pay” models is that they undermine the principle of freely given consent. The ICO guidance emphasises that organisations using these models must be able to demonstrate that users have a genuine choice and are not unfairly penalised for refusing to consent to data processing for personalised advertising. Yet in practice, given the power imbalance, on almost every occasion this is not possible. This amendment seeks to ensure that individuals maintain control over their personal data. By banning cookie paywalls, users can freely choose not to consent to cookies without having to pay a fee. I very much hope that the Government will reconsider the ICO’s guidance in particular, and consider banning cookie paywalls altogether.
My Lords, I thank my noble friend Lord Lucas for introducing this group. Amendments 48A and 50A, in his name, would ensure that regulated professionals, including financial services firms, are able to comply with current and future regulatory requirements. The example my noble friend has given—the FCA’s expectation that firms communicate effectively with consumers—is a good one. Clearly, we must avoid a circumstance where regulators expect businesses to take action that is not possible due to limiting legislation governing data use and access. My noble friend has made a forceful case and I hope the Government will be able to give the House appropriate assurance that businesses will not be put in this position as a result of this legislation.
Amendment 48B, in the name of the noble Lord, Lord Clement-Jones, seeks to ban cookie paywalls. I opposed a similar amendment when we debated it in Committee, as it actually seeks to curtail choice. Currently, users have three options: pay money and stay private, share personal data and read for free, or walk away. Faced with these options, for instance, I have sadly chosen to forgo my regular evening reading of the Daily Mail’s excellent sports pages, but I see no reason why that newspaper, or anyone else, should be compelled to provide anything for free. In fact, it has been very persuasively argued by Jaron Lanier, Shoshana Zuboff and many others that a great deal of damage has been caused by the fact that so much of the internet is apparently, but not actually, free, rather than by an open charging model. This approach finally reveals the exact cash value of individuals’ data that websites are harvesting and offers users choice. We do not agree with attempts to remove that choice.
My Lords, these amendments have to do with research access for online safety. Having sat on the Joint Committee on the draft Online Safety Bill back in 2021, I put on record that I am delighted that the Government have taken the issue of research access to data very seriously. It was a central plank of what we suggested and it is fantastic that they have done it.
Of the amendments in my name, Amendment 51 would simply ensure that the provisions of Clause 123 are acted on by removing the Government’s discretion as to whether they introduce regulations. It also introduces a deadline of 12 months for the Government to do so. Amendment 53 seeks to ensure that the regulators will enable independent researchers to research how online risks and harms impact different groups, especially vulnerable users, including children. Given the excitements we have already had this evening, I do not propose to press any of them, but I would like to hear from the Minister that he has heard me and that the Government will seek to enshrine the principle of different ages, different stages, different people, when he responds.
I note that the noble Lord, Lord Bethell, who has the other amendments in this group, to which I added my name, is not in his place, but I understand that he has sought—and got—reassurance on his amendments. So there is just one remaining matter on which I would like further reassurance: the scope of the legal privilege exception. A letter from the Minister on 10 January explains:
“The clause restates the existing law on legally privileged information as a reassurance that regulated services will not be asked to break the existing legislation on the disclosure of this type of data”.
It seems that the Minister has veered tantalisingly close to answering my question, but not in a manner that I can quite understand. So I would really love to understand—and I would be grateful to the Minister if he would try to explain to me—how the Government will prevent tech companies using legal privilege as a shield. Specifically, would CCing a lawyer on every email exchange, or having a lawyer in every team, allow companies to prevent legitimate scrutiny of their safety record? I have sat in Silicon Valley headquarters and each team came with its own lawyer—I would really appreciate clarity on this issue. I beg to move.
My Lords, I can only support what the noble Baroness, Lady Kidron, had to say. This is essentially unfinished business from the Online Safety Act, which we laboured in the vineyard to deliver some time ago. These amendments aim to strengthen Clause 123 and try to make sure that this actually happens and that we do not get the outcomes of the kind that the noble Baroness has mentioned.
I, too, have read the letter from the Minister to the noble Lord, Lord Bethell. It is hedged about with a number of qualifications, so I very much hope that the Minister will cut through it and give us some very clear assurances, because I must say that I veer back and forth when I read the paragraphs. I say, “There’s a win”, and then the next paragraph kind of qualifies it, so perhaps the Minister will give us true clarity when he responds.
My Lords, I wanted to add something, having spent a lot of time on Part 3 of the Digital Economy Act, which after many assurances and a couple of years, the Executive decided not to implement, against the wishes of Parliament. It worries me when the Executive suddenly feel that they can do those sorts of things. I am afraid that leopards sometimes do not change their spots, and I would hate to see this happen again, so Amendment 51 immediately appeals. Parliament needs to assert its authority.
My Lords, I share in the congratulations offered to my noble friend Lady Owen. It has taken me about 10 years to begin to understand how this House works and it has taken her about 10 minutes.
I want to pursue something which bewilders me about this set of amendments, which is the amendment tabled by the noble Baroness, Lady Gohir. Audio fakes have been with us for many years, yet video deepfakes are relatively new. I do not understand, then, why we are talking about a different Bill in relation to audio deepfakes.
My Lords, this has been a very interesting debate. I too congratulate the noble Baroness, Lady Owen, on having brought forward these very important amendments. It has been a privilege to be part of her support team and she has proved an extremely persuasive cross-party advocate, including in being able to bring out the team: the noble Baroness, Lady Kidron, the noble Lord, Lord Pannick, who has cross-examined the Minister, and the noble Lord, Lord Stevenson. There is very little to follow up on what noble Lords have said, because the Minister now knows exactly what he needs to reply to.
I was exercised by this rather vague issue of whether the elements that were required were going to come back at Third Reading or in the Commons. I did not think that the Minister was specific enough in his initial response. In his cross-examination, the noble Lord, Lord Pannick, really went through the key elements that were required, such as the no intent element, the question of reasonable excuse and how robust that was, the question of solicitation, which I know is very important in this context, and the question of whether it is really an international law matter. I have had the benefit of talking to the noble Lord, Lord Pannick, and surely the mischief is delivered and carried out here, so why is that an international law issue? There is also the question of deletion of data, which the noble Lord has explained pretty carefully, and the question of timing of knowledge of the offence having been committed.
The Minister needs to describe the stages at which those various elements are going to be contained in a government amendment. I understand that there may be a phasing, but there are a lot of assurances. As the noble Lord, Lord Stevenson, said, is it six or seven? How many assurances are we talking about? I very much hope that the Minister can see the sentiment and the importance we place on his assurances on these amendments, so I very much hope he is going to be able to give us the answers.
In conclusion, as the noble Baroness, Lady Morgan, said—and it is no bad thing to be able to wheel on a former Secretary of State at 9 o’clock in the evening—there is a clear link between gender-based violence and image-based abuse. This is something which motivates us hugely in favour of these amendments. I very much hope the Minister can give more assurance on the audio side of things as well, because we want future legislation to safeguard victims, improve prosecutions and deter potential perpetrators from committing image-based and audio-based abuse crimes.
I thank the Minister and my noble friend Lady Owen for bringing these amendments to your Lordships’ House. Before I speak to the substance of the amendments, I join others in paying tribute to the tenacity, commitment and skill that my noble friend Lady Owen has shown throughout her campaign to ban these awful practices. She not only has argued her case powerfully and persuasively but, as others have remarked, seems to have figured out the machinery of this House in an uncanny way. Whatever else happens, she has the full support of these Benches.
I am pleased that the Government have engaged constructively with my noble friend and are seeking to bring this back at Third Reading. The Minister has been asked some questions and we all look forward with interest to his responses. I know from the speeches that we have heard that I am not alone in this House in believing that we have an opportunity here and now to create these offences, and we should not delay. For the sake of the many people who have been, and will otherwise be, victims of the creation of sexually explicit deepfakes, I urge the Government to continue to work with my noble friend Lady Owen to get this over the line as soon as possible.
I support the amendment, to which I have attached my name, along with the noble Lord, Lord Bassam, and the noble Earl, Lord Clancarty. I declare my interest as a member of DACS, the Design and Artists Copyright Society, and I, too, thank the Minister for meeting us prior to this debate.
Today’s digital landscape presents unique and pressing challenges for visual artists that we can no longer ignore. A 2022 YouGov survey commissioned by DACS uncovered a revealing paradox in our digital culture. While 75% of people regularly access cultural content at least three times a week, with 63% downloading it for free, an overwhelming 72% of the same respondents actively support compensating artists for digital sharing of their work. These figures paint a stark picture of the disconnect between the public’s consumption habits and their ethical convictions about fair compensation.
The Netherlands offers a compelling blueprint for change through DACS’ partner organisation Pictoright. Its innovative private copying scheme has successfully adapted to modern consumption habits while protecting artists’ interests. Consider a common scenario in museums: visitors now routinely photograph artworks instead of purchasing traditional postcards. Under Pictoright’s system, artists receive fair compensation for these digital captures, demonstrating that we can embrace the convenience of digital access without sacrificing creators’ right to earn from their work. This proven model shows that the tension between accessibility and fair compensation is not insurmountable.
The smart fund offers a similar balanced solution for the UK. This approach would protect our cultural ecosystem while serving the interests of creators, platforms and the public alike. I hope the Government will look favourably upon this scheme.
My Lords, I thank the noble Lord, Lord Bassam, for retabling his Committee amendment, which we did not manage to discuss. Sadly, it always appears to be discussed rather late in the evening, but I think that the time has come for this concept and I am glad that the Government are willing to explore it.
I will make two points. Many countries worldwide, including in the EU, have their own version of the smart fund to reward creators and performers for the private copying and use of their works and performances. Our own CMS Select Committee found that, despite the creative industries’ economic contribution—about which many noble Lords have talked—many skilled and successful professional creators are struggling to make a living from their work. The committee recommended that
“the Government work with the UK’s creative industries to introduce a statutory private copying scheme”.
This has a respectable provenance and is very much wanted by the collecting societies ALCS, BECS, Directors UK and DACS. Their letter said that the scheme could generate £250 million to £300 million a year for creatives, at no cost to the Government or to the taxpayer. What is not to like? They say that similar schemes are already in place in 45 countries globally, including most of Europe, and many of them include an additional contribution to public cultural funding. That could be totally game-changing. I very much hope that there is a fair wind behind this proposal.
My Lords, I thank the noble Lord, Lord Bassam of Brighton, for laying this amendment and introducing the debate on it.
As I understand it, a private copying levy is a surcharge on the price of digital content. The idea is that the money raised from the surcharge is either redistributed directly to rights holders to compensate them for any loss suffered because of copies made under the private copying exceptions or contributed straight to other cultural events. I recognise what the noble Lord is seeking to achieve and very much support his intent.
I have two concerns. First—it may be that I have misunderstood it; if so, I would be grateful if the noble Lord would set me straight—it sounds very much like a new tax of some kind is being raised, albeit a very small one. Secondly, those who legitimately pay for digital content end up paying twice. Does this not incentivise more illegal copying?
We all agree how vital it is for those who create products of the mind to be fairly rewarded and incentivised for doing so. We are all concerned by the erosion of copyright or IP caused by both a global internet and increasingly sophisticated AI. Perhaps I could modestly refer the noble Lord to my Amendment 75 on digital watermarking, which I suggest may be a more proportionate means of achieving the same end or at least paving the way towards it. For now, we are unable to support Amendment 57 as drafted.
My Lords, I very much encourage the Government to go down this road. Everyone talks about the NHS just because the data is there and organised. If we establish a structure like this, there are other sources of data that we could develop to equivalent value. Education is the obvious one. What works in education? We have huge amounts of data, but we do nothing with it—both in schools and in higher education. What is happening to biodiversity? We do not presently collect the data or use it in the way we could, but if we had that, and if we took advantage of all the people who would be willing to help with that, we would end up with a hugely valuable national resource.
HMRC has a lot of information about employment and career patterns, none of which we use. We worry about what is happening and how we can improve seaside communities, but we do not collect the data which would enable us to do it. We could become a data-based society. This data needs guarding because it is not for general use—it is for our use, and this sort of structure seems a really good way of doing it. It is not just the NHS—there is a whole range of areas in which we could greatly benefit the UK.
My Lords, all our speakers have made it clear that this is a here-and-now issue. The context has been set out by noble Lords, whether it is Stargate, the AI Opportunities Action Plan or, indeed, the Palantir contract with the NHS. This has been coming down the track for some years. There are Members on the Government Benches, such as the noble Lords, Lord Mitchell and Lord Hunt of Kings Heath, who have been telling us that we need to work out a fair way of deriving a proper financial return for the benefits of public data assets, and Future Care Capital has done likewise. The noble Lord, Lord Freyberg, has form in this area as well.
The Government’s plan for the national data library and the concept of sovereign data assets raises crucial questions about how to balance the potential benefits of data sharing with the need to protect individual rights, maintain public trust and make sure that we achieve proper value for our public digital assets. I know that the Minister has a particular interest in this area, and I hope he will carry forward the work, even if this amendment does not go through.
I thank the noble Baroness, Lady Kidron, for moving her amendment. The amendments in this group seek to establish a new status for data held in the public interest, and to establish statutory oversight rules for a national data library. I was pleased during Committee to hear confirmation from the noble Baroness, Lady Jones of Whitchurch, that the Government are actively developing their policy on data held in the public interest and developing plans to use our data assets in a trustworthy and ethical way.
We of course agree that we need to get this policy right, and I understand the Government’s desire to continue their policy development. Given that this is an ongoing process, it would be helpful if the Government could give the House an indication of timescales. Can the Minister say when the Government will be in a position to update the House on any plans to introduce a new approach to data held in the public interest? Will the Government bring a statement to this House when plans for a national data library proceed to the next stage?
I suggest that a great deal of public concern about nationally held datasets is a result of uncertainty. The Minister was kind enough to arrange a briefing from his officials yesterday, and this emerged very strongly. There is a great deal of uncertainty about what is being proposed. What are the mechanics? What are the risks? What are the costs? What are the eventual benefits to UK plc? I urge the Minister, as and when he makes such a statement, to bring maximum clarity to these fundamental questions, because I suspect that many members of the public will find this deeply reassuring.
Given the stage the Government are at with these plans, we do not think it would be appropriate to legislate at this stage, but we of course reserve the right to revisit this issue in the future.
My Lords, we have had some discussion already this week on data centres. The noble Lord, Lord Holmes, is absolutely right to raise this broad issue, but I was reassured to hear from the noble Lord, Lord Hunt of Kings Heath, earlier in the week that the building of data centres, their energy requirements and their need may well be included in NESO’s strategic spatial energy plan and the centralised strategic network plan. Clearly, in one part of the forest there is a great deal of discussion about energy use and the energy needs of data centres. What is less clear—and this is, in a sense, reflected in the opportunities plan—is exactly how the Government will decide the location of these data centres, which clearly, at least on current thinking about the needs of large language models, AI and so on, will be needed. It is about where they will be and how that will be decided. If the Minister can cast any light on that, we would all be grateful.
I thank my noble friend Lord Holmes of Richmond for moving this amendment. Amendment 59 is an important amendment that addresses some of the key issues relating to large language models. We know that large language models have huge potential, and I agree with him that the Government should keep this under review. Perhaps the noble Baroness, Lady Jones of Whitchurch, would be willing to update the House on the Government’s policy on large language model regulation on her return.
Data centre availability is another emerging issue as we see growth in this sector. My noble friend is absolutely right to bring this to the attention of the House. We firmly agree that we will have a growing need for additional data centres. In Committee, the noble Baroness, Lady Jones, did not respond substantively to Amendments 60 and 66 from my noble friend on data centres, which I believe was—not wholly unreasonably—to speed the Committee to its conclusion just before Christmas. I hope the Minister can give the House a fuller response on this today, as it would be very helpful to hear what the Government’s plans are on the need for additional data centres.
My Lords, I spoke on this before, and I will repeat what I said previously. The only way out of this one is to have two fields for a person: one that we will call “sex” and another that we will call “gender”. I will use the terminology of the noble Lord, Lord Lucas, for this. “Sex” is what you are biologically and were born as, and that you cannot change. There are instances where we need to use that field, particularly when it comes to delivering medicine to people—knowing how to treat them medically—and, possibly, in other things such as sports. There are one or two areas where we need to know what they are biologically.
Then we have another field which is called “gender”. In society, in many cases, we wish that people did not have to go around saying that they are not what they were born but what they want to be—but I do not have a problem with that. We could use that field where society decides that people can use it, such as on passports, other documents and identity cards—all sorts of things like that. It does not matter; I am not worried about what someone wants to call themselves or how they want to present themselves to society.
Researchers will have the “sex” field, and they can carry out medical research—they can find out about all the different things related to that—and, societally, we can use the other field for how people wish to project themselves in public. That way we can play around with what you are allowed to use in what scenarios; it allows you to do both. What we need is two fields; it will solve a lot of problems.
My Lords, it is clear that Amendment 67 in the name of the noble Lord, Lord Lucas, is very much of a piece with the amendments that were debated and passed last week. On these Benches, our approach will be exactly the same. Indeed, we can rely on what the Minister said last week, when he gave a considerable assurance:
“I can be absolutely clear that we must have a single version of the truth on this. There needs to be a way to verify it consistently and there need to be rules. That is why the ongoing work is so important”.—[Official Report, 21/1/25; col. 1620.]
That is, the work of the Central Digital and Data Office. We are content to rely on his assurance.
I thank my noble friend Lord Lucas for bringing his Amendment 67, which builds on his previous work to ensure accuracy of data. On these Benches, we agree wholeheartedly with him that the information we have access to—for example, to verify documents—must be accurate. His amendment would allow the Secretary of State to make regulations establishing definitions under the Bill for the purposes of digital verification services, registers of births and deaths, and other provisions. Crucially, this would enable the Government to put measures in place to ensure the consistency of the definitions of key personal attributes, including sex. We agree that consistency and accuracy of data is vital. We supported him on the first day at Report, and, if he pushes his amendment to a Division, we will support him today.
My Lords, as so often, I listened with awe to the noble Baroness. Apart from saying that I agree with her wholeheartedly, which I do, there is really no need for me to add anything, so I will not.
My Lords, I too am lost in admiration for the noble Baroness, Lady Kidron—still firing on all cylinders at this time of night. Current law is clearly out of touch with the reality of computer systems. It assumes an untruth about computer reliability that has led to significant injustice. We know that that assumption has contributed to miscarriages of justice, such as the Horizon scandal.
Unlike the amendment in Committee, Amendment 68 does not address the reliability of computers themselves but focuses rather on the computer evidence presented in court. That is a crucial distinction as it seeks to establish a framework for evaluating the validity of the evidence presented, rather than questioning the inherent reliability of computers. We believe that the amendment would be a crucial step towards ensuring fairness and accuracy in legal proceedings by enabling courts to evaluate computer evidence effectively. It offers a balanced approach that would protect the interests of both the prosecution and the defence, ensuring that justice is served. The Government really must move on this.
I thank the noble Baroness, Lady Kidron, for her amendments. The reliability of computer-based evidence, needless to say, has come into powerful public focus following the Post Office Horizon scandal and the postmasters’ subsequent fight for justice. As the noble Baroness has said previously and indeed tonight, this goes far beyond the Horizon scandal. We accept that there is an issue with the way in which the presumption that computer evidence is reliable is applied in legal proceedings.
The Government accepted in Committee that this is an issue. While we have concerns about the way that the noble Baroness’s amendment is drafted, we hope the Minister will take the opportunity today to set out clearly the work that the Government are doing in this area. In particular, we welcome the Government’s recently opened call for evidence, and we hope Ministers will work quickly to address this issue.
My Lords, I have the very dubious privilege of moving the final amendment on Report to this Bill. This is a probing amendment and the question is: what does retrospectivity mean? The noble Lord, Lord Cameron of Lochiel, asked a question of the noble Baroness, Lady Jones, in Committee in December:
“Will the forthcoming changes to data protection law apply to such data that controllers and processors already hold?”
She replied that
“the new lawful ground of recognised legitimate interest will apply from the date of commencement and will not apply retrospectively”.—[Official Report, 10/12/24; cols. GC 435-437.]
But the question is not really whether the lawfulness is retrospective, but whether the changes made in the new law can be applied to any personal data previously collected and already held on the commencement date of the Act—so that is the exam question.
It is indeed getting late. I thank the noble Lord, Lord Clement-Jones, for moving his amendment, and I really will be brief.
We do not oppose the government amendment in the name of the noble Lord, Lord Vallance. I think the Minister should be able to address the concerns raised by the noble Lord, Lord Clement-Jones, given that the noble Lord’s amendment merely seeks clarification on the retrospective application of the provisions of the Bill within a month of the coming into force of the Act. It seems that the Government could make this change unnecessary by clarifying the position today. I hope the Minister will be able to address this in his remarks.
I will speak first to Amendment 76. I reassure noble Lords that the Government do not believe that this amendment has a material policy effect. Instead, it simply corrects the drafting of the Bill and ensures that an interpretation provision in Clause 66 commences on Royal Assent.
Amendment 74, in the name of the noble Lord, Lord Clement-Jones, would require the Secretary of State to publish a statement setting out whether any provisions in the Bill apply to controllers and processors retrospectively. Generally, provisions in Bills apply from the date of commencement unless there are strong policy or legal reasons for applying them retrospectively. The provisions in this Bill follow that general rule. For instance, data controllers will only be able to rely on the new lawful ground of recognised legitimate interests introduced by Clause 70 in respect of new processing activities in relation to personal data that take place after the date of commencement.
I recognise that noble Lords might have questions as to whether any of the Bill’s clauses can apply to personal data that is already held. That is the natural intent in some areas and, where appropriate, commencement regulations will provide further clarity. The Government intend to publish their plans for commencement on GOV.UK in due course and the ICO will also be updating its regulatory guidance in several key areas to help organisations prepare. We recognise that there can be complex lifecycles around the use of personal data and we will aim to ensure that how and when any new provisions can be relied on is made clear as part of the implementation process.
I hope that explanation goes some way to reassuring the noble Lord and that he will agree to withdraw his amendment.
My Lords, I thank the Minister. There is clearly no easy answer. I think we were part-expecting a rather binary answer, but clearly there is not one, so we look forward to the guidance.
But that is a bit worrying for those who have to tackle these issues. I am thinking of the data protection officers who are going to grapple with the Bill in its new form and I suspect that that is going to be quite a task. In the meantime, I withdraw the amendment.
(1 week, 3 days ago)
Lords Chamber
My Lords, last week the Government published the AI Opportunities Action Plan and confirmed that they have accepted or partially accepted all 50 of the recommendations from the report’s author, Matt Clifford. Reading the report, there can be no doubting the Government’s commitment to making the UK a welcoming environment for AI companies. What is less clear is how creating the infrastructure and skills pool needed for AI companies to thrive will lead to economic and social benefits for UK citizens.
I am aware that the Government have already said that they will provide further details to flesh out the top-level commitments, including policy and legislative changes, over the coming months. I reiterate the question asked by many noble Lords in Committee: if data is the ultimate fuel and infrastructure on which AI is built, why, given that we have a new Government, is the data Bill going through the House without all the strategic pieces in place? This is a Bill flying blind.
Amendment 1 is very modest and would ensure that information that traders were required to provide to customers on goods, services and digital content included information that had been created using AI to build a profile about them. This is necessary because the data that companies hold about us is already a combination of information proffered by us and information inferred, increasingly, by AI. This amendment would simply ensure that all customer data—our likes and dislikes, buying habits, product uses and so on—was disclosable, whether provided by us or a guesstimate by AI.
The Government’s recent statements have promised to “mainline AI into the veins” of the nation. If AI were a drug, its design and deployment would be subject to governance and oversight to ensure its safety and efficacy. Equally, they have said that they will “unleash” AI into our public services, communities and business. If the rhetoric also included commitments to understand and manage the well-established risks of AI, the public might feel more inclined to trust both AI and the Government.
The issue of how the data Bill fails to address AI—and how the AI Opportunities Action Plan, and the government response to it, fail to protect UK citizens, children, the creative industries and so on—will be a theme throughout Report. For now, I hope that the Government can find their way to agreeing that AI-generated content that forms part of a customer’s profile should be considered personal data for the purposes of defining business and customer data. I beg to move.
My Lords, this is clearly box-office material, as ever.
I support Amendment 1 tabled by the noble Baroness, Lady Kidron, on inferred data. Like her, I regret that we do not have this Bill flying in tandem with an AI Bill. As she said, data and AI go together, and we need to see the two together in context. However, inferred data has its own dangers: inaccuracy and what are called junk inferences; discrimination and unfair treatment; invasions of privacy; a lack of transparency; security risks; predatory targeting; and a loss of anonymity. These dangers highlight the need for strong data privacy protection for consumers in smart data schemes and more transparent data collection practices.
Noble Lords will remember that Cambridge Analytica dealt extensively with inferred data. That company used various data sources to create detailed psychological profiles of individuals going far beyond the information that users explicitly provided. I will not go into the complete history, but, frankly, we do not want to repeat that. Without safeguards, the development of AI technologies could lead to a lack of public trust, as the noble Baroness said, and indeed to a backlash against the use of AI, which could hinder the Government’s ambitions to make the UK an AI superpower. I do not like that kind of boosterish language—some of the Government’s statements perhaps could have been written by Boris Johnson—but nevertheless the ambition to put the UK on the AI map, and to keep it there, is a worthy one. This kind of safeguard is therefore extremely important in that context.
I start by thanking the noble Baroness, Lady Kidron, for introducing this group. I will speak particularly to the amendment in my name but before I do so, I want to say how much I agree with the noble Baroness and with the noble Lord, Lord Clement-Jones, that it is a matter of regret that we are not simultaneously looking at an AI Bill. I worry that this Bill has to take a lot of the weight that an AI Bill would otherwise take, but we will come to that in a great deal more detail in later groups.
I will address the two amendments in this group in reverse order. Amendment 5 in my name and that of my noble friend Lord Markham would remove Clause 13, which makes provision for the Secretary of State or the Treasury to give financial assistance to decision-makers and enforcers—that is, in essence, to act as a financial backstop. While I appreciate the necessity of guaranteeing the stability of enforcers who are public authorities and therefore branches of state, I am concerned that this has been extended to decision-makers. The Bill does not make the identity of a decision-maker clear. Therefore, I wonder who exactly we are protecting here. Unless those individuals or bodies or organisations can be clearly defined, how can we know whether we should extend financial assistance to them?
I raised these concerns in Committee, and the Minister assured us at that time that smart data schemes should be self-financing through fees and levies, as set out in Clauses 11 and 12, and that this provision is therefore a back-up plan. If that is indeed the case and we are assured of the self-funding nature of smart data schemes, then what exactly makes this necessary? Why must the statutory spending authority act as a backstop if we do not believe there is a risk it will be needed? If we do think there is such a risk, can the Minister elaborate on what it is?
I turn now to the amendment tabled by the noble Baroness, Lady Kidron, which would require data traders to supply customers with information that has been used by AI to build a profile on them. While transparency and explainability are hugely important, I worry that the mechanism proposed here will be too burdensome. The burden would grow linearly with the scale of the models used. Collating and supplying this information would, I fear, increase the cost of doing business for traders. Given AI’s potential to be an immense asset to business, helping generate billions of pounds for the UK economy—and, by the way, I rather approve of the boosterish tone and think we should strive for a great deal more growth in the economy—we should not seek to make its use more administratively burdensome for business. Furthermore, since the information is AI-generated, it is going to be a guess or an assumption or an inference. Therefore, should we require companies to disclose not just the input data but the intermediate and final outputs? Speaking as a consumer, I am not sure that I personally would welcome this. I look forward to hearing the Minister’s responses.
My Lords, the noble Baroness, Lady Kidron, is setting a cracking pace this afternoon, and I am delighted to support her amendments and speak to them. Citizens should have the clear right to assign their data to data communities or trusts, which act as intermediaries between those who hold data and those who wish to use it, and are designed to ensure that data is shared in a fair, safe and equitable manner.
A great range of bodies have explored and support data communities and data trusts. There is considerable pedigree behind the proposals that the noble Baroness has put forward today, starting with a recommendation of the Hall-Pesenti review. We then had the Royal Society and the British Academy talking about data stewardship; the Ada Lovelace Institute has explored legal mechanisms for data stewardship, including data trusts; the Open Data Institute has been actively researching and piloting data trusts in the real world; the Alan Turing Institute has co-hosted a workshop exploring data trusts; and the Royal Society of Arts has conducted citizens’ juries on AI explainability and explored the use of data trusts for community engagement and outreach.
There are many reasons why data communities are so important. They can help empower individuals, give them more control over their data and ensure that it is used responsibly; they can increase bargaining power, reduce transaction costs, address data law complexity and protect individual rights; and they can promote innovation, both by facilitating data-sharing and in the development of new products and services. We need to ensure responsible operation and build trust in data communities. To that end, we should establish a register of data communities overseen by the ICO, as proposed by Amendment 43, along with a code of conduct and complaint mechanisms, as proposed by Amendment 42.
It is high time we moved forward on this; we need positive steps. In the words of the noble Baroness, Lady Kidron, we do not just seek assurance that there is nothing to prevent these data communities; we need to take positive steps and install mechanisms to make sure that we can set them up and benefit from them.
I thank the noble Baroness, Lady Kidron, for leading on this group, and the noble Lord, Lord Clement-Jones, for his valuable comments on these important structures of data communities. Amendments 2, 3, 4 and 25 work in tandem and are designed to enable data communities, meaning associations of individuals who have come together and wish to designate a third party, to act on the group’s behalf in their data use.
There is no doubt that the concept of a data community is a powerful idea that can drive innovation and a great deal of value. I thank the noble Lord, Lord Clement-Jones, for cataloguing the many groups that have driven powerful thinking in this area, the value of which is very clear. However—and I keep coming back to this when we discuss this idea—what prevents this being done already? I realise that this may be a comparatively trivial example, but if I wanted to organise a community today to oppose a local development, could I not do so with an existing lawful basis for data processing? It is still not clear in what way these amendments would improve my ability to do so, or would reduce my administrative burden or the risks of data misuse.
I look forward to hearing more about this from the Minister today and, ideally, as the noble Baroness, Lady Kidron, said, in a briefing on the Government’s plan to drive this forward. However, I remain concerned that we do not necessarily need to drive forward this mechanism by passing new legislation. I look forward to the Minister’s comments.
Amendment 42 would require the Information Commissioner to draw up a code of practice setting out how data communities must operate and how data controllers and processors should engage with these communities. Amendment 43 would create a register of data communities and additional responsibilities for the data community controller. I appreciate the intent of the noble Baroness, Lady Kidron, in trying to ensure data security and transparency in the operation of data communities. If we on these Benches supported the idea of their creation in this Bill, we would surely have to implement mechanisms of the type proposed in these amendments. However, this observation confirms us in our view that the administration required to operate these communities is starting to look rather burdensome. We should be looking to encourage the use of data to generate economic growth and to make people’s lives easier. I am concerned that the regulation of data communities, were it to proceed as envisaged by these amendments, might risk doing just the opposite. That said, I will listen with interest to the response of noble Lords and the Minister.
My understanding is that “customer” reflects an individual, but I am sure that the Minister will give a better explanation at the meeting with officials next week.
Again before the Minister sits down—I am sure he will not be able to sit down for long—would he open that invitation to a slightly wider group?
I thank the noble Lord for that request, and I am sure my officials would be willing to do that.
My Lords, I support my noble friend. I have a confession to make. Before this Bill came up, I foolishly thought that sex and gender were the same thing. I have discovered that they are not. Gender is not a characteristic defined in UK law. I believe that you are born with a biological sex, as being male or female, and that some people will choose, or need, to have a gender reassignment or to identify as a different gender. I thank the charity Sex Matters, which works to provide clarity on this issue of sex in law.
As my noble friend Lord Lucas said, the digital verification system currently operates on the basis of chosen gender, not of sex at birth. You can change your records on request without even having a gender recognition certificate. That means that, over the last five years, at least 3,000 people have changed their passports to show the wrong sex. Over the last six years, at least 15,000 people have changed their driving licences. The NHS has no record of how many people now have a different sex recorded from the one they had at birth. It is thought that perhaps 100,000 people have one sex indicated in one record and a different sex in another. We cannot go on like that.
The consequences of this are really concerning. It means people with mismatched identities risk being flagged up as a synthetic identity risk. It means authorities with statutory safeguarding responsibilities will not be able to assess the risk that they are trying to deal with. It means that illnesses may be misdiagnosed and treatments misprescribed if the wrong sex is stated in someone’s medical records. The police will be unable to identify people if they are looking in the wrong records. Disclosure and Barring Service checks may fail to match individuals with the wrong sex. I hope that the Government will look again at correcting this. It is a really important issue.
My Lords, I will speak to Amendments 7 and 9. Amendment 7 would require the Secretary of State to lay the DVS trust framework before Parliament. Given the volume of sensitive data that digital ID providers will be handling, it is crucial for Parliament to oversee the framework rules governing digital verification service providers.
The amendment is essentially one that was tabled in Committee by the noble Viscount, Lord Camrose. I thought that he expressed this well in Committee, emphasising that such a fundamental framework demands parliamentary approval for transparency and accountability, regardless of the document’s complexity. This is an important framework with implications for data privacy and security, and should not be left solely to the discretion of the Secretary of State.
The DPRRC, in its ninth report, and the Constitution Committee, in its third report of the Session, also believed that the DVS trust framework should be subject to parliamentary scrutiny. The former did so because the framework has legislative effect; it recommended the affirmative procedure, which would require Parliament to actively approve the framework, since otherwise the Secretary of State has significant power without adequate parliamentary involvement. The latter committee, the Constitution Committee, said:
“We reiterate our statement from our report on the Data Protection and Digital Information Bill that ‘[d]ata protection is a matter of great importance in maintaining a relationship of trust between the state and the individual. Access to personal data is beneficial to the provision of services by the state and assists in protecting national security. However, the processing of personal data affects individual rights, including the right to respect for private life and the right to freedom of expression. It is important that the power to process personal data does not become so broad as to unduly limit those rights’”.
Those views are entirely consistent with the committee’s earlier stance on a similar provision in the previous Data Protection and Digital Information Bill. That was why it was so splendid that the noble Viscount tabled that amendment in Committee. It was like a Damascene conversion.
The noble Baroness, Lady Jones, argued in Committee and in correspondence that the trust framework is a highly technical document that Parliament might find difficult to understand. That is a bit of a red rag to a bull. However, this argument fails to address the core concerns about democratic oversight. The framework aims to establish a trusted digital identity marketplace by setting requirements for providers to gain certification as trusted providers.
I am extremely grateful to the Minister, the Bill team and the department for allowing officials to give the noble Viscount, Lord Camrose, and me a tutorial on the trust framework. It depends heavily on being voluntary in nature, with the UK Accreditation Service essentially overseeing the certifiers, such as BSI, Kantara and the Age Check Certification Scheme, which certify the providers, with ISO 17065 adopted as the governing standard.
Compliance is assured through the certification process, in which services are assessed against the framework rules by independent conformity assessment bodies accredited by the UK Accreditation Service. The trust framework establishes rules and standards for digital identity verification, but it does not directly contain specific provision for regulatory oversight or for redress mechanisms such as a specific ombudsman service, industry-led dispute resolution, set contract terms for consumer redress or enforcement powers. The Government say, however, that they intend to monitor the types of complaints received. Ultimately, the scope of the framework is limited to the rules providers must follow in order to remain certificated; it does not address governance matters.
Periodic certification alone is not enough to ensure ongoing compliance, and this highlights the lack of an independent mechanism to hold the Secretary of State accountable. The noble Baroness, Lady Jones, stated in Committee that the Government preferred a light-touch approach to regulating digital verification services. She believed that excessive parliamentary scrutiny would hinder innovation and flexibility in this rapidly evolving sector.
The Government have consistently emphasised that they have no plans to introduce mandatory digital IDs or ID cards. The focus is on creating a secure and trusted system that gives citizens more choice and control over their data. The attributes trust framework is a crucial step towards achieving the goal of a secure, trusted and innovative digital identity market—all the more reason to get the process for approval right.
These services will inevitably be high-profile. Digital ID is a sensitive area which potentially also involves age verification. These services could have a major impact on data privacy and security. Public debate on such a critical issue is crucial to build trust and confidence in these systems. Laying the DVS trust framework before Parliament would allow for a wider range of voices and perspectives to be heard, ensuring a more robust and democratic approval process.
I thank the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Arbuthnot, for their amendments and interest in the important area of digital verification services. I thank the noble Viscount, Lord Camrose, for his support for this being such an important thing to make life easier for people.
I will go in reverse order and start with Amendment 9. I thank the noble Lord, Lord Clement-Jones, for reconsidering his stance since Committee on the outright creation of these offences. Amendment 9 would create an obligation for the Secretary of State to review the need for digital identity theft offences. We believe this would be unnecessary, as existing legislation—for example, the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018—already addresses the behaviour targeted by this amendment.
However, we note the concerns raised and confirm that the Government are taking steps to tackle the issue. First, the Action Fraud service, which allows individuals to report fraud enabled by identity theft, is being upgraded with improved reporting tools, increased intelligence flows to police forces and better support services for victims. Secondly, the Home Office is reviewing the training offered to police officers who have to respond to fraud incidents, and identifying the improvements needed.
I am sorry to interrupt the Minister. He is equating digital identity theft to fraud, and that is not always the case. Is that the advice that he has received?
The advice is that digital identity theft would be captured by those Acts. Therefore, there is no need for a specific offence. However, as I said, the Government are taking steps to tackle this and will support the Action Fraud service as a way to deal with it, even though I agree that not everything falls under the classification of fraud.
I am sorry to interrupt the Minister again, but could he therefore confirm that, by reiterating his previous view that the Secretary of State should not have to bring the framework to Parliament, he disagrees with both the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, both of which made the same point on this occasion and on the previous Bill—that Parliament should look at the trust framework?
For the reasons that I have given, I think that the trust framework is a technical document and one best dealt with in this technical form. It is built on other assurance processes, with the United Kingdom Accreditation Service overseeing the conformity accreditation bodies that will test the digital verification services. In this case, our view is that it does not need to come under parliamentary scrutiny.
On Amendments 6 and 8 from the noble Lord, Lord Lucas, I am absolutely behind the notion that the validity of the data is critical. We have to get this right. Of course, the Bill itself takes the data from other sources, and those sources have authority to get the information correct, but it is important, for a digital service in particular, that this is dealt with very carefully and that we have good assurance processes.
On the specific point about gender identity, the Bill does not create or prescribe new ways in which to determine that, but work is ongoing to try to ensure that there is consistency and accuracy. The Central Digital and Data Office has started to progress work on developing data standards for key entities and their attributes, to ensure that the way data is organised, stored and shared is consistent between public authorities. Work has also commenced via the domain expert group on the person entity, which has representation from the Home Office, HMRC, the Office for National Statistics—importantly—NHS England, the Department for Education, the Ministry of Justice, the Local Government Association and the Police Digital Service. The group has been established as a pilot under the Data Standards Authority to help to ensure consistency across organisations, and specific pieces of work are going on relating to gender in that area.
The measures in Part 2 are intended to help secure the reliability of the process through which citizens can verify their identity digitally. They do not intervene in how government departments record and store identity data. In clarifying this important distinction, and with reference to the further information I will set out, I cannot support the amendments.
My Lords, I support the conclusions of the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, and I beg leave to seek the opinion of the House.
My Lords, Amendments 10 and 12 seek to amend Clauses 56 and 58, which form part of the national underground asset register provisions. These two minor, technical amendments address a duplicate reference to “the undertaker’s employees” and replace it with the correct reference to “the contractor’s employees”. I reassure noble Lords that the amendments do not have a material policy effect and are intended to correct the drafting. I beg to move.
My Lords, I thank the Minister for these two technical amendments. I take this opportunity to thank him also for responding to correspondence about LinesearchbeforeUdig and its wish to meet government and work with existing services to deliver what it describes as the safe digging elements of the NUAR. The Minister has confirmed that the heavy lifting on this—not heavy digging—will be carried out by the noble Baroness, Lady Jones, on her return, which I am sure she will look forward to. As I understand it, officials will meet LinesearchbeforeUdig this week, and they will look at the survey carried out by the service. We have made some progress since Committee, and I am grateful to the Minister for that.
My Lords, given that these are technical amendments, correcting wording errors, I have little to add to the remarks already made. We have no concerns about these amendments and will not seek to oppose the Government in making these changes.
My Lords, I support my noble friend Lord Colville. He has made an excellent argument, and I ask noble Lords on the Government Benches to think about it very carefully. If it is good enough for health data, it is good enough for the rest of science. In the interest of time, I will give an example of one of the issues, rather than repeat the excellent argument made by my noble friend.
In Committee, I asked the Government three times whether the cover of scientific research could be used, for example, to market-test ways to hack human responses to dopamine in order to keep children online. In the Minister’s letter, written during Committee, she could not say that the A/B testing of millions of children to make services more sticky—that is, more addictive—would not be considered scientific, but rather that the regulator, the ICO, could decide on a case-by-case basis. That is not good enough.
There is no greater argument for my noble friend Lord Colville’s amendment than the fact that the Government are unable to say if hacking children’s attention for commercial gain is scientific or not. We will come to children and child protection in the Bill in the next group, but it is alarming that the Government feel able to put in writing that this is an open question. That is not what Labour believed in opposition, and it is beyond disappointing that, now in government, Labour has forgotten what it then believed. I will be following my noble friend through the Lobby.
My Lords, it is almost impossible to better the arguments put forward by the noble Viscount, Lord Colville, and the noble Baroness, Lady Kidron, so I am not even going to try.
The inclusion of a public interest requirement would ensure that the use of data for scientific research would serve a genuine societal benefit, rather than primarily benefiting private interests. This would help safeguard against the misuse of data for purely commercial purposes under the guise of research. The debate in Committee highlighted the need for further clarity and stronger safeguards in the Bill, to ensure that data for scientific research genuinely serves the public interest, particularly concerning the sensitive data of children. The call for a public interest requirement reflects the desire to ensure a balance between promoting research and innovation and upholding the rights and interests of data subjects. I very much hope that the House will support this amendment.
My Lords, we are playing a bit of Jack-in-the-box. When I was being taught law by a wonderful person from Gray’s Inn who was responsible for drafting the constitution of Uganda’s independence, Sir Dingle Foot, he said a phrase which struck me and has always stayed with me: law is a statement of public policy. The noble Viscount, Lord Colville, seeks to ensure that, if there is to be scientific work, it is conducted “in the public interest”. Law does not express itself for its own sake; it does so for the public, as public policy. It would be a wonderful phrase to include, and I hope the Minister will accept it so that we do not have to vote on it.
My Lords, I was one of those who was up even earlier than the noble Baroness, Lady Harding, and managed to get my name down on these amendments. It puts me in a rather difficult position to be part of the government party but to seek to change what the Government have arrived at as their sticking position in relation to this issue in particular—and indeed one or two others, but I have learned to live with those.
This one caught my eye in Committee. I felt suddenly, almost exactly as the noble Lord, Lord Russell, said, a sense of discontinuity in relation to what we thought was in the Government’s DNA—that is, to bring forward the right solution to the problems that we have been seeking to address in other Bills. With the then Online Safety Bill, we seemed to have an agreement around the House about what we wanted, but every time we put it back to the officials and people went away with it and came back with other versions, it got worse and not better. How children are dealt with and how important it is to make sure that they are prioritised appears to be one of those problems.
The amendments before us—and I have signed many of them, because I felt that we wanted to have a good and open debate about what we wanted here—do not need to be passed today. It seems to me that the two sides are, again, very close in what we want to achieve. I sensed from the excellent speech of the noble Baroness, Lady Kidron, that she has a very clear idea of what needs to go into this Bill to ensure that, at the very least, we do not diminish the sensible way in which we drafted the 2018 Bill. I was part of that process as well; I remember those debates very well. We got there because we hammered away at it until we found a way of finding the right words that bridged the two sides. We got closer and closer together, but sometimes we had to go even beyond what the clerks would feel comfortable with in terms of government procedure to do that. We may be here again.
When he comes to respond, can the Minister commit to us today in this House that he will bring back at Third Reading a version of what he has put forward—which I think we all would say does not quite go far enough; it needs a bit more, but not that much more—to make it meet with where we currently are and where, guided by the noble Baroness, Lady Kidron, we should be in relation to the changing circumstances in both the external world and indeed in our regulator, which of course is going to go through a huge change as it reformulates itself? We have an opportunity, but there is also a danger that we do not take it. If we weaken ourselves now, we will not be in the right position in a few years’ time. I appeal to my noble friend to think carefully about how he might manage this process for the best benefit of all of us. The House, I am sure, is united about where we want to get to. The Bill does not get us there. Government Amendment 18 is too modest in its approach, but it does not need a lot to get it there. I think there is a way forward that we do not need to divide on. I hope the Minister will take the advice that has been given.
My Lords, we have heard some of the really consistent advocates for children’s online protection today. I must say that I had not realised that the opportunity of signing the amendments of the noble Baroness, Lady Kidron, was rather like getting hold of Taylor Swift tickets—clearly, there was massive competition and rightly so. I pay tribute not only to the speakers today but in particular to the noble Baroness for all her campaigning, particularly with 5Rights, on online child protection.
All these amendments are important for protecting children’s data, because they address concerns about data misuse and the need for heightened protection for children in the digital environment, with enhanced oversight and accountability in the processing of children’s data. I shall not say very much. If the noble Baroness pushes Amendment 20 to a vote, I want to make sure that we have time before the dinner hour to do so, which means going through the next group very quickly. I very much hope that we will get a satisfactory answer from the Minister. The sage advice from the noble Lord, Lord Stevenson, hit the button exactly.
Amendment 20 is particularly important in this context. It seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A. As the noble Baroness explains, that means that personal data originally collected from a child with consent for a specific purpose could not be reused for a different, incompatible purpose without obtaining fresh consent, even if the child is now an adult. In my view, that is core. I hope the Minister will come back in the way that has been requested by the noble Lord, Lord Stevenson, so we do not have to have a vote. However, we will support the noble Baroness if she wishes to test the opinion of the House.
My Lords, I too thank the noble Baroness, Lady Kidron, for all her amendments in this group, and I thank the Minister for his amendment.
Amendment 15 seeks to maintain the high level of legal protection for children’s data even where protections for adults may be eased in the context of scientific research. I acknowledge the concerns raised about the potential implications that this amendment could have for medical research and safeguarding work. It is important to recognise that young people aged 16 and over are entitled to control their medical information under existing legal frameworks, reflecting their ability to understand and consent in specific contexts.
There is a legitimate concern that by excluding all children categorically, including those aged 16 and 17, we risk impeding critical medical research that could benefit young people themselves. Research into safeguarding may also be impacted by such an amendment. Studies that aim to improve systems for identifying and preventing abuse or neglect rely on the careful processing of children’s data. If this amendment were to inadvertently create a barrier to such vital work, we could find ourselves undermining some of the protections that it seeks to reinforce.
That said, the amendment highlights an important issue: the need to ensure that ethical safeguards for children remain robust and proportionate. There is no question that the rights and welfare of children should remain paramount in research contexts, but we must find the right balance—one that allows valuable, ethically conducted research to continue without eroding the legal protections that exist for children’s data. So I welcome the intent of the amendment in seeking to protect children, of course, and I urge us, as the noble Lord, Lord Stevenson, put it, to continue working collaboratively to achieve a framework that upholds their rights without hindering progress in areas that ultimately serve their best interests.
As with the previous amendment, I recognise the intent of Amendment 16, which seeks to protect children’s data by excluding them from the scope of recognised legitimate interests. Ensuring that children continue to benefit from the highest level of legal protection is a goal that, needless to say, we all share. However, I remain concerned that this could have less desirable consequences too, particularly in cases requiring urgent safeguarding action. There are scenarios where swift and proportionate data processing is critical to protecting a child at risk, and it is vital that the framework that we establish does not inadvertently create barriers to such essential work.
I am absolutely in support of Amendment 20. It provides an important safeguard by ensuring that children’s data is not used for purposes beyond those for which it was originally collected, unless it is fully compatible with the original purpose. Children are particularly vulnerable when it comes to data processing and their understanding of consent is limited. The amendment would strengthen protection for children by preventing the use of their data in ways that were not made clear to them or their guardians at the time of collection. It would ensure that children’s data remained secure and was not exploited for unrelated purposes.
On Amendment 22, the overarching duty proposed in this new clause—to prioritise children’s best interests and ensure that their data is handled with due care and attention—aligns with the objective that we all share of safeguarding children in the digital age. We also agree with the principle that the protections afforded to children’s data should not be undermined or reduced, and that those protections should remain consistent with existing standards under the UK GDPR.
However, although we support the intent of the amendment, we have concerns about the reference to the UN Convention on the Rights of the Child and general comment 25. Although these international frameworks are important, we do not believe they should be explicitly tied into this legislation. Our preference would be for a redraft of this provision that focused more directly on UK law and principles, ensuring that the protections for children’s data were robust and tailored to our legal context, rather than linking it to international standards in a way that could create potential ambiguities.
(1 week, 3 days ago)
Lords Chamber
I apologise for interrupting the Minister, in what sounded almost like full flow. I am sure that he was so eager to move his amendment.
In moving Amendment 17, I will speak also to Amendment 21. These aim to remove the Secretary of State’s power to override primary legislation and modify key aspects of the UK data protection law via statutory instruments. They are similar to those proposed by me to the previous Government’s Data Protection and Digital Information Bill, which the noble Baroness, Lady Jones of Whitchurch, then in opposition, supported. These relate to Clauses 70(4) and 71(5).
There are a number of reasons to support accepting these amendments. The Delegated Powers and Regulatory Reform Committee has expressed concerns about the broad scope of the Secretary of State’s powers, as it did previously in relation to the DVS scheme. It recommended removing the power from the previous Bill, and in its ninth report it maintains this view for the current Bill. The Constitution Committee has said likewise; I will not read out what it said at the time, but I think all noble Lords know that both committees were pretty much on the same page.
The noble Baroness, Lady Jones, on the previous DPDI Bill, argued that there was no compelling reason for introducing recognised legitimate interests. On these Benches, we agree. The existing framework already allows for data sharing with the public sector and data use for national security, crime detection and safeguarding vulnerable individuals. However, the noble Baroness, in her ministerial capacity, argued that swift changes might be needed—hence the necessity for the Secretary of State’s power. Nevertheless, the DPRRC’s view is that the grounds for the lawful processing of personal data are fundamental and should not be subject to modification by subordinate legislation.
The letter from the Minister, the noble Lord, Lord Vallance, to the Constitution Committee and the DPRRC pretty much reiterates those arguments. I will not go through all of it again, but I note, in closing, that in his letter he said:
“I hope it will reassure the Committee that the power will be used only when necessary and in the public interest”.
He could have come forward with an amendment to that effect at any point in the passage of the Bill, but he has not. I hope that, on reflection—in the light of both committees’ repeated recommendations, the potential threats to individual privacy and data adequacy, and the lack of strong justification for these powers—the Minister will accept these two amendments. I beg to move.
My Lords, I must inform the House that if Amendment 17 is agreed to, I cannot call Amendment 18 for reasons of pre-emption.
My Lords, government Amendment 18 is similar to government Amendment 40 in the previous group, which added an express reference to children meriting specific protection to the new ICO duty. This amendment will give further emphasis to the need for the Secretary of State to consider the fact that children merit specific protection when deciding whether to use powers to amend the list of recognised legitimate interests.
Turning to Amendment 17 from the noble Lord, Lord Clement-Jones, I understand the concerns that have been raised about the Secretary of State’s power to add or vary the list of recognised legitimate interests. This amendment seeks to remove the power from the Bill.
In response to some of the earlier comments, including from the committees, I want to make it clear that we have constrained these powers more tightly than they were in the previous data Bill. Before making any changes, the Secretary of State must consider the rights and freedoms of individuals, paying particular attention to children, who may be less aware of the risks associated with data processing. Furthermore, any addition to the list must meet strict criteria, ensuring that it serves a clear and necessary public interest objective as described in Article 23.1 of the UK GDPR.
The Secretary of State is required to consult the Information Commissioner and other stakeholders before making any changes, and any regulations must then undergo the affirmative resolution procedure, guaranteeing parliamentary scrutiny through debates in both Houses. Retaining this regulation-making power would allow the Government to respond quickly if future public interest activities are identified that should be added to the list of recognised legitimate interests. However, the robust safeguards and limitations in Clause 70 will ensure that these powers are used both sparingly and responsibly.
I turn now to Amendment 21. As was set out in Committee, there is already a relevant power in the current Data Protection Act to provide exceptions. We are relocating the existing exemptions, so the current power, so far as it relates to the purpose limitation principle, will no longer be relevant. The power in Clause 71 is intended to take its place. In seeking to reassure noble Lords, I want to reiterate that the power cannot be used for purposes other than the public interest objectives listed in Article 23.1 of the UK GDPR. It is vital that the Government can act quickly to ensure that public interest processing is not blocked. If an exemption is misused, the power will also ensure that action can be swiftly taken to protect data subjects by placing extra safeguards or limitations on it.
My Lords, I thank the Minister for that considered reply. It went into more detail than the letter he sent to the two committees, so I am grateful for that, and it illuminated the situation somewhat. But at the end of the day, the Minister is obviously intent on retaining the regulation-making power.
I thank the noble Viscount, Lord Camrose, for his support—sort of—in principle. I am not quite sure where that fitted; it was post-ministerial language. I think he needs to throw off the shackles of ministerial life and live a little. These habits die hard but in due course, he will come to realise that there are benefits in supporting amendments that do not give too much ministerial power.
Turning to one point of principle—I am not going to press either amendment—it is a worrying trend that both the previous Government and this Government seem intent on simply steamrollering through powers for Secretaries of State in the face of pretty considered comment by House of Lords committees. This trend has been noted, first for skeletal Bills and secondly for Bills that, despite not being skeletal, include a lot of regulation-making power for Secretaries of State, and Henry VIII powers. So I just issue a warning that we will keep returning to this theme and we will keep supporting and respecting committees of this House, which spend a great deal of time scrutinising secondary legislation and warning of overweening executive power. In the meantime, I beg leave to withdraw Amendment 17.
My Lords, I do not think the noble Baroness, Lady Harding, lost the audience at all; she made an excellent case. Before speaking in support of the noble Baroness, I should say, “Blink, and you lose a whole group of amendments”. We seem to have completely lost sight of the group starting with Amendment 19—I know the noble Lord, Lord Holmes, is not here—and including Amendments 23, 74 and government Amendment 76, which seems to have been overlooked. I suggest that we degroup next week and come back to Amendments 74 and 76. I do not know what will happen to Amendment 23; I am sure there is a cunning plan on the Opposition Front Bench to reinstate that in some shape or form. I just thought I would gently point that out, since we are speeding along and forgetting some of the very valuable amendments that have been tabled.
I very much support, as I did in Committee, what the noble Baroness, Lady Harding, said about Amendment 24, which aims to clarify the use of open electoral register data for direct marketing. The core issue is the interpretation of Article 14 of the GDPR, specifically regarding the disproportionate effort exemption. The current interpretation, influenced by recent tribunal rulings, suggests that companies using open electoral register—OER—data would need to notify every individual whose data is used, even if they have not opted out. As the noble Baroness, Lady Harding, implied, notifying millions of individuals who have not opted out is unnecessary and burdensome. Citizens are generally aware of the OER system, and those who do not opt out reasonably expect to receive direct marketing materials. The current interpretation leads to excessive, unhelpful notifications.
There are issues about financial viability. Requiring individual notifications for the entire OER would be financially prohibitive for companies, potentially leading them to cease using the register altogether. On respect for citizens’ choice, around 37% of voters choose not to opt out of OER use for direct marketing, indicating their consent to such use. The amendment upholds this choice by exempting companies from notifying those individuals, which aligns with the GDPR’s principle of respecting data subject consent.
On clarity and certainty, Amendment 24 provides clear exemptions for OER data use, offering legal certainty for companies while maintaining data privacy and adequacy. This addresses the concerns about those very important tribunal rulings creating ambiguity and potentially disrupting legitimate data use. In essence, Amendment 24 seeks to reconcile the use of OER data for direct marketing with the principles of transparency and data subject rights. On that basis, we on these Benches support it.
I turn to my amendment, which seeks a soft opt-in for charities. As we discussed in Committee, a soft opt-in in Regulation 22 of the Privacy and Electronic Communications (EC Directive) Regulations 2003 allows organisations to send electronic mail marketing to existing customers without their consent, provided that the communication is for similar products and services and the messages include an “unsubscribe” link. The soft opt-in currently does not apply to non-commercial organisations such as charities and membership organisations. The Data & Marketing Association estimates that extending the soft opt-in to charities would
“increase … annual donations in the UK by £290 million”.
Extending the soft opt-in as proposed in both the Minister’s and my amendment would provide charities with a level playing field, as businesses have enjoyed this benefit since the introduction of the Privacy and Electronic Communications Regulations. Charities across the UK support this change. For example, the CEO of Mind stated:
“Mind’s ability to reach people who care about mental health is vital. We cannot deliver life changing mental health services without the financial support we receive from the public”.
Oxfam’s individual engagement director noted:
“It’s now time to finally level the playing field for charities too and to allow them to similarly engage their passionate and committed audiences”.
Topically, too, this amendment is crucial to help charities overcome the financial challenges they face due to the cost of living crisis and the recent increase in employer national insurance contributions. So I am delighted, as I know many other charities will be, that the Government have proposed Amendment 49, which achieves the same effect as my Amendment 50.
My Lords, I declare an interest in that my younger daughter works for a charity which will rely heavily on the amendments that have just been discussed by the noble Lord, Lord Clement-Jones.
I want to explain that my support for the amendment moved by the noble Baroness, Lady Harding, was not inspired by any quid pro quo for earlier support elsewhere—certainly not. Looking through the information she had provided, and thinking about the issue and what she said in her speech today, it seemed there was an obvious injustice happening. It seemed wrong, in a period when we are trying to support growth, that we cannot see our way through it. It was in that spirit that I suggested we should push on with it and bring it back on Report, and I am very happy to support it.
My Lords, I will speak to Amendments 28, 29, 33, 34 and 36. I give notice that I will only speak formally to Amendment 33. For some reason, it seems to have escaped this group and jumped into the next one.
As we discussed in Committee, and indeed on its previous versions, the Bill removes the general prohibition on solely automated decisions and places the responsibility on individuals to enforce their rights rather than on companies to demonstrate why automation is permissible. The Bill also amends Article 22 of the GDPR so that protection against solely automated decision-making applies only to decisions made using sensitive data such as race, religion and health data. This means that decisions based on other personal data, such as postcode, nationality, sex or gender, would be subject to weaker safeguards, increasing the risk of unfair or discriminatory outcomes. This will allow more decisions with potentially significant impacts to be made without human oversight, even if they do not involve sensitive data. This represents a significant weakening of existing protection against unsafe automated decision-making. That is why I tabled Amendment 33 to leave out the whole clause.
However, the Bill replaces the existing Article 22 with Articles 22A to 22D, which redefine automated decisions and allow for solely automated decision-making in a broader range of circumstances. This change raises concerns about transparency and the ability of individuals to challenge automated decisions. Individuals may not be notified about the use of ADM, making it difficult to exercise their rights. Moreover, the Bill’s safeguards for automated decisions, particularly in the context of law enforcement, are weaker compared with the protections offered by the existing Article 22. This raises serious concerns about the potential for infringement of people’s rights and liberties in areas such as policing, where the use of sensitive data in ADM could become more prevalent. Additionally, the lack of clear requirements for personalised explanations about how ADM systems reach decisions further limits individuals’ understanding of and ability to challenge outcomes.
In the view of these Benches, the Bill significantly weakens safeguards around ADM, creates legal uncertainty due to vague definitions, increases the risk of discrimination, and limits transparency and redress for individuals—ultimately undermining public trust in the use of these technologies. I retabled Amendments 28, 29, 33 and 34 from Committee to address continuing concerns regarding these systems. The Bill lacks clear definitions of crucial terms such as “meaningful human involvement” and, similarly, “significant effect”, which are essential for determining the scope of protection. That lack of clarity could lead to varying interpretations and inconsistencies in application, creating legal uncertainty for individuals and organisations.
In Committee, the noble Baroness, Lady Jones, emphasised the Government’s commitment to responsible ADM and argued against defining meaningful human involvement in the Bill, but instead for allowing the Secretary of State to define those terms through delegated legislation. However, that raises concerns about transparency and parliamentary oversight, as these are significant policy decisions. Predominantly automated decision-making should be included in Clause 80, as in Amendment 28, as a decision may lack meaningful human involvement and significantly impact individuals’ rights. The assertion by the noble Baroness, Lady Jones, that predominantly automated decisions inherently involve meaningful human oversight can be contested, particularly given the lack of a clear definition of such involvement in the Bill.
There are concerns that changes in the Bill will increase the risk of discrimination, especially for marginalised groups. The noble Baroness, Lady Jones, asserted in Committee that the data protection framework already requires adherence to the Equality Act. However, that is not enough to prevent algorithmic bias and discrimination in ADM systems. There is a need for mandatory bias assessments of all ADM systems, particularly those used in the public sector, as well as for greater transparency in how those systems are developed and deployed.
We have not returned to the fray on the ATRS, but it is clear that a statutory framework for the ATRS is necessary to ensure its effectiveness and build trust in public sector AI. Despite the assurance by the noble Baroness, Lady Jones, that the ATRS is mandatory for government departments, its implementation relies on a cross-government policy mandate that lacks statutory backing and may prove insufficient to ensure the consistent and transparent use of algorithmic tools.
My Amendment 34 seeks to establish requirements for public sector organisations using ADM systems. Its aim is to ensure transparency and accountability in the use of these systems by requiring public authorities to publish details of the systems they use, including the purpose of the system, the data used and any mitigating measures to address risks. I very much welcome Amendment 35 from the noble Baroness, Lady Freeman, which would improve it considerably and which I have also signed. Will the ATRS do as good a job as that amendment?
Concerns persist about the accessibility and effectiveness of this mechanism for individuals seeking redress against potentially harmful automated decisions. A more streamlined and user-friendly process for challenging automated decisions is needed in the age of increasing ADM. The lack of clarity and specific provisions in the Bill raises concerns about its effectiveness in mitigating the risks posed by automated systems, particularly in safeguarding vulnerable groups such as children.
My Amendment 36 would require the Secretary of State, within six months of the Act passing, to produce a definition of “meaningful human involvement” in ADM in collaboration with the Information Commissioner’s Office, or to set out clearly their reasoning as to why that is not required. The amendment is aimed at addressing the ambiguity surrounding “meaningful human involvement” and ensuring that there is a clear understanding of what constitutes appropriate human oversight in ADM processes.
I am pleased that the Minister has promised a code of practice, but what assurance can he give regarding the forthcoming ICO code of practice on automated decision-making? How will it provide clear guidance on how to implement and interpret the safeguards for ADM, and will it address the definition of meaningful human involvement? What forms of redress will it require to be established? What level of transparency will be required? A code of practice offered by the Minister would be acceptable, provided that the Secretary of State did not have the sole right to determine the definition of meaningful human involvement. I therefore hope that my Amendment 29 will be accepted alongside Amendment 36, because it is important that the definition of such a crucial term should be developed independently, and with the appropriate expertise, to ensure that ADM systems are used fairly and responsibly, and that individual rights are adequately protected.
Amendments 31 and 32 from the Opposition Front Bench seem to me to have considerable merit, particularly Amendment 32, in terms of the nature of the human intervention. However, I confess to some bafflement as to the reasons for Amendment 26, which seeks to insert the OECD principles set out in the AI White Paper. Indeed, they were the G20 principles as well; they are fully supportable in the context of an AI Bill, for instance, and I very much hope that they will form Clause 1 of a new AI Bill going forward. I am not going to go into great detail, but I wonder whether those principles are already effectively addressed in data protection legislation. If we are not careful, we are going to find a very confused regulator in these circumstances. So, although there is much to commend the principles as such, whether they are a practical proposition in a Bill of this nature is rather moot.
My Lords, I support Amendment 34 from the noble Lord, Lord Clement-Jones, and will speak to my own Amendment 35, which amends it. When an algorithm is being used to make important decisions about our lives, it is vital that everyone is aware of what it is doing and what data it is based on. On Amendment 34, I know from having had responsibility for algorithmic decision support tools that users are very interested in how recent the data it is based on is, and how relevant it is to them. Was the algorithm derived from a population that included people who share their characteristics? Subsection (1)(c)(ii) of the new clause proposed in Amendment 34 refers to regular assessment of the data used by the system. I would hope that this would be part of the meaningful explanation to individuals to be prescribed by the Secretary of State in subsection (1)(b).
Amendment 35 would add to this that it is vital that all users and procurers of such a system understand its real-world efficacy. I use the word “efficacy” rather than “accuracy” because it might be difficult to define accuracy with regard to some of these systems. The procurer of any ADM system should want to know how accurate it is under realistic testing, and users should also be aware of those findings. Does the system give the same outcome as a human assessor 95% or 60% of the time? Is that the same for all kinds of queries, or is it more accurate for some groups of people than others? The efficacy is really one of the most important aspects and should be public. I have added an extra line that ensures that this declaration of efficacy would be kept updated. One would hope that the performance of any such system would be monitored anyway, but this ensures that the outcomes of such monitoring are in the public domain.
In Committee, the Minister advised us to wait for publication of the algorithmic transparency records that were released in December. Looking at them, I think they make clear the much greater need for guidance and stringency in what should be mandated. I will give two short examples from those records. For the DBT: Find Exporters algorithm, under “Model performance” it merely says that it uses Brier scoring and other methods, without giving any actual results of that testing to indicate how well it performs. It suggests looking at the GitHub pages. I followed that link, and it did not allow me in. The public have no access to those pages. This is why these performance declarations need to be mandated and placed in the public domain.
In the second example, the Cambridgeshire trial of an externally supplied object detection system just cites the company’s test data, claiming an average precision of 43.5% in a “testing environment”. This does not give the user a lot of information. Again, it links to GitHub pages produced by the supplier. Admittedly, this is a trial, so perhaps the Cambridgeshire Partnership will update it with its real-world trial data. But that is why we need annual updates of performance data, and why that data must be more than a report of the supplier’s claims in a test environment.
The current model of algorithmic transparency records is demonstrably not fit for purpose, and these provisions would help put them on a much firmer footing. These systems, after all, are making life-changing decisions for all of us and we all need to be sure how well they are doing and put appropriate levels of trust in them accordingly.
I start with Amendment 26, tabled by the noble Viscount, Lord Camrose. As he said in Committee, a principles-based approach ensures that our rules remain fit for purpose in the face of fast-evolving technologies by avoiding being overly prescriptive. The data protection framework achieves this by requiring organisations to apply data protection principles when personal data is processed, regardless of the technology used.
I agree with the principles as they apply to AI, which are useful in the context in which they were put together, but introducing separate principles for AI could cause confusion around how data protection principles are interpreted when using other technologies. I note the comment that there is significant overlap between the two sets of principles, and the noble Viscount’s comment that there are situations in which one would catch things that the other would not. I am unable to see what those particular examples are, and I hope that the noble Viscount will agree with the Government’s rationale for seeking to protect the framework’s technology-neutral set of principles, rather than having two separate sets.
Amendment 28 from the noble Lord, Lord Clement-Jones, would extend the existing safeguards for decisions based on solely automated processing to decisions based on predominantly automated processing. These safeguards protect people when there is no meaningful human involvement in the decision-making. The introduction of predominantly automated decision-making, which already includes meaningful human involvement—and I shall say a bit more about that in a minute—could create uncertainty over when the safeguards are required. This may deter controllers from using automated systems that have significant benefits for individuals and society at large. However, the Government agree with the noble Viscount on strengthening the protections for individuals, which is why we have introduced a definition for solely automated decision-making as one which lacks “meaningful human involvement”.
I thank noble Lords for Amendments 29 and 36 and the important points raised in Committee on the definition of “meaningful human involvement”. This terminology, introduced in the Bill, goes beyond the current UK GDPR wording to prevent cursory human involvement being used to rubber stamp decisions as not being solely automated. The point at which human involvement becomes meaningful is context specific, which is why we have not sought to be prescriptive in the Bill. The ICO sets out in its guidance its interpretation that meaningful human involvement must be active: someone must review the decision and have the discretion to alter it before the decision is applied. The Government’s introduction of “meaningful” into primary legislation does not change this definition, and we are supportive of the ICO’s guidance in this space.
As such, the Government agree on the importance of the ICO continuing to provide its views on the interpretation of terms used in the legislation. Our reforms do not remove the ICO’s ability to do this, or to advise Parliament or the Government if it considers that the law needs clarification. The Government also acknowledge that there may be a need to provide further legal certainty in future. That is why there are a number of regulation-making powers in Article 22D, including the power to describe meaningful human involvement or to add additional safeguards. These could be used, for example, to impose a timeline on controllers to provide human intervention upon the request of the data subject, if evidence suggested that this was not happening in a timely manner following implementation of these reforms. Any regulations must follow consultation with the ICO.
Amendment 30 from the noble Baroness, Lady Kidron, would prevent law enforcement agencies from seeking the consent of a young person to the processing of their special category or sensitive personal data when using automated decision-making. I thank her for this amendment and agree about the importance of protecting the sensitive personal data of children and young adults. We believe that automated decision-making will continue to be rarely deployed in the context of law enforcement decision-making as a whole.
Likewise, consent is rarely used as a lawful basis for processing by law enforcement agencies, which are far more likely to process personal data for the performance of a task, such as questioning a suspect or gathering evidence, as part of a law enforcement process. Where consent is needed—for example, when asking a victim for fingerprints or something else—noble Lords will be aware that Clause 69 clearly defines consent under the law enforcement regime as
“freely given, specific, informed and unambiguous”
and
“as easy … to withdraw … as to give”.
So the tight restrictions on its use will be crystal clear to law enforcement agencies. In summary, I believe the taking of an automated decision based on a young person’s sensitive personal data, processed with their consent, to be an extremely rare scenario. Even when it happens, the safeguards that apply to all sensitive processing will still apply.
I thank the noble Viscount, Lord Camrose, for Amendments 31 and 32. Amendment 31 would require the Secretary of State to publish guidance specifying how law enforcement agencies should go about obtaining the consent of the data subject to process their data. To reiterate a point made by my noble friend Lady Jones in Committee, Clause 69 already provides a definition of “consent” and sets out the conditions for its use; they apply to all processing under the law enforcement regime, not just automated decision-making, so the Government believe this amendment is unnecessary.
Amendment 32 would require the person reviewing an automated decision to have sufficient competence and authority to amend the decision if required. In Committee, the noble Viscount also expressed the view that a person should be “suitably qualified”. Of course, I agree with him on that. However, as my noble friend Lady Jones said in Committee, the Information Commissioner’s Office has already issued guidance which makes it clear that the individual who reconsiders an automated decision must have the “authority and competence” to change it. Consequently, the Government do not feel that it is necessary to add further restrictions in the Bill as to the type of person who can carry out such a review.
The noble Baroness, Lady Freeman, raised extremely important points about the performance of automated decision-making. The Government already provide a range of products, but A Blueprint for Modern Digital Government, laid this morning, makes it clear that part of the new digital centre’s role will be to offer specialist assurance support, including, importantly in relation to this debate,
“a service to rigorously test models and products before release”.
That function will be in place and available to departments.
On Amendments 34 and 35, my noble friend Lady Jones previously advised the noble Lord, Lord Clement-Jones, that the Government would publish new algorithmic transparency recording standard (ATRS) records imminently. I am pleased to say that 14 new records were published on 17 December, with more to follow. I accept that these are not yet in the state in which we would wish them to be. Where these amendments seek to ensure that the efficacy of such systems is evaluated, A Blueprint for Modern Digital Government, as I have said, makes it clear that part of the digital centre’s role will be to offer such support, including this service. I hope that this provides reassurance.
My Lords, before the Minister sits down, I was given considerable assurance between Committee and Report that a code of practice, drawn up with the ICO, would be quite detailed in how it set out the requirements for those engaging in automated decision-making. The Minister seems to have given some kind of assurance that it is possible that the ICO will come forward with the appropriate provisions, but he has not really given any detail as to what that might consist of and whether it might meet some of the considerations that have been raised in Committee and on Report, not least Amendments 34 and 35, which have just been discussed as if the ATRS were going to cover all of that. Of course, any code would no doubt cover both the public and private sectors. What more can the Minister say about the kind of code that would be expected? We seem to be in somewhat of a limbo in this respect.
I apologise; I meant to deal with this at the end. I think I am dealing with the code in the next group.
My Lords, we have waited with bated breath for the Minister to show his hand, and I very much hope that he will reveal the nature of his bountiful offer of a code of practice on the use of automated decision-making.
I will wear it as a badge of pride to be accused of introducing an analogue concept by the noble Viscount, Lord Camrose. I am still keen to see the word “predominantly” inserted into the Bill in reference to automated decision-making.
As the Minister can see, there is considerable unhappiness with the nature of Clause 80. There is a view that it does not sufficiently protect the citizen in the face of automated decision-making, so I hope that he will be able to elaborate further on the nature of those protections.
I will not steal any of the thunder of the noble Baroness, Lady Kidron. For some unaccountable reason, Amendment 33 is grouped with Amendment 41. The groupings on this Bill have been rather peculiar and at this time of night I do not think any long speeches are in order, but it is important that we at least have some debate about the importance of a code of conduct for the use of AI in education, because it is something that a great many people in the education sector believe is necessary. I beg to move.
My Lords, I shall speak to Amendment 41 in my name and in the names of my noble friend Lord Russell, the noble Baroness, Lady Harding, and the noble Lord, Lord Clement-Jones. The House can be forgiven if it is sensing a bit of déjà vu, since I have proposed this clause once or twice before. However, since Committee, a couple of things have happened that make the argument for the code more urgent. We have now heard that the Prime Minister thinks that regulating AI is “leaning out” when we should be, as the tech industry likes to say, leaning in. We have had Matt Clifford’s review, which does not mention children even once. In the meantime, we have seen the rollout of AI in almost all products and services that children use. In one of the companies—a household name that I will not mention—an employee was so concerned that they rang me to say that nothing had been checked except whether the platform would fall over.
Amendment 41 does not seek to solve what is a global issue of an industry arrogantly flying a little too close to the sun, without grasping how we could use this extraordinary technology and put it to work for humankind on a more equitable basis than the current extractive, winner-takes-all model; it is far more modest than that. It simply says that products and services that engage with kids should undertake a mandatory process that considers their specific vulnerabilities related to age. I want to stress this point. When we talk about AI, increasingly we imagine the spectre of diagnostic benefits or the multiple uses of generative models, but of course AI is neither new nor confined to these uses. It is all around us and, in particular, it is all around children.
In 2021, Amazon’s AI voice assistant, Alexa, instructed a 10 year-old to touch a live electrical plug with a coin. Last year, Snapchat’s My AI gave adult researchers posing as a 13 year-old girl tips on how to lose her virginity with a 31 year-old. Researchers were also able to obtain tips on how to hide the smell of alcohol and weed and how to conceal Snapchat conversations from their parents. Meanwhile, character.ai is being sued by the mother of a 14 year-old boy in Florida who died by suicide after becoming emotionally attached to a companion bot that had encouraged him to take his own life.
In these cases, the companies in question responded by implementing safety measures after the fact, but how many children have to put their fingers in electrical sockets, injure themselves, take their own lives and so on before we say that those measures should be mandatory? That is all that the proposed code does. It asks that companies consider the ways in which their products may impact on children and, having considered them, take steps to mitigate known risks and put procedures in place to deal with emerging risks.
One of the frustrating things about being an advocate for children in the digital world is how much time I spend articulating avoidable harms. The sorts of solutions that come after the event, or suggestions that we ban children from products and services, take away from the fact that the vast majority of products and services could, with a little forethought, be places of education, entertainment and personal growth for children. However, children are by definition not fully mature, which puts them at risk. They chat with smart speakers, disclosing details that grown-ups might consider private. One study found that three to six year-olds believed that smart speakers have thoughts, feelings and social abilities, and considered them more reliable than human beings at answering fact-based questions.
I ask the Minister: should we ban children from the kitchen or living room in which the smart speaker lives, or demand, as we do of every other product and service, minimum standards of product safety based on the broad principle that we have a collective obligation to the safety and well-being of children? An AI code is not a stretch for the Bill. It is a bare minimum.
It will be clear to the ICO, from the amendments that have been tabled and from my comments, that there is an expectation that it should take into account the discussion we have had on this Bill.
My Lords, I thank the Minister for his very considered response. In the same way as the noble Baroness, Lady Kidron, I take it that, effectively, the Minister is pledging to engage directly with us and others about the nature and contents of the code, and that the ICO will also engage on that. As the Minister knows, the definition of terms such as “meaningful human involvement” is something that we will wish to discuss and consider in the course of that engagement. I hope that the AI edtech code will also be part of that.
I thank the Minister. I know he has had to think about this quite carefully during the Bill’s passage. Currently, Clause 80 is probably the weakest link in the Bill, and this amendment would go some considerable way towards repairing it. My final question is not to the Minister, but to the Opposition: what on earth have they got against the UN? In the meantime, I beg leave to withdraw my amendment.
My Lords, Amendment 37 is on the subject of data adequacy, which has been a consistent issue throughout the passage of the Bill. The mechanism put forward in the amendment would provide for a review of the question of data adequacy.
Safeguarding data adequacy is crucial for the UK’s economy and international partnerships. Losing data adequacy status would impose significant costs and administrative burdens on businesses and public sector organisations that share data between the UK and the EU. It would also hinder international trade and economic co-operation, and undermine trust in the UK’s digital economy, contradicting the Government’s objective of economic growth. I hope very much that the Government are proactively engaging with the European Commission to ensure a smooth adequacy renewal process this year.
Early engagement and reassurance about the retention of adequacy status are of crucial importance, given the looming deadline of June this year. This includes explaining and providing reassurance regarding any planned data protection reforms, particularly concerning the independence of the Information Commissioner’s Office, ministerial powers to add new grounds for data processing—for instance, recognised legitimate interests—and the new provisions relating to automated decision-making.
Despite assurances from the noble Baroness, Lady Jones, that the proposed changes will not dilute data subjects’ rights or threaten EU adequacy, proactive engagement with the EU and robust safeguards are necessary to ensure the continued free flow of data while maintaining high data protection standards. The emphasis on proportionality as a safeguard against the dilution of data subjects’ rights, as echoed by the noble Baroness, Lady Jones, and the ICO, is insufficient. The lack of a clear definition of proportionality within the context of data subjects’ rights could provide loopholes for controllers and undermine the essential equivalence required for data adequacy. The Bill’s reliance on the ICO’s interpretation of proportionality without explicit legislative clarity could be perceived as inadequate by the European Commission, particularly in areas such as web scraping for AI training.
The reassurance that the Government are taking data adequacy seriously and are committing to engaging with the EU needs to be substantiated by concrete actions. The Government do not, it appears, disclose assessments and reports relating to the compatibility of the UK’s domestic data protection framework with the Council of Europe’s Convention 108+, and that raises further concerns about transparency and accountability. Access to this information would enable scrutiny and informed debate, ultimately contributing to building trust and ensuring compatibility with international data protection standards.
In conclusion, while the Government maintain that this Bill would not jeopardise data adequacy, the concerns raised by myself and others during its passage mean that I continue to believe that a comprehensive review of EU data adequacy, as proposed in Amendment 37, is essential to ensure the continued free flow of data, while upholding high data protection standards and maintaining the UK’s position as a trusted partner in international data exchange. I beg to move.
I thank the noble Lord, Lord Clement-Jones, for his amendment, and the noble and learned Lord, Lord Thomas, for his contribution. I agree with them on the value and importance placed on maintaining our data adequacy decisions from the EU this year. That is a priority for the Government, and I reassure those here that we carefully considered all measures in the light of the EU’s review of our adequacy status when designing the Bill.
The Secretary of State wrote to the House of Lords European Affairs Committee on 20 November 2024 on this very point and I would be happy to share this letter with noble Lords if that would be helpful. The letter sets out the importance this Government place on renewal of our EU adequacy decisions and the action we are taking to support this process.
It is important to recognise that the EU undertakes its review of its decisions for the UK in a unilateral, objective and independent way. As the DSIT Secretary of State referenced in his appearance before the Select Committee on 3 December, it is important that we acknowledge the technical nature of the assessments. For that reason, we respect the EU’s discretion about how it manages its adequacy processes. I echo some of the points made by the noble Viscount, Lord Camrose.
That being said, I reassure noble Lords that the UK Government are doing all they can to support a swift renewal of our adequacy status in both technical preparations and active engagement. The Secretary of State met the previous EU Commissioner twice last year to discuss the importance of personal data sharing between the UK and EU. He has also written to the new Commissioner for Justice responsible for the EU’s review and looks forward to meeting Commissioner McGrath soon.
I also reassure noble Lords that DSIT and the Home Office have dedicated teams that have been undertaking preparations ahead of this review, working across government as needed. Those teams are supporting European Commission officials with the technical assessment as required. UK officials have met with the European Commission four times since the introduction of the Bill, with future meetings already in the pipeline.
My Lords, the noble and learned Lord, Lord Thomas, whose intervention I very much appreciated, particularly at this time of the evening, talked about a fresh pair of eyes. What kind of reassurance can the Minister give on that?
It is worth remembering that the ultimate decision rests with the EU Commission, and we are quite keen to have its eyes on this now, which is why we are engaging with it very carefully. The Commission is looking at the Bill as we go through it—we are talking to it, and we have dedicated teams of people brought together specifically to do this. There are several people from outside the direct construct of the Bill who are looking at this to make sure that we have adequacy, and we are having very direct conversations with the EU to ensure that the process proceeds as we would wish.
I thank the Minister for his response. It would be very reassuring if it were our own fresh pair of eyes rather than one from across the North Sea. That is all I can say as far as that is concerned. I appreciate what he said—that the Government are taking this seriously. It is a continuing concern precisely because the chair of the European Affairs Committee wrote to the Government. It is a continuing issue for those of us observing the passage of the Bill, and we will continue to keep our eyes on it as we go forward. I very much hope that June 2025 passes without incident and that the Minister’s predictions are correct. In the meantime, I beg leave to withdraw the amendment.
(2 weeks, 1 day ago)
Lords Chamber
I thank my noble friend for that important question. Where there is evidence of non-compliance, Ofcom has set out that it will move quickly to enforcement, and that action will follow in spring this year, because companies will have had three months to get their positions sorted out—I think that 16 March is the date by which they have to do it. Ofcom will be able to impose fines, including fines based on global revenue, and it will be able to apply to the courts for business disruption measures and have the flexibility to submit these applications urgently.
My Lords, the Minister’s response is somewhat baffling. Given the change made to the Bill as it passed through the House as a result of the amendment from the noble Baroness, Lady Morgan, it was quite clear that high-risk smaller platforms would be included in category 1 and bear all the consequences. Yet, despite the Secretary of State’s concerns, which were expressed in a letter last September, the Government have not insisted that Ofcom include those platforms in category 1. What does that mean? Why are the Government not taking proper legal advice and insisting that these smaller, high-risk platforms bear all the duties of category 1 services?
I thank the noble Lord for his question. Category 1, in the way that the Bill was ultimately approved, was for large sites with many users. The possibility remains that this threshold can be amended. It is worth remembering that category 1 imposes two additional duties: a duty that the company must apply its terms of service properly and a duty to give users the means to choose not to see certain things. For many of the small and harmful sites, those things would not apply anyway, because users have gone there deliberately to see what is there, but the full force of the Act applies to those small companies, which is why there is a special task force to make sure that it is applied properly.
(2 weeks, 1 day ago)
Lords Chamber
I welcome the Secretary of State’s Statement in this space, and I start with an apology. When I agreed to speak to this, I was told it would be first business after Questions, and I am afraid I have to leave for a flight midway through, so I apologise to noble Lords and hope that they understand. My colleague will be here all the way through.
As I say, we welcome the Statement and we welcome the Matt Clifford plan, which my noble friend Lord Camrose kicked off when he was leading these efforts in government, so we see this as a positive step forward. As Health Minister during that time, I saw first-hand the potential of AI, how it can really transform our services and how the UK really does have the potential for a leadership role.
Matt Clifford’s plan, we believe, is right in saying that the role of the Government in this is really to establish the foundations for growth: namely, making sure we have an AI-skilled workforce, the computing power, the energy needs to drive that computing power and the right regulatory framework. Then we should use the assets we have, such as our data, to create the right datasets, and use our public sector to help the rollout in many of these areas. I will focus my comments and questions on how we are going to make sure that those things happen.
Turning to the first one, the AI-skilled workforce, I must admit that when I read in the report that 5,000 AI jobs were being created, like most of us, I thought, “5,000—that is great.” Then you realise that, actually, 4,500 of those are in construction and only 500 are in AI itself, and you start to worry that maybe this is a bit of style over substance. I am very keen to understand from the Minister what we are specifically doing in this space. I am mindful, for instance, that the plan talks about having the Government develop training for the universities, with a delivery or reporting date of autumn 2027. We all know how quickly AI is moving in this space, and we are saying that we will just have the training in place for the universities to give these courses in two and a half years’ time. I think we all know that, in two and a half years’ time, the world will have moved on massively, and no doubt the training will be out of date. I hope the Minister can come back on that and give us some reassurance that we will have an accelerated process—I am afraid this will be a bit of a recurring theme.
On computing power, my noble friend Lord Camrose, when he was in government, had secured an £800 million commitment to build a supercomputer in Culham. Now I read, in the Government’s action plan, that they will
“start to develop the business case process”
for an AI computer. Unfortunately, like many noble Lords, I know what that means: a Treasury business case process, so you are talking about a year and a half to two years, at least. All I can guarantee is that, if you take that length of time to produce a business case, whatever you were planning in terms of a supercomputer will be superseded by advancements and events. What is the Minister doing to streamline that business case process and get action on this front so that we can get that new supercomputer fast?
On energy, we all accept there is a desperate need for energy; again, that is laid down in the action plan. The Government’s answer to that is to set up an AI energy quango. I think most of us would say that we need to set out what our energy needs require, but then surely it is up to the network or GB Energy to fulfil that. Why do we need another quango and another layer of bureaucracy? What powers is that quango going to have if it will not be commissioning these facilities, which I assume GB Energy will do?
On regulation and governance, the regulatory framework is another very important part of the foundation. I know the Government have plans for an AI Bill, but what is the timeline for it? Again—this is a recurrent theme—it needs to be quick so we can keep up with events.
Moving on to AI datasets, I know that this is something that the Minister is very keen on in the health space, as am I, being the former Health Minister responsible for this area. We have the best health data in the world; the beauty of having a National Health Service is that we have data on primary and secondary care going back to the Second World War. We have data coming in from the UK Biobank and other sources, such as retina scans from opticians which, we are hearing, can be used for stroke detection or maybe the early warning signs of dementia. There are fantastic opportunities for this, and we can already see its applications around the health service today. We have been doing the research with focus groups to bring the public with us on the use of their healthcare data. We have the potential to create the UK Silicon Valley in the life sciences on the back of the data that we have. We had in place a data for R&D programme, which was looking to utilise and create datasets in the health space. Could the Minister update us on where we are with that, and whether it is going to be his focus? As we discussed, that is something I would be very happy to work on together.
The last part of the foundations is to use the assets that we have in the public sector as a rollout plan for that and, again, health is a perfect place for this. We have seen brilliant applications already in cancer treatment and in tackling overprescribing; there are possibilities with the NHS app, which is really taking off, and with using AI in the 111 service to help triage; these are all fantastic opportunities. We put in place an NHS productivity plan which was very AI driven and AI heavy. Could the Minister update us on the AI productivity plan for the NHS and what progress we are making on it?
To conclude, we are very positive about the opportunities AI provides to transform the whole country’s economy and public services in ways that we cannot even imagine. However, it is businesses that need to drive this. It is the role of the Government to set the foundations to allow business to deliver; it is not the role of quangos, which are not going to deliver it. This area will need a Minister to drive it through and make it happen. Is the Minister the one who will do that? If he is, I give him all our support and wish him the best of luck with it.
My Lords, I also welcome this plan, perhaps with rather less baggage than the Conservative Benches. The Prime Minister and the Secretary of State invoked Babbage, Lovelace, Turing, the pioneering age of steam and even the white heat of the technological revolution, but at its core there is an important set of proposals with great potential. However, it is a wish list rather than a plan at present.
I particularly welcome the language in the plan around regulation, particularly where it refers to regulation assisting innovation, which is a change of tone. However, the plan and Statement raise many questions. In particular, how will the Government ensure that AI development mitigates risks beyond just safety to ensure responsible AI development and adoption, especially given the fact that a great deal of UK development will involve open-source applications?
On the question of the introduction of AI into the public sector, the Government are enormously enthusiastic. But, given their public sector digital transformation agenda, why are the Government watering down citizens’ rights in automated decision-making in the Data (Use and Access) Bill?
We welcome the recognition of the need to get the economic benefits for the UK from public sector data which may be used to develop AI models. What can the Minister tell us at this stage about what the national data library will look like? It is not clear that the Government yet know whether it will involve primary or secondary legislation or whatever. The plan and response also talk about “sovereign compute”, but what about sovereign cloud capability? The police cannot even find a supplier that guarantees its records will be stored in the UK.
While the focus on UK training is welcome, we must go beyond high-level skills. Not only are the tech companies calling out for technical skills, but AI is also shaping workplaces, services and lives. Will the Digital Inclusion Action Committee, chaired by the noble Baroness, Lady Armstrong, have a role in advising on this? Do the changes to funding and delivery expected for skills boot camps contribute to all of this?
On the question of energy requirements for the new data centres, will the new AI energy council be tasked with ensuring that they will have their own renewable energy sources? How will their location be decided, alongside that of the new AI growth centres?
The plan cannot be game-changing without public investment. It is about delivery, too, especially by the new sovereign data office; it cannot all be done with private sector investment. Where is the public money coming from, and over what timescale? An investment plan for compute is apparently to be married to the spending review; how does a 10-year timescale fit with this? I am very pleased that a clear role is identified for the Alan Turing Institute, but it is not yet clear what level of financial support it will get, alongside university research, exascale compute capacity and the British Business Bank’s support for the spin-out/start-up pipeline. What will the funding for the Compound Semiconductor Applications Catapult and the design and manufacturing ecosystem consist of?
The major negative in the plan for many of us, as the Minister already knows, is the failure to understand that our creative industries need to be able to derive benefits from their material used for training large language models. The plan ominously recommended reforming,
“the UK text and data mining regime so that it is at least as competitive as the EU”,
and the Government have stacked the cards in the consultation over this. We on these Benches and the creative industries will be fighting tooth and nail any new text and data mining exemption requiring opt-out.
My Lords, I anticipated that this Statement would attract interest from Members of this House, and I thank the noble Lords, Lord Markham and Lord Clement-Jones, for their comments and their broad welcoming of the report. I will try to respond to as many points as I can, but first I will reiterate the importance of this announcement.
Through the publication of the AI Opportunities Action Plan and the Government’s response, we are signalling that our ambition is high when it comes to embracing the opportunities presented by AI. This is a plan to exploit the economic growth that AI will bring and to drive forward the Government’s plan for change. Training the UK’s workforce is a key part of the plan, and there are steps with clear timelines as to when we will do that. I will come back to training a little later.
We need to diffuse AI technology across the economy and public services for better productivity and opportunity, and embrace the transformational impact it is going to have on everyday lives, from health and education to business and government services.
As has rightly been pointed out, AI is advancing at an extraordinary pace. That is why you will see in this response very tight timelines for actions. The one that was picked out on training, which is 2027, is only one part of the response; you will see that Skills England is due to report very shortly with the first phase of its recommendations and will follow that in autumn with further work. So most of the timelines are very tight, recognising the challenge that the pace of advancement in AI brings.
The benefits extend far beyond economic growth. It is the catalyst that we need for a public service revolution, including, of course, in the NHS. It will drive growth and innovation and deliver better outcomes for citizens. It also lies at the heart of two important missions for the Government: kick-starting economic growth and delivering an NHS fit for the future. By investing in AI now, we are ensuring that the UK is prepared to harness the transformational potential that undoubtedly exists. This will improve the quality and delivery of public services. The plan is a way to do that with real speed and ambition.
The issue of regulation has been raised and there is no doubt that the regulatory environment will be critical in driving trust and capitalising on the opportunities the technology offers. By bringing forward the recommendations in the plan, we will continue to support the AI Safety Institute and further develop the AI assurance ecosystem, including the small companies that will arise as a result, to increase trust in and adoption of AI.
The Government are committed to supporting regulators in evaluating their AI capabilities and understanding how they can be strengthened. Part of this is the role of the regulatory innovation office. The vast majority of AI should be regulated at the point of use by the expert regulators, but some relates to fast-evolving technology. That is why we will continue to deliver on manifesto commitments by placing binding requirements on the developers of the most powerful AI models. Those commitments will build on the work that has already been done at the Seoul and Bletchley AI safety summits and will be part of strengthening the role of the AI Safety Institute. This issue of making sure that we get the safety side of this right as we develop opportunities is of course key.
The question of copyright was raised by the noble Lord, Lord Clement-Jones, and I know that this is an extremely hot issue at the moment, which will be discussed many times over the next few days and weeks. The Government have issued a consultation, in which there are three principles: the owners of copyright should have control; there should be a mechanism to allow access to data to enable companies to develop their models in the UK, rather than elsewhere in the world; and, critically, there must be transparency. Where does the data flow and how can you work out the input from the output? Those three areas are a key part of the consultation and the consultation is crucial. We have a session planned for next week to go through this in some detail, and I invite and welcome all noble Lords to it, because getting this right will be important for the country. I look forward to discussing those proposals over the next few days and weeks.
Delivering the AI Opportunities Action Plan will require a whole-of-government effort. We are starting that work immediately to deliver on the commitments, build the foundations for AI growth, drive adoption across the economy and build UK capability. We are already expecting initial updates on a series of actions by this spring. For instance, DSIT will explore options for growing the domestic AI safety market and will provide a public update on this by spring this year.
Turning to some of the very specific points, I completely agree that training is crucial and we have to get it right. There are several recommendations and, as I said, the earliest will give a readout this spring. I do understand that this is not something that can wait until 2027; it has to start immediately.
It is important to lay out for the House the situation with compute. This spring, there will be access to two new major compute facilities for AI: Dawn in Cambridge and Isambard-AI in Bristol. When fully active this year, they will increase the UK’s AI compute capacity something like thirtyfold, instantly. Those are the types of compute infrastructure that are needed: AI-specific compute infrastructure. This is not a plan that starts only in the future; it is happening now, and those compute facilities will be used by academia, SMEs and others over the course of the year and beyond. The plan beyond that is to increase the compute infrastructure twentyfold by 2030. That requires a 10-year plan and for us to think into the future about what will be needed for us to be at the forefront of this. Exascale of course is different; it is being looked at as part of that, but it is not the same.
On energy, the noble Lord recognises that one of the most difficult things in government is to join up across departments. That is why the new AI energy council is important.
The national data library will be essential. I welcome the offer of help on health from the noble Lord, Lord Markham, and I will certainly take him up on that; this is an important area to look at. Noble Lords will be hearing much more about the national data library over the next few months. I completely agree that, as we develop this technology, we will need to ensure that citizens’ rights are properly protected. That is something that we will continue to discuss as part of the Data (Use and Access) Bill, among other issues.
Funding will be picked up in due course; it is a fully funded programme, but we will then need to go into a spending review, as Governments always have to.
I will wrap up there to leave plenty of time for others to ask questions, but I hope that I have addressed some of the initial questions.
(2 months, 1 week ago)
Lords Chamber
The cost of launch has come down by something like 95%. The UK remains committed to getting a launch and remains committed to the space strategy as laid out.
My Lords, in that National Space Strategy, the previous Government focused on encouraging low Earth orbit satellites, which are increasingly contributing to the loss of dark skies, as we have heard. Will this Government focus on incentives for the development of higher-orbit satellites, such as geostationary satellites, particularly the micro versions, of which far fewer are needed? They offer the best cost economics, compared to LEO systems, and have a lower impact on the night sky.
The noble Lord makes an extremely important point about the size of satellites, which is one of the factors in interference with both radio and optical imaging. The smaller satellites, which the UK is extremely good at making, will become an increasing part of the solution. On orbits, we have a commitment to low Earth orbit through the OneWeb approach—where there are about 700 satellites in low orbit—and to higher orbits where it is appropriate to do so.
(3 months ago)
Lords Chamber
The noble Lord knows that I know that unit extremely well. It is a very important unit globally and it was given an award of £30 million recently. The new model will allow for a longer period of funding—seven years plus seven years’ funding, so a total of 14 years—with a different process of evaluation, which is a lighter-touch, less bureaucratic process. There is no reason why there cannot be a similar number of trainees going through the new system.
My Lords, I declare an interest as chair of a university governing council. To some extent the Minister’s responses are reassuring, but is this part of a wider trend towards centralising decisions on research funding through UKRI? Are we moving towards a situation where the Government will fund research only within particular sectors set out in their industrial strategy? If that is the case, will that not stifle new research talent and innovation?
As the noble Lord may be aware, I have been very clear about the need to support basic curiosity-driven, investigator-led research, and I will remain resolute in that determination. Some of these new centres have specified areas, such as mental health and multi-morbidity, but there is a whole round which is unspecified, allowing people to put forward ideas of their own for units of the future, which I believe will be important for the very reason the noble Lord gives.
(6 months, 1 week ago)
Lords Chamber
My Lords, I refer to my interests in the register. I join in congratulating all the new Government Ministers and Whips on their appointments. As the DSIT spokesperson on these Benches, I give a particularly warm welcome to the noble Lord, Lord Vallance of Balham, and congratulate him on his excellent maiden speech. While he was the Government’s Chief Scientific Adviser, he was pivotal in setting up the Vaccine Taskforce and in organising the overall strategy for the UK’s development and distribution of Covid-19 vaccines, and we should all be eternally grateful for that.
I warmly welcome the noble Baroness, Lady Jones of Whitchurch, to her role. We have worked well together outside and then inside this House, and I very much want to constructively engage with both Ministers on the Government’s science and technology agenda. I also thank the noble Viscount, Lord Camrose, for his engagement when in the department, and for his courtesy and good humour throughout.
I welcome the Government’s agenda for growth through innovation, their mission to enhance public services through the deployment of new technology and DSIT’s central role in that, opening up what can be a blocked pipeline all the way from R&D to commercialisation—from university spin-out through start-up to scale-up and IPO. Crowding in and de-risking private investment through the national wealth fund, the British Business Bank and post-Mansion House pension reforms is crucial. Digital skills and digital literacy are also crucial but, to deploy digital tools successfully, we need a pipeline of creativity, critical thinking and collaboration skills as well.
In this context, I very much welcome the new Government’s tone on the value of universities, long-term financial settlements and resetting relations with Europe. I hope this means that we shall soon see whether spending plans for government R&D expenditure by 2030 and 2035 match their words. Disproportionately high overseas researcher visa costs must be lowered, as the noble Lord, Lord Vallance, knows.
But support for innovation should not be unconditional or at any cost, and I hope this Government will not fall into the trap of viewing regulation as necessarily the enemy of innovation. I therefore hope that the reference to AI legislation, but the failure to announce a Bill, is a mere timing issue. Perhaps we can hear later what the Government’s intention is in this respect. Before then, we are promised a product safety and metrology Bill, which could require alignment of AI-driven products with the EU AI Act. This seems to be putting the cart well in front of the regulatory horse.
We need to ensure that high-risk systems are mandated to adopt international ethical and safety standards. We all need to establish very clearly that generative AI systems need licences to ingest copyright material for training purposes, just as Mumsnet and the New York Times are asserting, and that there is an obligation of transparency in the use of datasets and original content. The Government in particular should lead the way in ensuring that there is a high level of transparency and opportunity for redress when algorithmic and automated systems are used in the public sector, and I commend my forthcoming Private Member’s Bill to them.
As regards the Bills in the King’s Speech, I look forward to seeing the details, but the digital information and smart data Bill seems to be heading in the right direction in the areas covered. I hope that other than a few clarifications, especially in research and on the constitution of the Information Commissioner’s Office, we are not going to exhume some of the worst areas of the old DPDI Bill, and that we have ditched the idea of a Brexit-EU divergence dividend by the watering down of so many data subjects’ rights. Will the Government give a firm commitment to safeguard our data adequacy with the EU? I hope that they will confirm that the intent of the reinstated digital verification provisions is not to have some form of compulsory national digital ID, but the creation of a genuine market in digital ID providers that give a choice to the citizen. I hope also that, in the course of that Bill, Ministers will meet LinesearchbeforeUdig and provide us all with much greater clarity around the proposals for the national underground asset register.
As for the cyber security and resilience Bill, events of recent days have demonstrated the need for cybersecurity, but have also made it clear that we are not just talking about threats from bad actors. There needs to be a rethink on critical national infrastructure such as cloud services and software, which are now essential public utilities.
Finally, I hope that we will see a long-awaited amendment of the Computer Misuse Act to include a statutory public interest defence, as called for by CyberUp and recommended, as I recall, by the Vallance report. I very much hope that there will be no more Horizon scandals. I look forward to the Minister’s reply.