Lords Chamber
My Lords, it is a pleasure to follow the noble Baroness, Lady Harding. I have added a few further words to my speech in response, because she made an extremely good point. I pay tribute to the noble Baroness, Lady Kidron, and her tenacity in trying to make sure that we secure a code for children’s data and education, which is so needed. The education sector presents unique challenges for protecting children’s data.
Like the noble Baronesses, Lady Kidron and Lady Harding, I look forward to what the Minister has to say. I hope that whatever is agreed is explicit; I entirely agree with the noble Baroness, Lady Harding. I had my own conversation with the Minister about Ofcom’s approach to categorisation which, quite frankly, does not follow what we thought the Online Safety Act was going to imply. It is really important that we absolutely tie down what the Minister has to say.
The education sector is a complex environment. The existing regulatory environment does not adequately address the unique challenges posed by edtech, as we call it, and the increasing use of children’s data in education. I very much echo what the noble Baroness, Lady Kidron, said: children attend school for education, not to be exploited for data mining. Like her, I cross over into considering the issues related to the AI and IP consultation.
The worst-case scenario is an opt-in system that might incentivise learners or parents to consent, whether that is to state educational institutions, exam boards such as Pearson, or any other entity. I hope that, in the other part of the forest, so to speak, that will not take place to the detriment of children. In the meantime, I very much look forward to what the Minister has to say on Amendment 44.
My Lords, I thank the noble Baroness, Lady Kidron, for moving her amendment. Before I begin, let me declare my interest as a recently appointed director of Lumi, an edtech provider—but for graduates, not for schools.
AI has the potential to revolutionise educational tools, helping teachers spend less time on marking and more time on face-to-face teaching with children, creating more innovative teaching tools and exercises and facilitating more detailed feedback for students. AI presents a real opportunity to improve education outcomes for children, opening more opportunities throughout their lives. There are deeply compelling promises in edtech.
However—there is always a however when we talk about edtech—creating and using AI education tools will require the collection and processing of children’s personal data. This potentially includes special category data—for instance, medical information pertaining to special educational needs such as dyslexia. Therefore, care must be taken in regulating how this data is collected, stored, processed and used. Without this, AI poses a major safeguarding risk. We share the concerns of the noble Baroness, Lady Kidron, and wholeheartedly support the spirit of her amendment.
We agree that it is prudent to require the ICO to make a code of practice on children’s data and education, and I particularly welcome a requirement on the ICO to consult with and involve parents. Parents know their children best, needless to say, and have their best interests at heart; their input will be critical in building trust in AI-assisted educational tools and facilitating their rollout and benefits for children throughout the UK.
However, as I said earlier on Report—and I shall not repeat the arguments now—we have concerns about the incorporation of international law into our law, and specifically, in this instance, the UN Convention on the Rights of the Child. We cannot therefore support the amendment as drafted. That said, we hope very much that the Government will listen carefully to the arguments raised here and take steps to introduce appropriate safeguards for children and young people in our data legislation regime. I suspect that most parents will greatly welcome more reassurance about the use of their children’s data.
My Lords, I can be pretty brief. We have had some fantastic speeches, started by the noble Baroness, Lady Kidron, with her superb rallying cry for these amendments, which we 100% support on these Benches. As she said, there is cross-party support. We have heard support from all over the House and, as the noble and learned Baroness, Lady Butler-Sloss, has just said, there has not been a dissenting voice.
I have a long association with the creative industries and with AI policy and yield to no one in my enthusiasm for AI—but, as the noble Baroness said, it should not come at the expense of the creative industries. It should not just be for the benefit of DeepSeek or Silicon Valley. We are very clear where we stand on this.
I pay tribute to the Creative Rights in AI Coalition and its campaign, which has been so powerful in garnering support, and to all those in the creative industries and creators themselves who briefed noble Lords for this debate.
These amendments respond to deep concerns that AI companies are using copyright material without permission or compensation. In the new government consultation, the Government’s preferred option—and I do not believe it is a straw man—is a text and data mining exemption with an opt-out, an issue we thought was settled under the previous Government. It starts from the false premise of legal uncertainty, as we have heard from a number of noble Lords. As the News Media Association has said, the Government’s consultation is based on a mistaken idea, promoted by tech lobbyists and echoed in the consultation, that there is a lack of clarity in existing copyright law. This is completely untrue. The use of copyrighted content without a licence by gen AI firms is theft on a mass scale and there is no objective case for a new text and data mining exception.
No effective opt-out system for the use of content by gen AI models has been proposed or implemented anywhere in the world, making the Government’s proposals entirely speculative. It is vital going forward that we ensure that AI companies cannot use copyrighted material without permission or compensation; that AI development does not exploit loopholes to bypass copyright laws; that AI developers disclose the sources of the data they use for training their models, allowing for accountability and addressing infringement; and that we reinforce the existing copyright framework, rather than creating new exceptions that disadvantage creators.
These amendments would provide a mechanism for copyright holders to contest the use of their work and ensure a route for payment. They seek to ensure that AI innovation does not come at the expense of the rights and livelihoods of creators. There is no market failure. We have a well-established licensing system as an alternative to the Government’s proposed opt-out scheme for AI developers using copyrighted works. A licensing system is the only sustainable solution that benefits both creative industries and the AI sector. We have some of the most effective collective rights organisations in the world. Licensing is their bread and butter. Merely because AI platforms are resisting claims does not mean that the law in the UK is uncertain.
Amending UK law to address the challenges posed by AI development, particularly in relation to copyright and transparency, is essential to protect the rights of creators, foster responsible innovation and ensure a sustainable future for the creative industries. This should apply wherever in the world the scraping of copyright material takes place: if developers market their product in the UK, the rules should apply regardless of where the training occurs. It would also ensure that AI start-ups based in the UK are not put at a competitive disadvantage due to the ability of international firms to conduct training in a different jurisdiction.
As we have heard throughout this debate, it is clear that the options proposed by the Government have no proper economic assessment underpinning them, no technology for an opt-out underpinning them and no enforcement mechanism proposed. It baffles me why the Conservative Opposition is not supporting these amendments, and I very much hope that the voices we have heard on the Conservative Benches will make sure that these amendments pass with acclamation.
I thank the noble Baroness, Lady Kidron, for moving this incredibly important group and all those speakers who have made the arguments so clearly and powerfully. I pay tribute to the noble Baroness’s work on copyright and AI, which is so important for our arts and culture sector. As noble Lords have rightly said, our cultural industries make an enormous contribution to our country, not just in cultural terms but in economic ones, and we must ensure that our laws do not put that future at risk.
In the build-up to this debate I engaged with great pleasure with the noble Baroness, Lady Kidron, and on these Benches we are sympathetic to her arguments. Her Amendment 61 would require the Government to make regulations in this area. We accept the Government’s assurance that this is something they will seek to address, and I note the Minister’s confirmation that their consultation will form the basis of the Government’s approach to this issue. Given the importance of getting this right, and given that the Government’s consultation is in mid-flight, our view is that we have to allow it to do its work. Whatever view we take of the design and the timing of the consultation, it offers for now a way forward that will evidence some of the serious concerns expressed here. That said, we will take a great interest in the progress and outcomes of the consultation and will come back to this in future should the Government’s approach prove unsatisfactory.
Amendment 75 in my name also seeks to address the challenge that the growth in AI poses to our cultural industries. One of the key challenges in copyright and AI is enforceability. Copyright can be enforced only when we know it has been infringed. The size and the international distribution of AI training models render it extremely challenging to answer two fundamental questions today: first, was a given piece of content used in a training model; and secondly, if so, in what jurisdiction did that use take place? If we cannot answer these questions, enforcement can become extremely hard, so a necessary, if not sufficient, part of the solution will be a digital watermark—a means of putting some red dye in the water where copyrighted material is used to train AIs. It could also potentially provide an automated means for content creators to opt out, with a vastly more manageable administrative burden.
I thank the Minister for his constructive engagement on digital watermarking and look to him to give the House an assurance that the Government will bring forward a plan to develop a technological standard for a machine-readable digital watermark. I hope that, if and when he does so, he is able to indicate both a timeline and an intention to engage internationally. Subject to receiving such reassurances when he rises, I shall not move my amendment.
My Lords, Amendment 46 seeks a review of court jurisdiction. As I said in Committee, the current system’s complexity leads to confusion regarding where to bring data protection claims—tribunals or courts? This is exacerbated by contradictory legal precedents from different levels of the judiciary, and it creates barriers for individuals seeking to enforce their rights.
Transferring jurisdiction to tribunals would simplify the process and reduce costs for individuals, and it would align with the approach for statutory appeals against public bodies, which are typically handled by tribunals. In the Killock v Information Commissioner case, Mrs Justice Farbey explicitly called for a “comprehensive strategic review” of the appeal mechanisms for data protection rights. That is effectively what we seek to do with this amendment.
In Committee, the noble Baroness, Lady Jones, raised concerns about transferring jurisdiction and introducing a new appeals regime. She argued that the tribunals lacked the capacity to handle complex data protection cases, but tribunals are, in fact, better suited to handle such matters due to their expertise and lower costs for individuals. Additionally, the volume of applications under Section 166—“Orders to progress complaints”—suggests significant demand for tribunal resolution, despite its current limitations.
The noble Baroness, Lady Jones, also expressed concern about the potential for a new appeal right to encourage “vexatious challenges”, but introducing a tribunal appeal system similar to the Freedom of Information Act could actually help filter out unfounded claims. This is because the tribunal would have the authority to scrutinise cases and potentially dismiss those deemed frivolous.
The noble Baroness, Lady Jones, emphasised the existing judicial review process as a sufficient safeguard against errors by the Information Commissioner. However, judicial review is costly and complex, presenting a significant barrier for individuals. A tribunal system would offer a much more accessible and less expensive avenue for redress.
I very much hope that, in view of the fact that this is a rather different amendment—it calls for a review—the Government will look at this. It is certainly called for by the judiciary, and I very much hope that the Government will take this on board at this stage.
I thank the noble Lord, Lord Clement-Jones, for moving his amendment, which would require the Secretary of State to review the potential impact of transferring to tribunals the courts’ jurisdiction over all data protection provisions. As I argued in Committee, courts have a long-standing authority and expertise in resolving complex legal disputes, including data protection cases, and removing the jurisdiction of the courts could risk undermining the depth and breadth of legal oversight required in such critical areas.
That said, as the noble Baroness, Lady Jones of Whitchurch, said in Committee, we have a mixed system of jurisdiction for legal issues relating to data, and tribunals have an important role to play. So, although we agree with the intentions behind the amendment from the noble Lord, Lord Clement-Jones, we do not support the push to transfer all data protection provisions from the courts to tribunals, as we believe that there is still an important role for courts to play. Given the importance of the role of the courts in resolving complex cases, we do not feel that this review is necessary.
My Lords, before the noble Viscount sits down, I wonder whether he has actually read the amendment; it calls for a review, not for transfer. I think that his speech is a carryover from Committee.
I thank the noble Lord, Lord Clement-Jones, for Amendment 46. It would require a review of the impact of transferring all data protection-related cases to the relevant tribunals. Currently there is a mixture of jurisdictions for tribunals and courts for data protection cases, depending on the nature of the proceedings. This is on the basis that certain claims are deemed appropriate for tribunal, while others are appropriate for courts, where stricter rules of evidence and procedure apply—for example, in dealing with claims by data subjects against controllers for compensation due to breaches of data protection legislation. As such, the current system already provides clear and appropriate administrative and judicial redress routes for data subjects seeking to exercise their rights.
Tribunals are in many cases the appropriate venue for data protection proceedings, including appeals by controllers against enforcement action or applications by data subjects for an order that the ICO should progress a complaint. Claims by individuals against businesses or other organisations for damages arising from breach of data protection law fall under the jurisdiction of courts rather than tribunals. This is appropriate, given the likely disparity between the resources of the respective parties, because courts apply stricter rules of evidence and procedures than tribunals. While court proceedings can, of course, be more costly, successful parties can usually recover their costs, which would not always be the case in tribunals.
I hope that the noble Lord agrees that there is a rationale for these different routes and that a review to consider transfer of jurisdictions to tribunals is therefore not necessary at this time.
My Lords, I support Amendments 47 and 48, which I was delighted to see tabled by the noble Lords, Lord Holmes and Lord Arbuthnot. I have long argued for changes to the Computer Misuse Act. I pay tribute to the CyberUp campaign, which has been extremely persistent in advocating these changes.
The CMA was drafted some 35 years ago—an age ago in computer technology—when internet usage was much lower and cybersecurity practices much less developed. This makes the Act in its current form unfit for the modern digital landscape and inhibits security professionals from conducting legitimate research. I will not repeat the arguments made by the two noble Lords. I know that the Minister, because of his digital regulation review, is absolutely apprised of this issue, and if he were able to make a decision this evening, I think he would take them on board. I very much hope that he will express sympathy for the amendments, however he wishes to do so—whether by giving an undertaking to bring something back at Third Reading or by doing something in the Commons. Clearly, he knows what the problem is. This issue has been under consideration for a long time, in the bowels of the Home Office—what worse place is there to be?—so I very much hope that the Minister will extract the issue and deal with it as expeditiously as he can.
I thank my noble friend Lord Holmes for tabling the amendments in this group. I, too, believe these amendments would improve the Bill. The nature of computing and data processing has fundamentally changed since the Computer Misuse Act 1990. Third parties hold and process immense quantities of data, and the means of accessing and interacting with that data have become unrecognisably more sophisticated. Updating the definition of unauthorised computer access through Amendment 48 is a sensible reform, as this new definition takes into account that data controllers and processors now hold substantial quantities of personal data. These entities are responsible for the security of the data they hold, so their provisions on access become legally relevant, and the amendment reflects this.
When updating an offence, it is equally necessary to consider the legal defences, as my noble friend has rightly done in Amendment 47 by protecting individuals accessing information to detect or prevent a crime or whose actions are in the public interest. We on these Benches feel these amendments are wholly sensible. I urge the Minister to listen to the persuasive argument that my noble friend Lord Holmes has made and consider how we can deliver these improvements to our data legislation.
My Lords, I will speak to Amendment 48B. In our view, cookie paywalls create an unfair choice for users, essentially forcing them to pay for privacy. We tabled an amendment in Committee to ban cookie paywalls, but in the meantime, as the noble Baroness, Lady Jones, heralded at the time, the Information Commissioner’s Office has provided updated guidance on the “consent or pay” model for cookie compliance. It is now available for review. This guidance clarifies how organisations can offer users a choice between accepting personalised ads for free access or paying for an ad-free experience while ensuring compliance with data protection laws. It has confirmed that the “consent or pay” model is acceptable for UK publishers, provided certain conditions are met. Key requirements for a valid consent under this model include: users must have genuine free choice; the alternative to consent—that is, payment—must be reasonably priced; and users must be fully informed about their options.
The guidance is, however, contradictory. On the one hand, it says that cookie paywalls
“can be compliant with data protection law”
and that providers must document their assessments of how it is compliant with DPL. On the other, it says that, to be compliant with data protection law, cookie paywalls must allow users to choose freely without detriment. However, users who do not wish to pay the fee to access a website will be subject to detriment, because with a cookie paywall they will pay a fee if they wish to refuse consent. This is what the guidance describes as the “power imbalance”. It is also worth noting that this guidance does not constitute legal advice; it leaves significant latitude for legal interpretation and argument as to the compatibility of cookie paywalls with data protection law.
The core argument against “consent or pay” models is that they undermine the principle of freely given consent. The ICO guidance emphasises that organisations using these models must be able to demonstrate that users have a genuine choice and are not unfairly penalised for refusing to consent to data processing for personalised advertising. Yet in practice, given the power imbalance, on almost every occasion this is not possible. This amendment seeks to ensure that individuals maintain control over their personal data. By banning cookie paywalls, users can freely choose not to consent to cookies without having to pay a fee. I very much hope that the Government will reconsider the ICO’s guidance in particular, and consider banning cookie paywalls altogether.
My Lords, I thank my noble friend Lord Lucas for introducing this group. Amendments 48A and 50A, in his name, would ensure that regulated professionals, including financial services firms, are able to comply with current and future regulatory requirements. The example my noble friend has given—the FCA’s expectation that firms communicate effectively with consumers—is a good one. Clearly, we must avoid a circumstance where regulators expect businesses to take action that is not possible due to limiting legislation governing data use and access. My noble friend has made a forceful case and I hope the Government will be able to give the House appropriate assurance that businesses will not be put in this position as a result of this legislation.
Amendment 48B, in the name of the noble Lord, Lord Clement-Jones, seeks to ban cookie paywalls. I opposed a similar amendment when we debated it in Committee as it actually seeks to curtail choice. Currently, users have three options: pay money and stay private, share personal data and read for free, or walk away. Faced with these options, for instance, I have sadly chosen to forgo my regular evening reading of the Daily Mail’s excellent sports pages, but I see no reason why that newspaper, or anyone else, should be compelled to provide anything for free. In fact, it has been very persuasively argued by Jaron Lanier, Shoshana Zuboff and many others that it is the fact that so much of the internet is apparently, but not actually, free that has caused a great deal of damage, rather than having an open charging model. This approach finally reveals the exact cash value of individuals’ data that websites are harvesting and offers users choice. We do not agree with attempts to remove that choice.
My Lords, this has been a very interesting debate. I too congratulate the noble Baroness, Lady Owen, on having brought forward these very important amendments. It has been a privilege to be part of her support team and she has proved an extremely persuasive cross-party advocate, including in being able to bring out the team: the noble Baroness, Lady Kidron, the noble Lord, Lord Pannick, who has cross-examined the Minister, and the noble Lord, Lord Stevenson. There is very little to follow up on what noble Lords have said, because the Minister now knows exactly what he needs to reply to.
I was exercised by this rather vague issue of whether the elements that were required were going to come back at Third Reading or in the Commons. I did not think that the Minister was specific enough in his initial response. In his cross-examination, the noble Lord, Lord Pannick, really went through the key elements that were required, such as the no intent element, the question of reasonable excuse and how robust that was, the question of solicitation, which I know is very important in this context, and the question of whether it is really an international law matter. I have had the benefit of talking to the noble Lord, Lord Pannick, and surely the mischief is delivered and carried out here, so why is that an international law issue? There is also the question of deletion of data, which the noble Lord has explained pretty carefully, and the question of timing of knowledge of the offence having been committed.
The Minister needs to describe the stages at which those various elements are going to be contained in a government amendment. I understand that there may be a phasing, but there are a lot of assurances. As the noble Lord, Lord Stevenson, said, is it six or seven? How many assurances are we talking about? I very much hope that the Minister can see the sentiment and the importance we place on his assurances on these amendments, so I very much hope he is going to be able to give us the answers.
In conclusion, as the noble Baroness, Lady Morgan, said—and it is no bad thing to be able to wheel on a former Secretary of State at 9 o’clock in the evening—there is a clear link between gender-based violence and image-based abuse. This is something which motivates us hugely in favour of these amendments. I very much hope the Minister can give more assurance on the audio side of things as well, because we want future legislation to safeguard victims, improve prosecutions and deter potential perpetrators from committing image-based and audio-based abuse crimes.
I thank the Minister and my noble friend Lady Owen for bringing these amendments to your Lordships’ House. Before I speak to the substance of the amendments, I join others in paying tribute to the tenacity, commitment and skill that my noble friend Lady Owen has shown throughout her campaign to ban these awful practices. She not only has argued her case powerfully and persuasively but, as others have remarked, seems to have figured out the machinery of this House in an uncanny way. Whatever else happens, she has the full support of these Benches.
I am pleased that the Government have engaged constructively with my noble friend and are seeking to bring this back at Third Reading. The Minister has been asked some questions and we all look forward with interest to his responses. I know from the speeches that we have heard that I am not alone in this House in believing that we have an opportunity here and now to create these offences, and we should not delay. For the sake of the many people who have been, and will otherwise be, victims of the creation of sexually explicit deepfakes, I urge the Government to continue to work with my noble friend Lady Owen to get this over the line as soon as possible.
My Lords, I thank the noble Lord, Lord Bassam, for retabling his Committee amendment, which we did not manage to discuss. Sadly, it always appears to be discussed rather late in the evening, but I think that the time has come for this concept and I am glad that the Government are willing to explore it.
I will make two points. Many countries worldwide, including in the EU, have their own version of the smart fund to reward creators and performers for the private copy and use of their works and performances. Our own CMS Select Committee found that, despite the creative industries’ economic contribution—about which many noble Lords have talked—many skilled and successful professional creators are struggling to make a living from their work. The committee recommended that
“the Government work with the UK’s creative industries to introduce a statutory private copying scheme”.
This has a respectable provenance and is very much wanted by the collecting societies ALCS, BECS, Directors UK and DACS. Their letter said that the scheme could generate £250 million to £300 million a year for creatives, at no cost to the Government or to the taxpayer. What is not to like? They say that similar schemes are already in place in 45 countries globally, including most of Europe, and many of them include an additional contribution to public cultural funding. That could be totally game-changing. I very much hope that there is a fair wind behind this proposal.
My Lords, I thank the noble Lord, Lord Bassam of Brighton, for laying this amendment and introducing the debate on it.
As I understand it, a private copying levy is a surcharge on the price of digital content. The idea is that the money raised from the surcharge is either redistributed directly to rights holders to compensate them for any loss suffered because of copies made under the private copying exceptions or contributed straight to other cultural events. I recognise what the noble Lord is seeking to achieve and very much support his intent.
I have two concerns. First—it may be that I have misunderstood it; if so, I would be grateful if the noble Lord would set me straight—it sounds very much like a new tax of some kind is being raised, albeit a very small one. Secondly, those who legitimately pay for digital content end up paying twice. Does this not incentivise more illegal copying?
We all agree how vital it is for those who create products of the mind to be fairly rewarded and incentivised for doing so. We are all concerned by the erosion of copyright or IP caused by both a global internet and increasingly sophisticated AI. Perhaps I could modestly refer the noble Lord to my Amendment 75 on digital watermarking, which I suggest may be a more proportionate means of achieving the same end or at least paving the way towards it. For now, we are unable to support Amendment 57 as drafted.
My Lords, all our speakers have made it clear that this is a here-and-now issue. The context has been set out by noble Lords, whether it is Stargate, the AI Opportunities Action Plan or, indeed, the Palantir contract with the NHS. This has been coming down the track for some years. There are Members on the Government Benches, such as the noble Lords, Lord Mitchell and Lord Hunt of Kings Heath, who have been telling us that we need to work out a fair way of deriving a proper financial return for the benefits of public data assets, and Future Care Capital has done likewise. The noble Lord, Lord Freyberg, has form in this area as well.
The Government’s plan for the national data library and the concept of sovereign data assets raises crucial questions about how to balance the potential benefits of data sharing with the need to protect individual rights, maintain public trust and make sure that we achieve proper value for our public digital assets. I know that the Minister has a particular interest in this area, and I hope he will carry forward the work, even if this amendment does not go through.
I thank the noble Baroness, Lady Kidron, for moving her amendment. The amendments in this group seek to establish a new status for data held in the public interest, and to establish statutory oversight rules for a national data library. I was pleased during Committee to hear confirmation from the noble Baroness, Lady Jones of Whitchurch, that the Government are actively developing their policy on data held in the public interest and developing plans to use our data assets in a trustworthy and ethical way.
We of course agree that we need to get this policy right, and I understand the Government’s desire to continue their policy development. Given that this is an ongoing process, it would be helpful if the Government could give the House an indication of timescales. Can the Minister say when the Government will be in a position to update the House on any plans to introduce a new approach to data held in the public interest? Will the Government bring a statement to this House when plans for a national data library proceed to the next stage?
I suggest that a great deal of public concern about nationally held datasets is a result of uncertainty. The Minister was kind enough to arrange a briefing from his officials yesterday, and this emerged very strongly. There is a great deal of uncertainty about what is being proposed. What are the mechanics? What are the risks? What are the costs? What are the eventual benefits to UK plc? I urge the Minister, as and when he makes such a statement, to bring maximum clarity to these fundamental questions, because I suspect that many members of the public would find this deeply reassuring.
Given the stage the Government are at with these plans, we do not think it would be appropriate to legislate at this stage, but we of course reserve the right to revisit this issue in the future.
My Lords, we have had some discussion already this week on data centres. The noble Lord, Lord Holmes, is absolutely right to raise this broad issue, but I was reassured to hear from the noble Lord, Lord Hunt of Kings Heath, earlier in the week that the building of data centres, their energy requirements and their need may well be included in NESO’s strategic spatial energy plan and the centralised strategic network plan. Clearly, in one part of the forest there is a great deal of discussion about energy use and the energy needs of data centres. What is less clear and, in a sense, reflected in the opportunities plan is exactly how the Government will decide the location of these data centres, which clearly—at least on current thinking about the needs of large language models, AI and so on—will be needed. It is about where they will be and how that will be decided. If the Minister can cast any light on that, we would all be grateful.
I thank my noble friend Lord Holmes of Richmond for moving this amendment. Amendment 59 is an important amendment that addresses some of the key issues relating to large language models. We know that large language models have huge potential, and I agree with him that the Government should keep this under review. Perhaps the noble Baroness, Lady Jones of Whitchurch, would be willing to update the House on the Government’s policy on large language model regulation on her return.
Data centre availability is another emerging issue as we see growth in this sector. My noble friend is absolutely right to bring this to the attention of the House. We firmly agree that we will have a growing need for additional data centres. In Committee, the noble Baroness, Lady Jones, did not respond substantively to Amendments 60 and 66 from my noble friend on data centres, which I believe was—not wholly unreasonably—to speed the Committee to its conclusion just before Christmas. I hope the Minister can give the House a fuller response on this today, as it would be very helpful to hear what the Government’s plans are on the need for additional data centres.
My Lords, it is clear that Amendment 67 in the name of the noble Lord, Lord Lucas, is very much of a piece with the amendments that were debated and passed last week. On these Benches, our approach will be exactly the same. Indeed, we can rely on what the Minister said last week, when he gave a considerable assurance:
“I can be absolutely clear that we must have a single version of the truth on this. There needs to be a way to verify it consistently and there need to be rules. That is why the ongoing work is so important”.—[Official Report, 21/1/25; col. 1620.]
That is, the work of the Central Digital and Data Office. We are content to rely on his assurance.
I thank my noble friend Lord Lucas for bringing his Amendment 67, which builds on his previous work to ensure accuracy of data. On these Benches, we agree wholeheartedly with him that the information we have access to—for example, to verify documents—must be accurate. His amendment would allow the Secretary of State to make regulations establishing definitions under the Bill for the purposes of digital verification services, registers of births and deaths, and other provisions. Crucially, this would enable the Government to put measures in place to ensure the consistency of the definitions of key personal attributes, including sex. We agree that consistency and accuracy of data is vital. We supported him on the first day at Report, and, if he pushes his amendment to a Division, we will support him today.
My Lords, I too am lost in admiration for the noble Baroness, Lady Kidron—still firing on all cylinders at this time of night. Current law is clearly out of touch with the reality of computer systems. It assumes an untruth about computer reliability that has led to significant injustice. We know that that assumption has contributed to miscarriages of justice, such as the Horizon scandal.
Unlike the amendment in Committee, Amendment 68 does not address the reliability of computers themselves but focuses rather on the computer evidence presented in court. That is a crucial distinction as it seeks to establish a framework for evaluating the validity of the evidence presented, rather than questioning the inherent reliability of computers. We believe that the amendment would be a crucial step towards ensuring fairness and accuracy in legal proceedings by enabling courts to evaluate computer evidence effectively. It offers a balanced approach that would protect the interests of both the prosecution and the defence, ensuring that justice is served. The Government really must move on this.
I thank the noble Baroness, Lady Kidron, for her amendments. The reliability of computer-based evidence, needless to say, has come into powerful public focus following the Post Office Horizon scandal and the postmasters’ subsequent fight for justice. As the noble Baroness has said previously and indeed tonight, this goes far beyond the Horizon scandal. We accept that there is an issue with the way in which the presumption that computer evidence is reliable is applied in legal proceedings.
The Government accepted in Committee that this is an issue. While we have concerns about the way that the noble Baroness’s amendment is drafted, we hope the Minister will take the opportunity today to set out clearly the work that the Government are doing in this area. In particular, we welcome the Government’s recently opened call for evidence, and we hope Ministers will work quickly to address this issue.
My Lords, I have the very dubious privilege of moving the final amendment on Report to this Bill. This is a probing amendment and the question is: what does retrospectivity mean? The noble Lord, Lord Cameron of Lochiel, asked a question of the noble Baroness, Lady Jones, in Committee in December:
“Will the forthcoming changes to data protection law apply to such data that controllers and processors already hold?”
She replied that
“the new lawful ground of recognised legitimate interest will apply from the date of commencement and will not apply retrospectively”.—[Official Report, 10/12/24; cols. GC 435-437.]
But the question is not really whether the lawfulness is retrospective, but whether the changes made in the new law can be applied to any personal data previously collected and already held on the commencement date of the Act—so that is the exam question.
It is indeed getting late. I thank the noble Lord, Lord Clement-Jones, for moving his amendment, and I really will be brief.
We do not oppose the government amendment in the name of the noble Lord, Lord Vallance. I think the Minister should be able to address the concerns raised by the noble Lord, Lord Clement-Jones, given that the noble Lord’s amendment merely seeks clarification on the retrospective application of the provisions of the Bill within a month of the coming into force of the Act. It seems that the Government could make this change unnecessary by clarifying the position today. I hope the Minister will be able to address this in his remarks.
I will speak first to Amendment 76. I reassure noble Lords that the Government do not believe that this amendment has a material policy effect. Instead, it simply corrects the drafting of the Bill and ensures that an interpretation provision in Clause 66 commences on Royal Assent.
Amendment 74, in the name of the noble Lord, Lord Clement-Jones, would require the Secretary of State to publish a statement setting out whether any provisions in the Bill apply to controllers and processors retrospectively. Generally, provisions in Bills apply from the date of commencement unless there are strong policy or legal reasons for applying them retrospectively. The provisions in this Bill follow that general rule. For instance, data controllers will only be able to rely on the new lawful ground of recognised legitimate interests introduced by Clause 70 in respect of new processing activities in relation to personal data that take place after the date of commencement.
I recognise that noble Lords might have questions as to whether any of the Bill’s clauses can apply to personal data that is already held. That is the natural intent in some areas and, where appropriate, commencement regulations will provide further clarity. The Government intend to publish their plans for commencement on GOV.UK in due course and the ICO will also be updating its regulatory guidance in several key areas to help organisations prepare. We recognise that there can be complex lifecycles around the use of personal data and we will aim to ensure that how and when any new provisions can be relied on is made clear as part of the implementation process.
I hope that explanation goes some way to reassuring the noble Lord and that he will agree to withdraw his amendment.
(1 week, 3 days ago)
Lords Chamber
My Lords, this is clearly box-office material, as ever.
I support Amendment 1 tabled by the noble Baroness, Lady Kidron, on inferred data. Like her, I regret that we do not have this Bill flying in tandem with an AI Bill. As she said, data and AI go together, and we need to see the two together in context. However, inferred data has its own dangers: inaccuracy and what are called junk inferences; discrimination and unfair treatment; invasions of privacy; a lack of transparency; security risks; predatory targeting; and a loss of anonymity. These dangers highlight the need for strong data privacy protection for consumers in smart data schemes and more transparent data collection practices.
Noble Lords will remember that Cambridge Analytica dealt extensively with inferred data. That company used various data sources to create detailed psychological profiles of individuals going far beyond the information that users explicitly provided. I will not go into the complete history, but, frankly, we do not want to repeat that. Without safeguards, the development of AI technologies could lead to a lack of public trust, as the noble Baroness said, and indeed to a backlash against the use of AI, which could hinder the Government’s ambitions to make the UK an AI superpower. I do not like that kind of boosterish language—some of the Government’s statements perhaps could have been written by Boris Johnson—nevertheless the ambition to put the UK on the AI map, and to keep it there, is a worthy one. This kind of safeguard is therefore extremely important in that context.
I start by thanking the noble Baroness, Lady Kidron, for introducing this group. I will speak particularly to the amendment in my name but before I do so, I want to say how much I agree with the noble Baroness and with the noble Lord, Lord Clement-Jones, that it is a matter of regret that we are not simultaneously looking at an AI Bill. I worry that this Bill has to take a lot of the weight that an AI Bill would otherwise take, but we will come to that in a great deal more detail in later groups.
I will address the two amendments in this group in reverse order. Amendment 5 in my name and that of my noble friend Lord Markham would remove Clause 13, which makes provision for the Secretary of State or the Treasury to give financial assistance to decision-makers and enforcers—that is, in essence, to act as a financial backstop. While I appreciate the necessity of guaranteeing the stability of enforcers who are public authorities and therefore branches of state, I am concerned that this has been extended to decision-makers. The Bill does not make the identity of a decision-maker clear. Therefore, I wonder who exactly we are protecting here. Unless those individuals or bodies or organisations can be clearly defined, how can we know whether we should extend financial assistance to them?
I raised these concerns in Committee and the Minister assured us at that time that smart data schemes should be self-financing through fees and levies as set out in Clauses 11 and 12 and that this provision is therefore a back-up plan. If that is indeed the case and we are assured of the self-funding nature of smart data schemes, then what exactly makes this necessary? Why must the statutory spending authority act as a backstop if we do not believe there is a risk it will be needed? If we do think there is such a risk, can the Minister elaborate on what it is?
I turn now to the amendment tabled by the noble Baroness, Lady Kidron, which would require data traders to supply customers with information that has been used by AI to build a profile on them. While transparency and explainability are hugely important, I worry that the mechanism proposed here will be too burdensome. The burden would grow linearly with the scale of the models used. Collating and supplying this information would, I fear, increase the cost of doing business for traders. Given AI’s potential to be an immense asset to business, helping generate billions of pounds for the UK economy—and, by the way, I rather approve of the boosterish tone and think we should strive for a great deal more growth in the economy—we should not seek to make its use more administratively burdensome for business. Furthermore, since the information is AI-generated, it is going to be a guess or an assumption or an inference. Therefore, should we require companies to disclose not just the input data but the intermediate and final outputs? Speaking as a consumer, I am not sure that I personally would welcome this. I look forward to hearing the Minister’s responses.
My Lords, the noble Baroness, Lady Kidron, is setting a cracking pace this afternoon, and I am delighted to support her amendments and speak to them. Citizens should have the clear right to assign their data to data communities or trusts, which act as intermediaries between those who hold data and those who wish to use it, and are designed to ensure that data is shared in a fair, safe and equitable manner.
A great range of bodies have explored and support data communities and data trusts. There is considerable pedigree behind the proposals that the noble Baroness has put forward today, starting with a recommendation of the Hall-Pesenti review. We then had the Royal Society and the British Academy talking about data stewardship; the Ada Lovelace Institute has explored legal mechanisms for data stewardship, including data trusts; the Open Data Institute has been actively researching and piloting data trusts in the real world; the Alan Turing Institute has co-hosted a workshop exploring data trusts; and the Royal Society of Arts has conducted citizens’ juries on AI explainability and explored the use of data trusts for community engagement and outreach.
There are many reasons why data communities are so important. They can help empower individuals, give them more control over their data and ensure that it is used responsibly; they can increase bargaining power, reduce transaction costs, address data law complexity and protect individual rights; they can promote innovation by facilitating data-sharing; and they can promote innovation in the development of new products and services. We need to ensure responsible operation and build trust in data communities. As proposed by Amendment 43 in particular, we should establish a register of data communities overseen by the ICO, along with a code of conduct and complaint mechanisms, as proposed by Amendment 42.
It is high time we move forward on this; we need positive steps. In the words of the noble Baroness, Lady Kidron, we do not just seek assurance that there is nothing to prevent these data communities; we need to take positive steps and install mechanisms to make sure that we can set them up and benefit from that.
I thank the noble Baroness, Lady Kidron, for leading on this group, and the noble Lord, Lord Clement-Jones, for his valuable comments on these important structures of data communities. Amendments 2, 3, 4 and 25 work in tandem and are designed to enable data communities, meaning associations of individuals who have come together and wish to designate a third party, to act on the group’s behalf in their data use.
There is no doubt that the concept of a data community is a powerful idea that can drive innovation and a great deal of value. I thank the noble Lord, Lord Clement-Jones, for cataloguing the many groups that have driven powerful thinking in this area, the value of which is very clear. However—and I keep coming back to this when we discuss this idea—what prevents this being done already? I realise that this may be a comparatively trivial example, but if I wanted to organise a community today to oppose a local development, could I not do so with an existing lawful basis for data processing? It is still not clear in what way these amendments would improve my ability to do so, or would reduce my administrative burden or the risks of data misuse.
I look forward to hearing more about this from the Minister today and, ideally, as the noble Baroness, Lady Kidron, said, in a briefing on the Government’s plan to drive this forward. However, I remain concerned that we do not necessarily need to drive forward this mechanism by passing new legislation. I look forward to the Minister’s comments.
Amendment 42 would require the Information Commissioner to draw up a code of practice setting out how data communities must operate and how data controllers and processors should engage with these communities. Amendment 43 would create a register of data communities and additional responsibilities for the data community controller. I appreciate the intent of the noble Baroness, Lady Kidron, in trying to ensure data security and transparency in the operation of data communities. If we on these Benches supported the idea of their creation in this Bill, we would surely have to implement mechanisms of the type proposed in these amendments. However, this observation confirms us in our view that the administration required to operate these communities is starting to look rather burdensome. We should be looking to encourage the use of data to generate economic growth and to make people’s lives easier. I am concerned that the regulation of data communities, were it to proceed as envisaged by these amendments, might risk doing just the opposite. That said, I will listen with interest to the response of noble Lords and the Minister.
My Lords, I thank the Minister for these two technical amendments. I take this opportunity to thank him also for responding to correspondence about LinesearchbeforeUdig and its wish to meet government and work with existing services to deliver what it describes as the safe digging elements of the NUAR. The Minister has confirmed that the heavy lifting on this—not heavy digging—will be carried out by the noble Baroness, Lady Jones, on her return, which I am sure she will look forward to. As I understand it, officials will meet LinesearchbeforeUdig this week, and they will look at the survey carried out by the service. We have made some progress since Committee, and I am grateful to the Minister for that.
My Lords, given that these are technical amendments, correcting wording errors, I have little to add to the remarks already made. We have no concerns about these amendments and will not seek to oppose the Government in making these changes.
My Lords, we have heard some of the really consistent advocates for children’s online protection today. I must say that I had not realised that the opportunity of signing the amendments of the noble Baroness, Lady Kidron, was rather like getting hold of Taylor Swift tickets—clearly, there was massive competition and rightly so. I pay tribute not only to the speakers today but in particular to the noble Baroness for all her campaigning, particularly with 5Rights, on online child protection.
All these amendments are important for protecting children’s data, because they address concerns about data misuse and the need for heightened protection for children in the digital environment, with enhanced oversight and accountability in the processing of children’s data. I shall not say very much. If the noble Baroness pushes Amendment 20 to a vote, I want to make sure that we have time before the dinner hour to do so, which means going through the next group very quickly. I very much hope that we will get a satisfactory answer from the Minister. The sage advice from the noble Lord, Lord Stevenson, hit the button exactly.
Amendment 20 is particularly important in this context. It seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A. As the noble Baroness explains, that means that personal data originally collected from a child with consent for a specific purpose could not be reused for a different, incompatible purpose without obtaining fresh consent, even if the child is now an adult. In my view, that is core. I hope the Minister will come back in the way that has been requested by the noble Lord, Lord Stevenson, so we do not have to have a vote. However, we will support the noble Baroness if she wishes to test the opinion of the House.
My Lords, I too thank the noble Baroness, Lady Kidron, for all her amendments in this group, and I thank the Minister for his amendment.
Amendment 15 seeks to maintain the high level of legal protection for children’s data even where protections for adults may be eased in the context of scientific research. I acknowledge the concerns raised about the potential implications that this amendment could have for medical research and safeguarding work. It is important to recognise that young people aged 16 and over are entitled to control their medical information under existing legal frameworks, reflecting their ability to understand and consent in specific contexts.
There is a legitimate concern that by excluding all children categorically, including those aged 16 and 17, we risk impeding critical medical research that could benefit young people themselves. Research into safeguarding may also be impacted by such an amendment. Studies that aim to improve systems for identifying and preventing abuse or neglect rely on the careful processing of children’s data. If this amendment were to inadvertently create a barrier to such vital work, we could find ourselves undermining some of the protections that it seeks to reinforce.
That said, the amendment highlights an important issue: the need to ensure that ethical safeguards for children remain robust and proportionate. There is no question that the rights and welfare of children should remain paramount in research contexts, but we must find the right balance—one that allows valuable, ethically conducted research to continue without eroding the legal protections that exist for children’s data. So I welcome the intent of the amendment in seeking to protect children, of course, and I urge us, as the noble Lord, Lord Stevenson, put it, to continue working collaboratively to achieve a framework that upholds their rights without hindering progress in areas that ultimately serve their best interests.
As with the previous amendment, I recognise the intent of Amendment 16, which seeks to protect children’s data by excluding them from the scope of recognised legitimate interests. Ensuring that children continue to benefit from the highest level of legal protection is a goal that, needless to say, we all share. However, I remain concerned that this could have less desirable consequences too, particularly in cases requiring urgent safeguarding action. There are scenarios where swift and proportionate data processing is critical to protecting a child at risk, and it is vital that the framework that we establish does not inadvertently create barriers to such essential work.
I am absolutely in support of Amendment 20. It provides an important safeguard by ensuring that children’s data is not used for purposes beyond those for which it was originally collected, unless it is fully compatible with the original purpose. Children are particularly vulnerable when it comes to data processing and their understanding of consent is limited. The amendment would strengthen protection for children by preventing the use of their data in ways that were not made clear to them or their guardians at the time of collection. It would ensure that children’s data remained secure and was not exploited for unrelated purposes.
On Amendment 22, the overarching duty proposed in this new clause—to prioritise children’s best interests and ensure that their data is handled with due care and attention—aligns with the objective that we all share of safeguarding children in the digital age. We also agree with the principle that the protections afforded to children’s data should not be undermined or reduced, and that those protections should remain consistent with existing standards under the UK GDPR.
However, although we support the intent of the amendment, we have concerns about the reference to the UN Convention on the Rights of the Child and general comment 25. Although these international frameworks are important, we do not believe they should be explicitly tied into this legislation. Our preference would be for a redraft of this provision that focused more directly on UK law and principles, ensuring that the protections for children’s data were robust and tailored to our legal context, rather than linking it to international standards in a way that could create potential ambiguities.
(1 month, 1 week ago)
Grand Committee
My Lords, the trouble with this House is that some have long memories. The noble Earl, Lord Erroll, reminded us all to look back, with real regret, at the Digital Economy Act and the failure to implement Part 3. I think that that was a misstep by the previous Government.
Like all of us, I warmly welcome the inclusion of data access provisions for researchers studying online safety matters in Clause 123 of the Bill. As we heard from the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, this was very much unfinished business from the Online Safety Act. However, I believe that, in order for the Bill to be effective and have the desired effect, the Government need to accept the amendments in the names of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. In terms of timeframe, the width of research possible, enforceability, contractual elements and location, they cover the bases extremely effectively.
The point was made extremely well by the noble Lords, Lord Bethell and Lord Russell, that we should not have to rely on brave whistleblowers such as Frances Haugen. We should be able to benefit from quality researchers, whether from academia or elsewhere, in order to carry out this important work.
My Amendment 198B is intended as a probing amendment about the definition of researchers under Clause 123, which has to be carefully drawn to allow for legitimate non-governmental organisations, academics and so on, but not so widely that it can be exploited by bad actors. For example, we do not want those who seek to identify potential exploits in a platform to use this by calling themselves “independent researchers” if they simply describe themselves as such. For instance, could Tommy Robinson seek to protect himself from liabilities in this way? After all, he called himself an “independent journalist” in another context when he clearly was not. I hope that when the Government come to draw up the regulations they will be mindful of the need to be very clear about what constitutes an independent or accredited researcher, or whatever phrase will be used in the context.
My Lords, although I have no amendments in this group, I will comment on some of them. I might jump around the order, so please forgive me for that.
Amendment 197 would change Clause 123 so that the Secretary of State must, as soon as reasonably practicable and no later than 12 months after the Act is passed, make regulations requiring regulated services to provide information for the purposes of research into online safety. This is clearly sensible. It would ensure that valuable research into online safety may commence as soon as possible, which would benefit us all, as speakers have made abundantly clear. To that end, Amendment 198D, which would ensure that researcher access is enforceable in the same way as other requirements under the Online Safety Act, would ensure that researchers can access valuable information and carry out their beneficial research.
I am still left with some curiosity on some of these amendments, so I will indicate where I have specific questions to those who have tabled them and hope they will forgive me if I ask to have a word with them between now and Report, which would be very helpful. In that spirit, I turn to Amendment 198B, which would allow the Secretary of State to define the term “independent researcher”. I ask the noble Lord, Lord Clement-Jones, who tabled the amendment, whether he envisages the Secretary of State taking advice before making such regulations and, if so, from whom and in what mechanism. I recognise that it is a probing amendment, but I would be keen to understand more.
I am also keen to understand further from my noble friend Lord Bethell and the noble Baroness, Lady Kidron, why, under Amendment 198A, the Secretary of State would not be able to make regulations providing for independent research into the “enforcement of requirements” under these regulations. Again, I look forward to discussing that with them.
I have some concerns about Amendment 198, which would require service providers to give information pertaining to age, stage of development, gender, race, ethnicity, disability and sexuality to researchers. I understand the importance of this but my concern is that it would require the disclosure of special category data to those researchers. I express reservations, especially if the data pertains to children. Do we have the right safeguards in place to address the obviously heightened risks here?
Additionally, I have some concerns about the provisions suggested in Amendment 198E. Should we allow researchers from outside the United Kingdom to require access to information from regulated service providers? Could this result in data being transferred into jurisdictions where there are less stringent data protection laws?
My Lords, we have had some powerful speeches in this group, not least from the noble Baronesses, Lady Kidron and Lady Owen, who drafted important amendments that respond to the escalating harms caused by AI-generated sexual abuse material relating to children and adults. The amendment from the noble Baroness, Lady Kidron, would make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material. As she outlined and the noble Lord, Lord Bethell, confirmed, it specifically would become an offence to create, train or distribute generative AI models that enable the creation of computer-generated CSAM or priority illegal content; to train AI models on CSAM or priority illegal content; or to possess AI models that produce CSAM or priority illegal content.
This amendment responds to a growing problem, as we have heard, around computer-generated sexual abuse material and a gap in the law. There is a total lack of safeguards preventing bad actors creating sexual abuse imagery, and it is causing real harm. Sites enabling this abuse are offering tools to harm, humiliate, harass, coerce and cause reputational damage. Without robust legal frameworks, victims are left vulnerable while perpetrators operate with impunity.
The noble Lord, Lord Bethell, mentioned the Internet Watch Foundation. In its report of July, One Step Ahead, it reported on the alarming rise of AI-generated CSAM. In October 2023, in How AI is Being Abused to Create Child Sexual Abuse Imagery, it made recommendations to the Government regarding legislation to strengthen legal frameworks to better address the evolving landscape of AI-generated CSAM and enhance preventive measures against its creation and distribution. It specifically recommended:
“That the Government legislates to make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material”.
The noble Baroness, Lady Kidron, tabled such an amendment to the previous Bill. As she said, she was successful in persuading the then Government to accept it; I very much hope that she will be as successful in persuading this Government to accept her amendment.
Amendments 211G and 211H in the name of the noble Baroness, Lady Owen, are a response to the extraordinary fact that one in 14 adults has experienced threats to share intimate images in England and Wales; that rises to one in seven among young women. Research from Internet Matters shows that 49% of young teenagers in the UK aged between 13 and 16—around 750,000 children—said that they were aware of a form of image-based abuse being perpetrated against another young person known to them.
We debated the first of the noble Baroness’s amendments, which is incorporated in her Bill, last Friday. I entirely agree with the noble Lord, Lord Knight; I did not find the Government’s response at all satisfactory. I hope that, in the short passage of time between then and now, they have had time to be at least a little agile, as he requested. UK law clearly does not effectively address non-consensual intimate images. It is currently illegal to share or threaten to share non-consensual intimate images, including deepfakes, but creating them is not yet illegal; this means that someone could create a deepfake image of another person without their consent and not face legal consequences as long as they do not share, or threaten to share, it.
This amendment is extremely welcome. It addresses the gap in the law by criminalising the creation of non-consensual intimate images, including deepfakes. It rightly targets deepfakes due to their rising prevalence and potential for harm, particularly towards women. Research shows that 98% of deepfake videos online are pornographic, with 99% featuring women and girls. This makes it an inherently sexist problem that is a new frontier of violence against women—words that I know the noble Baroness has used.
I also very much welcome the new amendment not contained in her Bill, responding to what the noble Baroness, Lady Gohir, said at its Second Reading last Friday about including audio deepfakes. The words “shut down every avenue”, which I think were used by the noble Baroness, Lady Gohir, are entirely apposite in these circumstances. Despite what the noble Lord, Lord Ponsonby, said on Friday, I hope that the Government will accept both these amendments and redeem their manifesto pledge to ban the creation of sexually explicit deepfakes, whether audio or video.
My Lords, the current law does not sufficiently protect children from AI-driven CSAM because it is simply such a fast-moving issue. It is a sobering thought that, of all the many wonderful developments of AI that many of us have been predicting and speculating on for so long, CSAM is really driving the technology forward. What a depressing reflection that is.
Overall, AI is developing at an extraordinarily rapid pace and has come with a number of concerning consequences that are not all yet fully understood. However, it is understood that child sexual abuse is completely unacceptable in any and all contexts, and it is right that our law should be updated to reflect the dangers that have increased alongside AI development.
Amendment 203 seeks to create a specific offence for using personal data or digital information to create or facilitate the creation of computer-generated child sexual abuse material. Although legislation is in place to address possessing or distributing such horrendous material, we must prioritise the safety of children in this country and take the law a step further to prevent its creation. Our children must be kept safe and, subject to one reservation, which I will come to in a second, I support the amendment from the noble Baroness, Lady Kidron, to further protect them.
That reservation comes in proposed new subsection (1)(c), which includes in the offence the act of collating files that, when combined, enable the creation of sexual abuse material. This is too broad. A great deal of the collation of such material can be conducted by innocent people using innocent materials that are then corrupted or given more poisonous aspects by further training, fine-tuning or combination with other materials by more malign actors. I hope there is a way we can refine this proposed new paragraph on that basis.
Unfortunately, adults can also be the targets of individuals who use AI to digitally generate non-consensual explicit images or audio files of an individual, using their likeness and personal data. I am really pleased that my noble friend Lady Owen tabled Amendments 211G and 211H to create offences for these unacceptable, cruel acts. I support these amendments unambiguously.
(1 month, 2 weeks ago)
Grand Committee

My Lords, the debate on this group emphasises how far behind the curve we are, whether it is by including new provisions in this Bill or by bringing forward an AI Bill—which, after all, was promised in the Government’s manifesto. It emphasises that we are not moving nearly fast enough in thinking about the implications of AI. While we are doing so, I need to declare an interest as co-chair of the All-Party Parliamentary Group on AI and a consultant to DLA Piper on AI policy and regulation.
I have followed the progress of AI since 2016 in the capacity of co-chair of the all-party group and chair of the AI Select Committee. We need to move much faster on a whole range of different issues. I very much hope that the noble Lord, Lord Vallance, will be here on Wednesday, when we discuss our crawler amendments, because although the noble Lord, Lord Holmes, has tabled Amendment 211A, which deals with personality rights, there is also extreme concern about the whole area of copyright. I was tipped off by the noble Lord, Lord Stevenson, so I was slightly surprised that he did not draw our attention to it: we are clearly due the consultation on intellectual property at any moment, but there seems to be some proposal within it for personality rights themselves. Whether that is a quid pro quo for a much-weakened situation on text and data mining, I do not know, but something appears to be moving out there which may become clear later this week. It seems a strange time to issue a consultation, but I recognise that it has been somewhat delayed.
In the meantime, we are forced to put forward amendments to this Bill trying to anticipate some of the issues that artificial intelligence is increasingly giving rise to. I strongly support Amendments 92, 93, 101 and 105 put forward by the noble Viscount, Lord Colville, to prevent misuse of Clause 77 by generative AI developers; I very much support the noble Lord, Lord Holmes, in wanting to see protection for image, likeness and personality; and I very much hope that we will get a positive response from the Minister in that respect.
We have heard from the noble Baronesses, Lady Kidron and Lady Harding, and the noble Lords, Lord Russell and Lord Stevenson, all of whom have made powerful speeches on previous Bills—the then Online Safety Bill and the Data Protection and Digital Information Bill—to say that children should have special protection in data protection law. As the noble Baroness, Lady Kidron, says, we need to move on from the AADC. That was a triumph she gained during the passage of the Data Protection Act 2018, but six years later the world looks very different and young people need protection from AI models of the kind she has set out in Amendment 137. I agree with the noble Lord, Lord Stevenson, that we need to talk these things through. If it produces an amendment to this Bill that is agreed, all well and good, but it could mean an amendment or part of a new AI Bill when that comes forward. Either way, we need to think constructively in this area because protection of children in the face of generative AI models, in particular, is extremely important.
This group, looking forward to further harms that could be caused by AI, is extremely important on how we can mitigate them in a number of different ways, despite the fact that these amendments appear to deal with quite a disparate group of issues.
My Lords, I too thank all noble Lords for their insightful contributions to this important group of amendments, even if some of them bemoaned the fact that they have had to repeat themselves over the course of several Bills. I am also very heartened to see how many people have joined us for Committee today. I have been involved in only two of these sittings, but this is certainly a record, and on present trends it is going to be standing room only, which is all to the good.
I have two observations before I start. First, we have to acknowledge that perhaps this area is among the most important we are going to discuss. The rights and protections of data subjects, particularly children, are in many ways the crux of all this and we have to get it right. Secondly, I absolutely take on board that there is a real appetite to get ahead on AI legislation. I have an amendment that I am very excited about, which we will come to later, particularly on ADM, and there will be others as well, but I agree that we need to get going on that.
Amendment 92 in the names of the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, seeks to reduce the likelihood of the misuse of Clause 77 by AI model developers who may seek to claim that they do not need to notify data subjects of reuse for scientific purposes under that clause. This relates to the way that personal data is typically collected and processed for AI development. Amendment 93 similarly seeks to reduce the possibility of misuse of Clause 77 by model developers who could claim they do not need to notify data subjects of reuse for scientific purposes. Amendment 101 also seeks to address the potential misuse of Clause 77 by the developers, as does Amendment 105. I strongly support the intent of the amendments from the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, in seeking to maintain and make provisions for the rights and protections of data subjects, and look forward very much to hearing the views of the Minister.
I turn to Amendment 137 in the names of the noble Lords, Lord Russell and Lord Stevenson, and the noble Baronesses, Lady Kidron and Lady Harding. This amendment would require the commissioner to prepare and produce a code of practice which ensures that data processors prioritise the interests, rights and freedoms of children. It goes without saying that the rights and protection of children are of utmost importance. Certainly, this amendment looks to me not only practical but proportionate, and I support it.
Finally, Amendment 211A in the name of my noble friend Lord Holmes ensures the prohibition of
“the development, deployment, marketing and sale of data related to an individual’s image, likeness or personality for AI training”
without that person’s consent. Like the other amendments in this group, this makes provision to strengthen the rights and protections of data subjects against the potential misuse or sale of data and seems entirely sensible. I am sure the Minister has listened carefully to all the concerns powerfully raised from all sides of the Committee today. It is so important that we do not lose sight of the importance of the rights and protection of data subjects.
My Lords, I welcome the amendments spoken to so well by the noble Baroness, Lady Harding, regarding the open electoral register. They are intended to provide legal certainty around the use of the register, without compromising on any aspect of the data privacy of UK citizens or risking data adequacy. The amendments specify that companies are exempt from the requirement to provide individuals with information in cases where their personal data has not been obtained directly from them if that data was obtained from the open electoral register. They also provide further clarification on what constitutes “disproportionate effort” under new paragraph 5(e) of Article 14 of GDPR.
The noble Baroness covered the ground so effectively that all I need to add is that the precedent established by the current interpretation by the tribunal will affect not only the open electoral register but other public sources of data, including the register of companies, the Registry of Judgments, Orders and Fines, the Land Registry and the Food Standards Agency register. Importantly, it may even prevent the important work being done to create a national data library from achieving its objectives of public sector data sharing. It will have far-reaching implications if we do not change the Bill in the way that the noble Baroness has put forward.
I thank the noble Lord, Lord Lucas, for his support for Amendment 160. I reciprocate in supporting—or, at least, hoping that we get clarification as a result of—his Amendments 158 and 161.
Amendment 159B seeks to ban what are colloquially known as cookie paywalls. As can be seen, it is the diametric opposite of Amendment 159A, tabled by the noble Viscount, Lord Camrose. For some unaccountable reason, cookie paywalls require a person who accesses a website or app to pay a fee to refuse consent to cookies being stored on, or accessed from, their device. Some of these sums can be exorbitant, so I was rather surprised by the noble Viscount’s counter-amendment.
Earlier this year, the Information Commissioner launched a call for views which looked to obtain a range of views on its regulatory approach to consent or pay models under data protection law. The call for views highlighted that organisations that are looking to adopt, or have already adopted, a consent-or-pay model must consider the data protection implications.
Cookie paywalls are a scam and reduce people’s power to control their data. I wonder why someone must pay if they do not consent to cookies being stored or accessed. The PEC regulations do not currently prohibit cookie paywalls. The relevant regulation is Regulation 6, which is due to be substituted by Clause 111, and is supplemented by new Schedule A1 to the PEC regulations, as inserted by Schedule 12 to the Bill. The regulation, as substituted by Clause 111 and Schedule 12, does not prohibit cookie paywalls. This comes down to the detail of the regulations, both as they currently stand and as they will be if the Bill remains as drafted: the regulation is drafted in terms that do not prevent a person signifying lack of consent to cookies, and a provider may add or set controls—namely, by imposing requirements—on how a person may signify that lack of consent. Cookie paywalls would therefore be completely legal, and they certainly have proliferated online.
This amendment makes it crystal clear that a provider must not require a person to pay a fee to signify lack of consent to their data being stored or accessed. This would mean that, in effect, cookie paywalls would be banned.
Amendment 160 is sought by the Advertising Association. It seeks to ensure that the technical storage of or access to information is considered necessary under paragraph 5 of the new Schedule A1 to the PEC regulations inserted by Schedule 12 if it would support measurement or verification of the performance of advertising services to allow website owners to charge for their advertising services more accurately. The Bill provides practical amendments to the PEC regulations through listing the types of cookies that no longer require consent.
This is important, as not all cookies should be treated the same and not all carry the same high-level risks to personal privacy. Some are integral to the service and the website itself and are extremely important for subscription-free content offered by publishers, which is principally funded by advertising. Introducing specific and targeted cookie exemptions has the benefit of, first, simplifying the cookie consent banner, and, secondly, increasing further legal and economic certainty for online publishers. As I said when we debated the DPDI Bill, audience measurement is an important function for media owners to determine the consumption of content, to be able to price advertising space for advertisers. Such metrics are crucial to assess the effectiveness of a media channel. For sites that carry advertising, cookies are used to verify the delivery and performance of a digital advertisement—that is, confirmation that an ad has been served or presented to a user and whether it has been clicked on. This is essential information to invoice an advertiser accurately for the number of ad impressions in a digital ad campaign.
However, my reading of the Bill suggests that audience measurement cookies would be covered by the list of exemptions from consent under Schedule 12. Can the Government confirm this? Is it the Government’s intention to use secondary legislation in future to exempt ad performance cookies?
Coming to Amendment 162, relating to the soft opt-in, I am grateful to the noble Lord, Lord Black of Brentwood, and the noble Baroness, Lady Harding of Winscombe, for their support. This amendment would enable charities to communicate to donors in the same way that businesses have been able to communicate to customers since 2003. It would help to facilitate greater fundraising and support the important work that charities do for society. I can do no better than quote from the letter that was sent to Secretary of State Peter Kyle on 25 November, which was co-ordinated by the DMA and involved nearly 20 major charities, seeking support for reinstating the original Clause 115 of the DPDI Bill into this Bill:
“Clause 115 of the previous DPDI Bill extended the ‘soft opt-in’ for email marketing for charities and non-commercial organisations. The DMA estimates that extending the soft opt-in to charities would increase annual donations in the UK by £290 million”,
based on analysis of 13.1 million donors by the Salocin Group. The letter continues:
“At present, the DUA Bill proposals remove this. The omission of the soft opt-in will prevent charities from being able to communicate to donors in the same way as businesses can. As representatives of both corporate entities and charitable organisations, it is unclear to the DMA why charities should be at a disadvantage in this regard”.
I hope that the Government will listen to the DMA and the charities involved.
I thank noble Lords for their comments and contributions. I shall jump to Amendments 159B and 159A, one of which is in my name and both of which are concerned with cookie paywalls. I am not sure that I have properly understood the objection to cookie paywalls. Do they not simply offer users three choices: pay money and stay private; share personal data and read for free; or walk away? So many times, we have all complained about the fact that these websites harvest our data and now, for the first time, this approach sets a clear cash value on the data that they are harvesting and offers us the choice. The other day somebody sent me a link from the Sun. I had those choices. I did not want to pay the money or share my data, so I did not read the article. I feel this is a personal decision, supported by clear data, which it is up to the individual to take, not the Government. I do not think we should take away this choice.
Let me turn to some of the other amendments in this group. Amendment 161 in the name of my noble friend Lord Lucas is, if I may say so, a thoughtful amendment. It would allow pension providers to communicate information on their product. This may mean that the person who will benefit from that pension does not miss out on useful information that would benefit their saving for retirement. Given that pension providers already hold the saver’s personal data, it seems to be merely a question of whether this information is wanted; of course, if it is not, the saver can simply opt out.
Amendment 162 makes an important point: many charities rely on donations from the public. Perhaps we should consider bringing down the barriers to contacting people regarding fundraising activities. At the very least, I am personally not convinced that members of the public have different expectations around what kinds of organisation can and cannot contact them and in what circumstances, so I support any step that simplifies the—to my mind—rather arbitrary differences in the treatment of business and charity communications.
Amendment 104 certainly seems a reasonable addition to the list of what might constitute “unreasonable effort” if the information is already public. However, I have some concerns about Amendments 98 and 100 to 103. For Amendment 98, who would judge the impact on the individual? I suspect that the individual and the data controllers may have different opinions on this. In Amendment 100, the effort and cost of compliance are thorny issues that would surely be dictated by the nature of the data itself and the reason for providing it to data subjects. In short, I am concerned that the controllers’ view may be more subjective than we would want.
On Amendment 102, again, when it comes to providing information to them,
“the damage and distress to the data subjects”
is a phrase on which the subject and the controller will almost inevitably have differing opinions. How will these be balanced? Additionally, one might presume that information that is either damaging or distressing to the data subjects should not necessarily be withheld from them as it is likely to be extremely important.
My Lords, none of us can be under any illusion about the growing threats of cyberattacks, whether from state actors, state-affiliated actors or criminal gangs. It is pretty unusual nowadays to find someone who has not received a phishing email, had hackers target an account or been promised untold riches by a prince from a faraway country. But, while technology has empowered these criminals, it is also the most powerful tool we have against them. To that end, we must do all we can to assist the police, the NCA, the CPS, the SIS and their overseas counterparts in countries much like our own. That said, we must also balance this assistance with the right of individuals to privacy.
Regarding the Clause 81 stand part notice from the noble Lord, Lord Clement-Jones, I respectfully disagree with this suggestion. If someone within the police were to access police records in an unauthorised capacity or for malign reasons, I simply doubt that they would be foolish enough to enter their true intentions into an access log. They would lie, of course, rendering the log pointless, so I struggle to see—we had this debate on the DPDI Bill—how this logging system would help the police to identify unauthorised access to sensitive data. It would simply eat up hours of valuable police time. I remember from our time working on the DPDI Bill that the police supported this view.
As for Amendment 124, which allows for greater collaboration between the police and the CPS when making charging decisions, there is certainly something to be said for this principle. If being able to share more detailed information would help the police and the CPS come to the best decision for victims, society and justice, then I absolutely support it.
Amendments 126, 128 and 129 seek to keep the UK in close alignment with the EU regarding data sharing. EU alignment or non-alignment is surely a decision for the Government of the day alone. We should not look to bind a future Administration to the EU.
I understand that Amendment 127 looks to allow data transfers to competent authorities—that is, law enforcement bodies in other countries—that may have a legitimate operating need. Is this not already the case? Are there existing provisions in the Bill to facilitate such transfers and, if so, does this not therefore duplicate them? I would very much welcome the thoughts of both the Minister and the noble Lord, Lord Clement-Jones, when he sums up at the end.
Amendment 156A would add to the definition of “unauthorised access” so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for the access, and the person is not empowered to access it by an enactment. Given the amount of valuable personal data held by controllers as our lives continue to move online, there is real merit to this idea from my noble friend Lord Holmes, and I look forward to hearing the views of the Minister.
Finally, I feel that Amendment 210 from my noble friend Lady Owen—ably supported in her unfortunate absence by the noble Baroness, Lady Kidron—is an excellent amendment, as it would prevent a person convicted of a sexual offence from retaining the images that breached the law. This would stop them continuing to use the images for their own ends and sharing them further. It would help the victims of these crimes regain control of those images, which, I hope, would be of great value to those affected. I hope that the Minister will give this serious consideration, particularly in light of noble Lords’ very positive response to my noble friend’s Private Member’s Bill at the end of last week.
I think the noble Viscount, Lord Camrose, referred to Amendment 156A from the noble Lord, Lord Holmes—I think he will find that is in a future group. I saw the Minister looking askance because I doubt whether she has a note on it at this stage.
I thank the noble Lord, Lord Clement-Jones; let me consider it a marker for future discussion.
I thank the noble Lord, Lord Clement-Jones, for coming to my rescue there.
I turn to the Clause 81 stand part notice tabled by the noble Lord, Lord Clement-Jones, which would remove Clause 81 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record their processing activities, including their reasons for accessing and disclosing personal information. Entering a justification manually was intended to help detect unauthorised access. The noble Lord was right that the police do sometimes abuse their power; however, I agree with the noble Viscount, Lord Camrose, that the reality is that anyone accessing the system unlawfully is highly unlikely to record that, making this an ineffective safeguard.
Meanwhile, the position of the National Police Chiefs’ Council is that this change will not impede any investigation concerning the unlawful processing of personal data. Clause 81 does not remove the strong safeguards that ensure accountability for data use by law enforcement, which include the requirement to record the time, the date and, where possible, who has accessed the data; these are far more effective in monitoring potential data misuse. We would argue that the requirement to manually record a justification every time case information is accessed places a considerable burden on policing. I think the noble Lord himself said that we estimate that this clause may save approximately 1.5 million policing hours, equivalent to a saving in the region of £42.8 million a year.
(1 month, 3 weeks ago)
Grand Committee

Many thanks to the noble Lords who have spoken in this debate and to the noble Lord, Lord Freyberg, for his Amendment 60. Before I start, let me endorse and add my name to the request for something of a briefing about the AI Bill. I am concerned that we will put a lot of weight of expectation on that Bill. When it comes, if I understand this right, it will focus on the very largest AI labs and may not necessarily get to all the risks that we are talking about here.
Amendment 60 seeks to ensure that the Bill does not allow privately funded or commercial activities to be considered scientific research in order
“to avert the possibility that such ventures might benefit from exemptions in copyright law relating to data mining”.
This is a sensible, proportionate measure to achieve an important end, but I have some concerns about the underlying assumption as it strikes me: there is a filtering criterion of whether or not the research is taxpayer funded, which feels like a slightly crude means of predicting the propensity to infringe copyright. I do not know where to take that, so I shall leave it there for the moment.
Amendment 61 in my name would ensure that data companies cannot justify data scraping for AI training as scientific research. As many of us said in our debate on the previous group, as well as in our debate on this group, the definition of “scientific research” in the Bill is extremely broad. I very much take on board the Minister’s helpful response on that but, I must say, I continue to have some concerns about the breadth of the definition. The development of AI programs, funded privately and as part of a commercial enterprise, could be considered scientific, so I believe that this definition is far too broad, given that Article 8A(3), to be inserted by Clause 71(5), states:
“Processing of personal data for a new purpose is to be treated as processing in a manner compatible with the original purpose where … the processing is carried out … for the purposes of scientific research”.
Tightening up the definition of “scientific research” to exclude activities that are primarily commercial would prevent companies from creating a scientific pretence for research that is wholly driven by commercial gain rather than by furthering our collective knowledge. I would argue that, if we wish to allow these companies to build and train AI—we must, or others will—we must put in place proper safeguards for people’s data. Data subjects should have the right to consent to their data being used in such a manner.
Amendment 65A in the name of my noble friend Lord Holmes would also take steps to remedy this concern. I believe that this amendment would work well in tandem with Amendment 61. It makes it absolutely clear that we expect AI developers to obtain consent from data subjects before they use or reuse their data for training purposes. For now, though, I shall not press my amendment.
My Lords, I share the confusion of the noble Baroness, Lady Kidron, about the groupings. If we are not careful, we are going to keep returning to this issue again and again over four or five groups.
With the possible exception of the noble Lord, Lord Lucas, I think that we are all very much on the same page here. As for the suggestion from the noble Viscount, Lord Colville, that we meet to discuss the precise issue of the definition of “scientific research”, that would be extremely helpful; the noble Baroness and I do not need to repeat the concerns.
I should declare an interest in two respects: first, my interests as regards AI, which are set out on the register; and, secondly—I very much took account of what the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, had to say—I chair the council of a university that has a strong health faculty. It does a great deal of health research and a lot of that research relies on NHS datasets.
This is not some sort of Luddism we are displaying here. This is caution about the expansion of the definition of scientific research, so that it does not turn into something else: that it does not deprive copyright holders of compensation, and that it does not allow personal data to be scraped off the internet without consent. There are very legitimate issues being addressed here, despite the fact that many of us believe that this valuable data should of course be used for the public benefit.
One of the key themes—this is perhaps where we come back on to the same page as the noble Lord, Lord Lucas—may be public benefit, which we need to reintroduce so that we really understand that scientific research for public benefit is the purpose for which we want this data used.
I do not think I need to say much more: this issue is already permeating our discussions. It is interesting that we did not get on to it in a major way during the DPDI Bill, yet this time we have focused much more heavily on it. Clearly, in opposition, the noble Viscount has seen the light. What is not to like about that? Further discussion, not least of the amendment of the noble Baroness, Lady Kidron, further down the track will be extremely useful.
I start by speaking to two amendments tabled in my name.
Amendment 91 seeks to change
“the definition of request by data subjects to data controllers”
that can be declined or
“for which a fee can be charged from ‘manifestly unfounded or excessive’ to ‘vexatious or excessive’”.
I am sure that many of us will remember, without a great deal of fondness, our debates on these terms in the DPDI Bill. When we debated this issue at that time, it was, rather to my regret, often presented as a way to reduce protections and make it easier to decline or charge a fee for a subject access request. In fact, the purpose was to try to filter out cynical or time-wasting requests, such as attempts to bypass legal due process or to bombard organisations with vast quantities of essentially meaningless access requests. Such requests are not unfounded but they are harmful; by reducing them, we would give organisations more time and capacity to respond to well-founded requests. I realise that I am probably on a loser on this one but let me encourage noble Lords one last time to reconsider their objections and take a walk on the vexatious side.
Amendment 97 would ensure that
“AI companies who process data not directly obtained from data subjects are required to provide information to data subjects where possible. Without this amendment, data subjects may not know their data is being held”.
If a subject does not even know that their data is being held, they cannot enforce their data rights.
Amendment 99 follows on from that point, seeking to ensure that AI companies using large datasets cannot avoid providing information to data subjects on the basis that their datasets are too large. Again, if a subject does not know that their data is being held, they cannot enforce their rights. Therefore, it is really important that companies cannot avoid telling individuals about their personal data and the way in which it is being used because of sheer weight of information. These organisations are specialists in such processing of huge volumes of data, of course, so I struggle to accept that this would be too technically demanding for them.
Let me make just a few comments on other amendments tabled by noble Lords. Under Amendment 107, the Secretary of State would have
“to publish guidance within six months of the Act’s passing to clarify what constitutes ‘reasonable and proportionate’ in protection of personal data”.
I feel that this information should be published at the same time as this Bill comes into effect. It serves no purpose to have six months of uncertainty.
I do not believe that Amendment 125 is necessary. The degree to which the Government wish to align—or not—with the EU is surely a matter for the Government and their priorities.
Finally, I was struck by the interesting point that the noble and learned Lord, Lord Thomas, made when he deplored the Bill’s incomprehensibility. I have extremely high levels of personal sympathy with that view. To me, the Bill is the source code. There is a challenge in making it comprehensible and communicating it in a much more accessible way once it goes live. Perhaps the Minister can give some thought to how that implementation phase could include strong elements of communication. While that does not make the Bill any easier to understand for us, it might help the public at large.
My Lords, the problem is that I have a 10-minute speech and there are five minutes left before Hansard leaves us, so is it sensible to draw stumps at this point? I have not counted how many amendments I have, but I also wish to speak to the amendment by the noble and learned Lord, Lord Thomas. I would have thought it sensible to break at this point.
(1 month, 4 weeks ago)
Grand Committee
I thank the noble Lord, Lord Clement-Jones, for these amendments. Amendment 46 is about NUAR and the requirement to perform consultation first. I am not convinced that is necessary because it is already a requirement to consult under Clause 60 and, perhaps more pertinently, NUAR is an industry-led initiative. It came out of an industry meeting and has been led by industry throughout. I am therefore not sure, even with the requirement to consult, that much is going to come out of that consultation exercise.
In respect of other providers out there, LSBUD among them, when we were going through this exact debate in DPDI days, the offer I made—and I ask the Minister if she would consider doing the same—was to arrange a demonstration of NUAR to anyone who had not seen it. I have absolutely unshakeable confidence that anybody who sees NUAR in action will not want anything else. I am not a betting man, but—
For the record, the noble Viscount is getting a vigorous nod from the Minister.