Lords Chamber
I thank the right reverend Prelate for that important question. Trust is key to all this, and it is why we are committed to maintaining high standards of data protection in whichever context the AI system is deployed. The right reverend Prelate is quite right to raise the question of the NHS, where AI is already being used to read scans, to reduce missed appointments and to advance pathology services, many of these being narrow AI uses that are extremely important.
My Lords, in opposition and in government, the party opposite has promised an AI Bill, but it continues to say very little about what it will do. This uncertainty is creating real challenges for AI labs and their customers, as well as for copyright holders and civil society groups. In short, everyone needs to feel more confident about the scope, the timing and the intentions of the Bill. What can the Minister say here and now to reassure us that there is actually a plan?
As the noble Viscount says, this is an urgent matter. A summit is going on in Paris at the moment discussing many of these issues. We remain committed to bringing forward legislation. We are continuing to refine the proposals and look forward to engaging extensively in due course to ensure that our approach is future-proofed and effective against what is a fast-evolving technology.
Grand Committee
I thank all noble Lords for their uniformly brilliant contributions to this important debate. I particularly thank the noble Lord, Lord Foster, for securing this debate and introducing it so powerfully. To start with a statement of the obvious: artificial intelligence can do us great good and great harm. I know we are here mainly to avert the latter, but I open with a few thoughts on the former.
I should like to make two points in particular. First, the UK is often said to have a productivity problem and AI, even at its current level of capability, offers a great chance to fix this by automating routine tasks, improving decision-making and streamlining workflows. Secondly, it was often said, since the early days of e-commerce, that innovative use of technology was the preserve of the private sector, whereas the public sector was less nimble and consequently less productive. Those days must soon be over. Some of the best datasets, especially in this country, are public: health, education and geospatial in particular. Safely exploiting them will require close public-private collaboration, but if we are able to do so—and, I stress, do so safely—the productivity rewards will be extraordinary. This is why we, on these Benches, greatly welcome the AI action plan.
AI’s potential to revolutionise how we work and create is undeniable. In the creative industries, we have already seen its impact, with more than 38% of businesses incorporating AI technologies into their operations as of late last year. Whether in music, publishing, design or film, AI offers tools that enhance productivity, enable innovation, and open new markets. However, the key to all these prizes is public acceptance, the key to public acceptance is trustworthiness, and the key to trustworthiness is not permitting the theft of any kind of property, physical or intellectual.
This brings us to copyright and the rights of creators whose works underpin many of these advances. Copyright-protected materials are often used to train AI systems, too often without the permission, or even knowledge, of creators. Many persuasive and powerful voices push for laws, or interpretations of laws, in this country that prevent this happening. If we are able to create such laws, or such interpretations, I am all for them. I am worried, however, about creating laws we cannot enforce, because copyright can be enforced only if we know it has been infringed.
The size and the international distribution of AI training models render it extremely challenging to answer the two most fundamental questions, as I said on Tuesday. First, was a given piece of content used in a training model? Secondly, if so, in what jurisdiction did this take place? An AI lab determined to train a model on copyrighted content can do so in any jurisdiction of its choice. It may or may not choose to advise owners of scraped content, but my guess is that for a large model of 100 billion parameters, the lab might not be as assiduous in this as we would like. So, enforcement remains a significant challenge. A regulatory framework that lacks clear, enforceable protections risks being worse than ineffective in practice: it risks creating false confidence that eventually kills trust in, and public acceptance of, AI.
So, although we welcome the Government’s decision to launch a public consultation to address these challenges, it is vital that it leads to an outcome that does three things. First, needless to say, it must protect products of the mind from unlawful exploitation. Secondly, it must continue to allow AI labs to innovate, preferably in the UK. Thirdly, it must be enforceable. We all remember vividly Tuesday’s debate on Report of the DUA Bill. I worry that there is a pitfall in seeing AI and copyright policy as a zero-sum struggle between the first two of those objectives. I urge noble Lords, especially the Minister, to give equal emphasis and priority to all three of those goals.
I shall close with a few words on standards. As the Minister has rightly recognised, the key to an enforceable regime is internationally recognised technical standards, particularly, as I have argued, on digital watermarks to identify copyrighted content. A globally recognised, machine-readable watermark can alert scraping algorithms to copyrighted materials and alert rights holders to the uses of their materials. It may even allow rights holders to reserve their rights, opt out automatically or receive royalties automatically. In Tuesday’s debate, I was pleased to hear the Minister confirm that the Government will consider such standards as part of the consultation response.
Of course, the challenge here is that any such standards are—this is the bluntest possible way I can put it—either internationally observed and accepted or pointless. In this country, we have an opportunity to take the lead on creating them, just as we took the lead on setting standards for frontier AI safety in 2023 at Bletchley Park. I urge the Minister to strain every sinew to develop international standards. I say now that I and my party are most willing to support and collaborate on the development of such standards.
Lords Chamber
My Lords, this has been a very interesting debate. I too congratulate the noble Baroness, Lady Owen, on having brought forward these very important amendments. It has been a privilege to be part of her support team and she has proved an extremely persuasive cross-party advocate, including in being able to bring out the team: the noble Baroness, Lady Kidron, the noble Lord, Lord Pannick, who has cross-examined the Minister, and the noble Lord, Lord Stevenson. There is very little to follow up on what noble Lords have said, because the Minister now knows exactly what he needs to reply to.
I was exercised by this rather vague issue of whether the elements that were required were going to come back at Third Reading or in the Commons. I did not think that the Minister was specific enough in his initial response. In his cross-examination, the noble Lord, Lord Pannick, really went through the key elements that were required, such as the no intent element, the question of reasonable excuse and how robust that was, the question of solicitation, which I know is very important in this context, and the question of whether it is really an international law matter. I have had the benefit of talking to the noble Lord, Lord Pannick, and surely the mischief is delivered and carried out here, so why is that an international law issue? There is also the question of deletion of data, which the noble Lord has explained pretty carefully, and the question of timing of knowledge of the offence having been committed.
The Minister needs to describe the stages at which those various elements are going to be contained in a government amendment. I understand that there may be a phasing, but there are a lot of assurances. As the noble Lord, Lord Stevenson, said, is it six or seven? How many assurances are we talking about? I very much hope that the Minister can see the sentiment and the importance we place on his assurances on these amendments, so I very much hope he is going to be able to give us the answers.
In conclusion, as the noble Baroness, Lady Morgan, said—and it is no bad thing to be able to wheel on a former Secretary of State at 9 o’clock in the evening—there is a clear link between gender-based violence and image-based abuse. This is something which motivates us hugely in favour of these amendments. I very much hope the Minister can give more assurance on the audio side of things as well, because we want future legislation to safeguard victims, improve prosecutions and deter potential perpetrators from committing image-based and audio-based abuse crimes.
I thank the Minister and my noble friend Lady Owen for bringing these amendments to your Lordships’ House. Before I speak to the substance of the amendments, I join others in paying tribute to the tenacity, commitment and skill that my noble friend Lady Owen has shown throughout her campaign to ban these awful practices. She not only has argued her case powerfully and persuasively but, as others have remarked, seems to have figured out the machinery of this House in an uncanny way. Whatever else happens, she has the full support of these Benches.
I am pleased that the Government have engaged constructively with my noble friend and are seeking to bring this back at Third Reading. The Minister has been asked some questions and we all look forward with interest to his responses. I know from the speeches that we have heard that I am not alone in this House in believing that we have an opportunity here and now to create these offences, and we should not delay. For the sake of the many people who have been, and will otherwise be, victims of the creation of sexually explicit deepfakes, I urge the Government to continue to work with my noble friend Lady Owen to get this over the line as soon as possible.
My Lords, I thank the noble Lord, Lord Bassam, for retabling his Committee amendment, which we did not manage to discuss. Sadly, it always appears to be discussed rather late in the evening, but I think that the time has come for this concept and I am glad that the Government are willing to explore it.
I will make two points. Many countries worldwide, including in the EU, have their own version of the smart fund to reward creators and performers for the private copy and use of their works and performances. Our own CMS Select Committee found that, despite the creative industries’ economic contribution—about which many noble Lords have talked—many skilled and successful professional creators are struggling to make a living from their work. The committee recommended that
“the Government work with the UK’s creative industries to introduce a statutory private copying scheme”.
This has a respectable provenance and is very much wanted by the collecting societies ALCS, BECS, Directors UK and DACS. Their letter said that the scheme could generate £250 million to £300 million a year for creatives, at no cost to the Government or to the taxpayer. What is not to like? They say that similar schemes are already in place in 45 countries globally, including most of Europe, and many of them include an additional contribution to public cultural funding. That could be totally game-changing. I very much hope that there is a fair wind behind this proposal.
My Lords, I thank the noble Lord, Lord Bassam of Brighton, for laying this amendment and introducing the debate on it.
As I understand it, a private copying levy is a surcharge on the price of digital content. The idea is that the money raised from the surcharge is either redistributed directly to rights holders to compensate them for any loss suffered because of copies made under the private copying exceptions or contributed straight to other cultural events. I recognise what the noble Lord is seeking to achieve and very much support his intent.
I have two concerns. First—it may be that I have misunderstood it; if so, I would be grateful if the noble Lord would set me straight—it sounds very much like a new tax of some kind is being raised, albeit a very small one. Secondly, those who legitimately pay for digital content end up paying twice. Does this not incentivise more illegal copying?
We all agree how vital it is for those who create products of the mind to be fairly rewarded and incentivised for doing so. We are all concerned by the erosion of copyright or IP caused by both a global internet and increasingly sophisticated AI. Perhaps I could modestly refer the noble Lord to my Amendment 75 on digital watermarking, which I suggest may be a more proportionate means of achieving the same end or at least paving the way towards it. For now, we are unable to support Amendment 57 as drafted.
I thank my noble friend Lord Bassam for his Amendment 57 on the subject of private copying levies. It reinforces a point we discussed earlier about copying being covered by copyright.
The smart fund campaign seeks the introduction of a private copy levy. Such a levy would aim to indirectly compensate copyright owners for the unauthorised private copying of their works—for example, when a person takes a photo of an artwork or makes a copy of a CD—by paying copyright owners when devices capable of making private copies are sold.
Noble Lords may be aware that, in April 2024, the Culture, Media and Sport Committee recommended that the Government introduce a private copying levy similar to that proposed by this amendment. The Government’s response to that recommendation, published on 1 November, committed the Intellectual Property Office to meet with representatives from the creative industries to discuss how to strengthen the evidence base on this issue. That process is under way. I know that a meeting with the smart fund group is planned for next week, and I can confirm that DCMS is included and invited. I know that the IPO would be glad to meet my noble friend, as well as the noble Lord, Lord Freyberg, and the noble Earl, Lord Clancarty, to discuss this further. I also absolutely assure him that Chris Bryant is aware of this important issue and will be following this.
I am sure my noble friend will agree that it is essential that we properly engage and consider the case for intervention before legislating. Therefore, I hope he will be content to withdraw his amendment, to allow the Government the opportunity to properly explore these issues with creative and tech industry stakeholders.
My Lords, all our speakers have made it clear that this is a here-and-now issue. The context has been set out by noble Lords, whether it is Stargate, the AI Opportunities Action Plan or, indeed, the Palantir contract with the NHS. This has been coming down the track for some years. There are Members on the Government Benches, such as the noble Lords, Lord Mitchell and Lord Hunt of Kings Heath, who have been telling us that we need to work out a fair way of deriving a proper financial return for the benefits of public data assets, and Future Care Capital has done likewise. The noble Lord, Lord Freyberg, has form in this area as well.
The Government’s plan for the national data library and the concept of sovereign data assets raises crucial questions about how to balance the potential benefits of data sharing with the need to protect individual rights, maintain public trust and make sure that we achieve proper value for our public digital assets. I know that the Minister has a particular interest in this area, and I hope he will carry forward the work, even if this amendment does not go through.
I thank the noble Baroness, Lady Kidron, for moving her amendment. The amendments in this group seek to establish a new status for data held in the public interest, and to establish statutory oversight rules for a national data library. I was pleased during Committee to hear confirmation from the noble Baroness, Lady Jones of Whitchurch, that the Government are actively developing their policy on data held in the public interest and developing plans to use our data assets in a trustworthy and ethical way.
We of course agree that we need to get this policy right, and I understand the Government’s desire to continue their policy development. Given that this is an ongoing process, it would be helpful if the Government could give the House an indication of timescales. Can the Minister say when the Government will be in a position to update the House on any plans to introduce a new approach to data held in the public interest? Will the Government bring a statement to this House when plans for a national data library proceed to the next stage?
I suggest that a great deal of public concern about nationally held datasets is a result of uncertainty. The Minister was kind enough to arrange a briefing from his officials yesterday, and this emerged very strongly. There is a great deal of uncertainty about what is being proposed. What are the mechanics? What are the risks? What are the costs? What are the eventual benefits to UK plc? I urge the Minister, as and when he makes such a statement, to bring a maximum of clarity to these fundamental questions, because I suspect that many members of the public will find this deeply reassuring.
Given the stage the Government are at with these plans, we do not think it would be appropriate to legislate at this stage, but we of course reserve the right to revisit this issue in the future.
I am grateful to the noble Baroness, Lady Kidron, and the noble Lord, Lord Tarassenko, for Amendments 58 and 71, one of which we also considered in Committee. I suspect that we are about to enter an area of broad agreement here. This is a very active policy area, and noble Lords are of course asking exactly the right questions of us. They are right to emphasise the need for speed.
I agree that it is essential that we ensure that legal and policy frameworks are fit for purpose for the modern demands and uses of data. This Government have been clear that they want to maximise the societal benefits from public sector data assets. I said in the House very recently that we need to ensure good data collection, high-quality curation and security, interoperability and ways of valuing data that secure appropriate value returns to the public sector.
On Amendment 58, my officials are considering how we approach the increased demand and opportunity of data, not just public sector data but data across our economy. This is so that we can benefit from the productivity and growth gains of improvements to access to data, and harness the opportunities, which are often greater when different datasets are combined. As part of this, we sought public views on this area as part of the industrial strategy consultation last year. We are examining our current approach to data licensing, data valuation and the legal framework that governs data sharing in the public sector.
Given the complexity, we need to do this in a considered manner, but we of course need to move quickly. Crucially, we must not betray the trust of people or the trust of those responsible for managing and safeguarding these precious data assets. From my time as chair of the Natural History Museum, I am aware that museums and galleries are considering approaches to this very carefully. The noble Lord, Lord Lucas, may well be interested to see some of the work going on there on biodiversity datasets, where there are huge collections of great value against which we did in fact put a valuation.
Of course, this issue cuts across the public sector, including colleagues from the Geospatial Commission, NHS, DHSC, National Archives, Department for Education, Ordnance Survey and Met Office, for example. My officials and I are very open to discussing the policy issues with noble Lords. I recently introduced the noble Lord, Lord Tarassenko, to officials from NHSE dealing with the data side of things there and linked him with the national data library to seek his input. As was referred to, yesterday, the noble Baroness, Lady Kidron, the noble Lords, Lord Clement-Jones, Lord Tarassenko and Lord Stevenson, and the noble Viscount, Lord Camrose, all met officials, and we remain open to continuing such in-depth conversations. I hope the noble Baroness appreciates that this is an area with active policy development and a key priority for the Government.
Turning to Amendment 71, also from the noble Baroness, I agree that the national data library represents an enormous opportunity for the United Kingdom to unlock the full value of our public data. I agree that the protection and care of our national data is essential. The scope of the national data library is not yet finalised, so it is not possible to confirm whether a new statutory body or specific statutory functions are the right way to do this. Our approach to the national data library will be guided by the principles of public law and the requirements of the UK’s data protection legislation, including the data protection principles and data subject rights. This will ensure that data sharing is fair, secure and preserves privacy. It will also ensure that we have clear mechanisms for both valuation and value capture. We have already sought, and continue to seek, advice from experts on these issues, including work from the independent Prime Minister’s Council for Science and Technology. The noble Lord, Lord Freyberg, also referred to the work that I was involved with previously at the Tony Blair Institute.
The NDL is still in the early stages of development. Establishing it on a statutory footing at this point would be inappropriate, as work on its design is currently under way. We will engage and consult with a broad range of stakeholders on the national data library in due course, including Members of both Houses.
The Government recognise that our data and its underpinning infrastructure is a strategic national asset. Indeed, it is for that reason that we started by designating the data centres as critical national infrastructure. As the subjects of these amendments remain an active area of policy development, I ask the noble Baroness to withdraw her amendment.
My Lords, we have had some discussion already this week on data centres. The noble Lord, Lord Holmes, is absolutely right to raise this broad issue, but I was reassured to hear from the noble Lord, Lord Hunt of Kings Heath, earlier in the week that the building of data centres and their energy requirements may well be included in NESO's strategic spatial energy plan and the centralised strategic network plan. Clearly, in one part of the forest there is a great deal of discussion about energy use and the energy needs of data centres. What is less clear, and in a sense reflected in the opportunities plan, is exactly how the Government will decide the location of these data centres, which clearly, at least on current thinking about the needs of large language models, AI and so on, will be needed. It is about where they will be and how that will be decided. If the Minister can cast any light on that, we would all be grateful.
I thank my noble friend Lord Holmes of Richmond for moving this amendment. Amendment 59 is an important amendment that addresses some of the key issues relating to large language models. We know that large language models have huge potential, and I agree with him that the Government should keep this under review. Perhaps the noble Baroness, Lady Jones of Whitchurch, would be willing to update the House on the Government’s policy on large language model regulation on her return.
Data centre availability is another emerging issue as we see growth in this sector. My noble friend is absolutely right to bring this to the attention of the House. We firmly agree that we will have a growing need for additional data centres. In Committee, the noble Baroness, Lady Jones, did not respond substantively to Amendments 60 and 66 from my noble friend on data centres, which I believe was—not wholly unreasonably—to speed the Committee to its conclusion just before Christmas. I hope the Minister can give the House a fuller response on this today, as it would be very helpful to hear what the Government’s plans are on the need for additional data centres.
My Lords, it is clear that Amendment 67 in the name of the noble Lord, Lord Lucas, is very much of a piece with the amendments that were debated and passed last week. On these Benches, our approach will be exactly the same. Indeed, we can rely on what the Minister said last week, when he gave a considerable assurance:
“I can be absolutely clear that we must have a single version of the truth on this. There needs to be a way to verify it consistently and there need to be rules. That is why the ongoing work is so important”.—[Official Report, 21/1/25; col. 1620.]
That is, the work of the Central Digital and Data Office. We are content to rely on his assurance.
I thank my noble friend Lord Lucas for bringing his Amendment 67, which builds on his previous work to ensure accuracy of data. On these Benches, we agree wholeheartedly with him that the information we have access to—for example, to verify documents—must be accurate. His amendment would allow the Secretary of State to make regulations establishing definitions under the Bill for the purposes of digital verification services, registers of births and deaths, and other provisions. Crucially, this would enable the Government to put measures in place to ensure the consistency of the definitions of key personal attributes, including sex. We agree that consistency and accuracy of data is vital. We supported him on the first day at Report, and, if he pushes his amendment to a Division, we will support him today.
My Lords, I too am lost in admiration for the noble Baroness, Lady Kidron—still firing on all cylinders at this time of night. Current law is clearly out of touch with the reality of computer systems. It assumes an untruth about computer reliability that has led to significant injustice. We know that that assumption has contributed to miscarriages of justice, such as the Horizon scandal.
Unlike the amendment in Committee, Amendment 68 does not address the reliability of computers themselves but focuses rather on the computer evidence presented in court. That is a crucial distinction as it seeks to establish a framework for evaluating the validity of the evidence presented, rather than questioning the inherent reliability of computers. We believe that the amendment would be a crucial step towards ensuring fairness and accuracy in legal proceedings by enabling courts to evaluate computer evidence effectively. It offers a balanced approach that would protect the interests of both the prosecution and the defence, ensuring that justice is served. The Government really must move on this.
I thank the noble Baroness, Lady Kidron, for her amendments. The reliability of computer-based evidence, needless to say, has come into powerful public focus following the Post Office Horizon scandal and the postmasters’ subsequent fight for justice. As the noble Baroness has said previously and indeed tonight, this goes far beyond the Horizon scandal. We accept that there is an issue with the way in which the presumption that computer evidence is reliable is applied in legal proceedings.
The Government accepted in Committee that this is an issue. While we have concerns about the way that the noble Baroness’s amendment is drafted, we hope the Minister will take the opportunity today to set out clearly the work that the Government are doing in this area. In particular, we welcome the Government’s recently opened call for evidence, and we hope Ministers will work quickly to address this issue.
Amendment 68 from the noble Baroness, Lady Kidron, aims to prevent future miscarriages of justice, such as the appalling Horizon scandal. I thank the noble Baroness and, of course, the noble Lord, Lord Arbuthnot, for the commitment to ensuring that this important issue is debated. The Government absolutely recognise that the law in this area needs to be reviewed. Noble Lords will of course be aware that any changes to the legal position would have significant ramifications for the whole justice system and are well beyond the scope of this Bill.
I am glad to be able to update the noble Baroness on this topic since Committee. On 21 January the Ministry of Justice launched a call for evidence on this subject. That will close on 15 April, and next steps will be set out immediately afterwards. That will ensure that any changes to the law are informed by expert evidence. I take the point that there is a lot of evidence already available, but input is also needed to address the concerns of the Serious Fraud Office and the Crown Prosecution Service, and I am sure they will consider the important issues raised in this amendment.
I hope the noble Baroness appreciates the steps that the Ministry of Justice has taken on this issue. The MoJ will certainly be willing to meet any noble Lords who wish to do so. As such, I hope she feels content to withdraw the amendment.
My Lords, I move Amendment 73 standing in my name, which would require the Secretary of State to undertake a risk assessment on the data privacy risks associated with genomics and DNA companies that are headquartered in countries which the Government determine to be systemic competitors and hostile actors. The UK is a world leader in genomics research, and this is a growing sector that makes an important contribution. The opportunities in genomics are enormous and we should take the steps needed to protect the UK's leading role here.
I was pleased to hear from the noble Baroness, Lady Jones of Whitchurch, in Committee that:
“the Government have continued the important work of the UK Biological Security Strategy of 2023, including by conducting a full risk assessment and providing updated guidance to reduce the risks from the misuse of sensitive data”.
The Minister also gave the undertaking that the Government would
“brief the Joint Committee on the National Security Strategy on the findings of the risk assessment in the new year”.—[Official Report, 18/12/24; col. GC 124.]
I would be very grateful if the Minister could confirm whether the Joint Committee has been briefed and, if not, when that will happen.
I look forward to continuing to engage with Ministers on the issue of data security in the face of growing threats from international competitors and hostile actors.
I thank the noble Viscount, Lord Camrose, for giving me an opportunity to speak for 45 minutes on genomics, which I know everyone will be very grateful for. I shall resist that temptation and thank him for the amendment on security in genomic data.
As he is aware, the UK is a world leader in genomics, and its various datasets and studies have contributed to health globally. I also note that the UK Biological Security Strategy of 2023 has been endorsed by this Government and a variety of measures are under active consideration. I recognise the noble Viscount's desire for quick movement on the issue and agree with him that this is of great importance. I reassure him that my officials are working at speed across government on this very issue. I would be very happy to brief him and other noble Lords present today on the findings of the risk assessment in due course. We have not yet engaged with the Joint Committee on the National Security Strategy but will do so shortly, as per standard practice.
I hope that the noble Viscount will appreciate that this work is live and will grant a little patience on this issue. I look forward to engaging with him soon on this but, in the meantime, I would be grateful if he would withdraw his amendment.
I thank the Minister for his clear response and for taking pity on the House and not giving us the full benefit of his knowledge of genomics. Meanwhile, I recognise that we have to move with deliberateness here and not rush into the wrong solution. I gratefully accept his offer of further briefings and beg leave to withdraw my amendment.
My Lords, I have the very dubious privilege of moving the final amendment on Report to this Bill. This is a probing amendment and the question is: what does retrospectivity mean? The noble Lord, Lord Cameron of Lochiel, asked a question of the noble Baroness, Lady Jones, in Committee in December:
“Will the forthcoming changes to data protection law apply to such data that controllers and processors already hold?”
She replied that
“the new lawful ground of recognised legitimate interest will apply from the date of commencement and will not apply retrospectively”.—[Official Report, 10/12/24; cols. GC 435-437.]
But the question is not really whether the lawfulness is retrospective, but whether the changes made in the new law can be applied to any personal data previously collected and already held on the commencement date of the Act—so that is the exam question.
It is indeed getting late. I thank the noble Lord, Lord Clement-Jones, for moving his amendment, and I really will be brief.
We do not oppose the government amendment in the name of the noble Lord, Lord Vallance. I think the Minister should be able to address the concerns raised by the noble Lord, Lord Clement-Jones, given that the noble Lord’s amendment merely seeks clarification on the retrospective application of the provisions of the Bill within a month of the coming into force of the Act. It seems that the Government could make this change unnecessary by clarifying the position today. I hope the Minister will be able to address this in his remarks.
I will speak first to Amendment 76. I reassure noble Lords that the Government do not believe that this amendment has a material policy effect. Instead, it simply corrects the drafting of the Bill and ensures that an interpretation provision in Clause 66 commences on Royal Assent.
Amendment 74, in the name of the noble Lord, Lord Clement-Jones, would require the Secretary of State to publish a statement setting out whether any provisions in the Bill apply to controllers and processors retrospectively. Generally, provisions in Bills apply from the date of commencement unless there are strong policy or legal reasons for applying them retrospectively. The provisions in this Bill follow that general rule. For instance, data controllers will only be able to rely on the new lawful ground of recognised legitimate interests introduced by Clause 70 in respect of new processing activities in relation to personal data that take place after the date of commencement.
I recognise that noble Lords might have questions as to whether any of the Bill’s clauses can apply to personal data that is already held. That is the natural intent in some areas and, where appropriate, commencement regulations will provide further clarity. The Government intend to publish their plans for commencement on GOV.UK in due course and the ICO will also be updating its regulatory guidance in several key areas to help organisations prepare. We recognise that there can be complex lifecycles around the use of personal data and we will aim to ensure that how and when any new provisions can be relied on is made clear as part of the implementation process.
I hope that explanation goes some way to reassuring the noble Lord and that he will agree to withdraw his amendment.
(3 weeks, 3 days ago)
Lords ChamberMy Lords, I thank my noble friend Lord Holmes of Richmond for moving this amendment. I am sure we can all agree that the ICO should encourage and accommodate innovation. As I noted during the first day on Report, in a world where trade and business are ever more reliant on cross-border data transfers, data adequacy becomes ever more important.
In Committee, the noble Baroness, Lady Jones of Whitchurch, was able to give the House the reassurance that this Bill was designed with EU adequacy in mind. We were pleased to hear that the Government’s course of action is not expected to put this at risk. I also suggest that this Bill represents even less of a departure from GDPR than did its predecessor, the DPDI Bill.
We welcome the Government’s assurances, but we look to them to address the issues raised by my noble friend Lord Holmes. I think we can all agree that he has engaged constructively and thoughtfully on this Bill throughout.
I thank the noble Lord, Lord Holmes, for his Amendment 38 relating to the ICO’s innovation duty. I agree with his comments about the quality of our regulators.
I reiterate the statements made throughout the Bill debates that the Government are committed to the ongoing independence of the ICO as a regulator and have designed the proposals in the Bill with retaining EU adequacy in mind. The commissioner’s status as an independent supervisory authority for data protection is assured. The Information Commissioner has discretion over the application of his new duties. It will be for him to set out and justify his activities in relation to those duties to Parliament.
To answer the specific point, as well as that raised by the noble Lord, Lord Clement-Jones, considerations of innovation will not come at the expense of the commissioner’s primary objective to secure an appropriate level of protection for personal data. I hope that reassures the noble Lord.
My Lords, it is a pleasure to follow the noble Baroness, Lady Harding. I have added a few further words to my speech in response, because she made an extremely good point. I pay tribute to the noble Baroness, Lady Kidron, and her tenacity in trying to make sure that we secure a code for children’s data and education, which is so needed. The education sector presents unique challenges for protecting children’s data.
Like the noble Baronesses, Lady Kidron and Lady Harding, I look forward to what the Minister has to say. I hope that whatever is agreed is explicit; I entirely agree with the noble Baroness, Lady Harding. I had my own conversation with the Minister about Ofcom’s approach to categorisation which, quite frankly, does not follow what we thought the Online Safety Act was going to imply. It is really important that we absolutely tie down what the Minister has to say.
The education sector is a complex environment. The existing regulatory environment does not adequately address the unique challenges posed by edtech, as we call it, and the increasing use of children’s data in education. I very much echo what the noble Baroness, Lady Kidron, said: children attend school for education, not to be exploited for data mining. Like her, I cross over into considering the issues related to the AI and IP consultation.
The worst-case scenario is using an opt-in system that might incentivise learners or parents to consent, whether that is to state educational institutions such as Pearson, exam boards or any other entity. I hope that, in the other part of the forest, so to speak, that will not take place to the detriment of children. In the meantime, I very much look forward to what the Minister has to say on Amendment 44.
My Lords, I thank the noble Baroness, Lady Kidron, for moving her amendment. Before I begin, let me declare my interest as a recently appointed director of Lumi, an edtech provider—but for graduates, not for schools.
AI has the potential to revolutionise educational tools, helping teachers spend less time on marking and more time on face-to-face teaching with children, creating more innovative teaching tools and exercises and facilitating more detailed feedback for students. AI presents a real opportunity to improve education outcomes for children, opening more opportunities throughout their lives. There are deeply compelling promises in edtech.
However—there is always a however when we talk about edtech—creating and using AI education tools will require the collection and processing of children’s personal data. This potentially includes special category data—for instance, medical information pertaining to special educational needs such as dyslexia. Therefore, care must be taken in regulating how this data is collected, stored, processed and used. Without this, AI poses a major safeguarding risk. We share the concerns of the noble Baroness, Lady Kidron, and wholeheartedly support the spirit of her amendment.
We agree that it is prudent to require the ICO to make a code of practice on children’s data and education, and I particularly welcome a requirement on the ICO to consult with and involve parents. Parents know their children best, needless to say, and have their best interests at heart; their input will be critical in building trust in AI-assisted educational tools and facilitating their rollout and benefits for children throughout the UK.
However, as I said earlier on Report—and I shall not repeat the arguments now—we have concerns about the incorporation of international law into our law, and specifically, in this instance, the UN Convention on the Rights of the Child. We cannot therefore support the amendment as drafted. That said, we hope very much that the Government will listen carefully to the arguments raised here and take steps to introduce appropriate safeguards for children and young people in our data legislation regime. I suspect that most parents will greatly welcome more reassurance about the use of their children’s data.
I thank the noble Baroness, Lady Kidron, for raising this important topic today, and thank noble Lords for the impassioned speeches that we have heard. As my noble friend Lady Jones mentioned in Committee, the ICO has been auditing the practices of several edtech service providers and is due to publish its findings later this year. I am pleased to be able to give the noble Baroness, Lady Kidron, a firm commitment today that the Government will use powers under the Data Protection Act 2018 to require the ICO to publish a new code of practice addressing edtech issues.
The noble Baronesses, Lady Kidron and Lady Harding, both raised important points about specificity, and I will try to address some of those points. I am grateful to the noble Baroness for her suggestions about what the code should include. We agree that the starting point for the new code should be that children merit special protection in relation to their personal data because they may be less aware of the risks and their rights in relation to its processing. We agree that the code should include guidance for schools on how to comply with their controller duties in respect of edtech services, and guidance for edtech services on fulfilling their duties under the data protection framework—either as processors, controllers or joint controllers. We also agree that the code should provide practical guidance for organisations on how to comply with their so-called:
“Data protection by design and by default”
duties. This would help to ensure that appropriate technical and organisational measures are implemented in the development and operation of processing activities undertaken by edtech services.
The noble Baroness suggested that the new code should include requirements for the ICO to develop the code in consultation with children, parents, educators, children’s rights advocates, devolved Governments and industry. The commissioner must already consult trade associations, data subjects and persons who appear to the commissioner to represent the interests of data subjects before preparing a code, but these are very helpful suggestions. The development of any new code will also follow the new procedures introduced by Clause 92 of this Bill. The commissioner would be required to convene an expert panel to inform the development of the code and publish the draft code. Organisations and individuals affected by the code would be represented on the panel, and the commissioner would be required to consider its recommendations before publishing the code.
Beyond this, we do not want to pre-determine the outcome of the ICO’s audits by setting out the scope of the code on the face of the Bill now. The audits might uncover new areas where guidance is needed. Ensuring a clear scope for a code, grounded in evidence, will be important. We believe that allowing the ICO to complete its audits, so that the findings can inform the breadth and focus of the code, is appropriate.
The ICO will also need to carefully consider how its codes interrelate. For example, the noble Baroness suggested that the edtech code should cover edtech services that are used independently by children at home and the use of profiling to make predictions about a child’s attainment. Such processing activities may also fall within the scope of the age-appropriate design code and the proposed AI code, respectively. We need to give the ICO the flexibility to prepare guidance for organisations in a way that avoids duplication. Fully understanding the problems uncovered by the ICO audits will be essential to getting the scope and content of each code right and reducing the risk of unintended consequences.
To complement any recommendations that come from the ICO and its audits, the Department for Education will continue to work with educators and parents to help them to make informed choices about the products and services that they choose to support teaching and learning. The noble Baroness’s suggestion that there should be a certification scheme for approved edtech service providers is an interesting one that we will discuss with colleagues in the Department for Education. However, there might be other solutions that could help schools to make safe procurement decisions, and it would not be appropriate to use the ICO code to mandate a specific approach.
The point about schools and the use of work by children is clearly important; our measures are intended to increase the protections for children, not to reduce them. The Government will continue to work closely with noble Lords, the Department for Education, the ICO and the devolved regions as we develop the necessary regulations following the conclusion of the ICO audit. I hope that the noble Baroness is pleased with this commitment and as such feels content to withdraw her amendment.
My Lords, I can be pretty brief. We have had some fantastic speeches, started by the noble Baroness, Lady Kidron, with her superb rallying cry for these amendments, which we 100% support on these Benches. As she said, there is cross-party support. We have heard support from all over the House and, as the noble and learned Baroness, Lady Butler-Sloss, has just said, there has not been a dissenting voice.
I have a long association with the creative industries and with AI policy and yield to no one in my enthusiasm for AI—but, as the noble Baroness said, it should not come at the expense of the creative industries. It should not just be for the benefit of DeepSeek or Silicon Valley. We are very clear where we stand on this.
I pay tribute to the Creative Rights in AI Coalition and its campaign, which has been so powerful in garnering support, and to all those in the creative industries and creators themselves who briefed noble Lords for this debate.
These amendments respond to deep concerns that AI companies are using copyright material without permission or compensation. With the new government consultation, I do not believe that their preferred option is a straw man for a text and data mining exemption, with an opt-out that we thought was settled under the previous Government. It starts from the false premise of legal uncertainty, as we have heard from a number of noble Lords. As the News Media Association has said, the Government’s consultation is based on a mistaken idea, promoted by tech lobbyists and echoed in the consultation, that there is a lack of clarity in existing copyright law. This is completely untrue. The use of copyrighted content without a licence by gen AI firms is theft on a mass scale and there is no objective case for a new text and data mining exception.
No effective opt-out system for the use of content by gen AI models has been proposed or implemented anywhere in the world, making the Government’s proposals entirely speculative. It is vital going forward that we ensure that AI companies cannot use copyrighted material without permission or compensation; that AI development does not exploit loopholes to bypass copyright laws; that AI developers disclose the sources of the data they use for training their models, allowing for accountability and addressing infringement; and that we reinforce the existing copyright framework, rather than creating new exceptions that disadvantage creators.
These amendments would provide a mechanism for copyright holders to contest the use of their work and ensure a route for payment. They seek to ensure that AI innovation does not come at the expense of the rights and livelihoods of creators. There is no market failure. We have a well-established licensing system as an alternative to the Government’s proposed opt-out scheme for AI developers using copyrighted works. A licensing system is the only sustainable solution that benefits both creative industries and the AI sector. We have some of the most effective collective rights organisations in the world. Licensing is their bread and butter. Merely because AI platforms are resisting claims does not mean that the law in the UK is uncertain.
Amending UK law to address the challenges posed by AI development, particularly in relation to copyright and transparency, is essential to protect the rights of creators, foster responsible innovation and ensure a sustainable future for the creative industries. This should apply regardless of the country in which the scraping of copyright material takes place, provided developers market their product in the UK, and regardless of where the training itself is conducted. It would also ensure that AI start-ups based in the UK are not put at a competitive disadvantage due to the ability of international firms to conduct training in a different jurisdiction.
As we have heard throughout this debate, it is clear that the options proposed by the Government have no proper economic assessment underpinning them, no technology for an opt-out underpinning them and no enforcement mechanism proposed. It baffles me why the Conservative Opposition is not supporting these amendments, and I very much hope that the voices we have heard on the Conservative Benches will make sure that these amendments pass with acclamation.
I thank the noble Baroness, Lady Kidron, for moving this incredibly important group and all those speakers who have made the arguments so clearly and powerfully. I pay tribute to the noble Baroness’s work on copyright and AI, which is so important for our arts and culture sector. As noble Lords have rightly said, our cultural industries make an enormous contribution to our country, not just in cultural terms but in economic ones, and we must ensure that our laws do not put that future at risk.
In the build-up to this debate I engaged with great pleasure with the noble Baroness, Lady Kidron, and on these Benches we are sympathetic to her arguments. Her Amendment 61 would require the Government to make regulations in this area. We accept the Government’s assurance that this is something they will seek to address, and I note the Minister’s confirmation that their consultation will form the basis of the Government’s approach to this issue. Given the importance of getting this right, our view is that the Government’s consultation is in mid-flight, and we have to allow it to do its work. Whatever view we take of the design and the timing of the consultation, it offers for now a way forward that will evidence some of the serious concerns expressed here. That said, we will take a great interest in the progress and outcomes of the consultation and will come back to this in future should the Government’s approach prove unsatisfactory.
Amendment 75 in my name also seeks to address the challenge that the growth in AI poses to our cultural industries. One of the key challenges in copyright and AI is enforceability. Copyright can be enforced only when we know it has been infringed. The size and the international distribution of AI training models render it extremely challenging to answer two fundamental questions today: first, was a given piece of content used in a training model; and secondly, if so, in what jurisdiction did that use take place? If we cannot answer these questions, enforcement can become extremely hard, so a necessary, if not sufficient, part of the solution will be a digital watermark—a means of putting some red dye in the water where copyrighted material is used to train AIs. It could also potentially provide an automated means for content creators to opt out, with a vastly more manageable administrative burden.
I thank the Minister for his constructive engagement on digital watermarking and look to him to give the House an assurance that the Government will bring forward a plan to develop a technological standard for a machine-readable digital watermark. I hope that, if and when he does so, he is able to indicate both a timeline and an intention to engage internationally. Subject to receiving such reassurances when he rises, I shall not move my amendment.
I congratulate the noble Baroness, Lady Kidron, on her excellent speech. I know that she feels very strongly about this topic and the creative industries, as do I, but I also recognise what she said about junior Ministers. I have heard the many noble Lords who have spoken, and I hope they will forgive me if I do not mention everyone by name.
It is vital that we get this right. We need to give creators better, easier and practical control over their rights, allow appropriate access to training material by AI firms and, most importantly, ensure there is real transparency in the system, something that is currently lacking. We need to do this so that we can guarantee the continued success of our creative industries and fully benefit from what AI will bring.
I want to make it clear, as others have, that these two sectors are not mutually exclusive; it is not a case of picking sides. Many in the creative industries are themselves users or developers of AI technology. We want to ensure that the benefits of this powerful new technology are shared, which was a point made by the noble Baroness, Lady Stowell, and her committee.
It is obvious that these are complex issues. We know that the current situation is unsatisfactory in practice for the creative industries and the AI sector. That is why we have launched a detailed consultation on what package of measures can be developed to benefit both the creative industries and the AI sector. This is a genuine consultation. Many people from a range of sectors are engaging with us to share their views and evidence. It is important, and indeed essential, that we fully consider all responses provided in the consultation before we act. Not to do so would be a disservice to all those who are providing important input and would narrow our chance to get the right solution.
I agree wholeheartedly with the noble Baroness and many other noble Lords, including the noble Lord, Lord Freyberg, on the importance of transparency about the creative content used to train AI. Transparency, about both inputs and outputs, is a key objective in the Government’s consultation on copyright and AI. This very ability to provide transparency is at the centre of what is required. The consultation also contains two other vital objectives alongside transparency: practical and clear control and reward for rights holders over the use of their work. This is quite the opposite of the notion of giving away their hard work or theft. It is about increasing their control and ensuring access to data for AI training.
The Government certainly agree with the spirit of the amendments on transparency and web crawlers and the aims they are trying to achieve—that creators should have more clarity over which web crawlers can access their works and be able to block them if they wish, and that they should be able to know what has been used and by whom and have mechanisms to be appropriately reimbursed. However, it would be premature to commit to very specific solutions at this stage of the consideration of the consultation.
We want to consider these issues more broadly than the amendments before us, which do not take into account the fact that web crawling is not the only way AI models are trained. We also want to ensure that any future measures are not disproportionate for small businesses and individuals. There is a risk that legislating in this way will not be flexible enough to keep pace with rapid developments in the AI sector or new web standards. A key purpose of our consultation is to ensure that we have the full benefit of views on how to approach these issues, so that any legislation will be future-proof and able to deliver concrete and sustainable benefits for the creators. The preferred option in the consultation is one proposal; this is a consultation to try to find the right answer and all the proposals will be considered on their merits.
The Government are also committed to ensuring that rights holders have real control over how their works are used. At the moment, many feel powerless over the use of their works by AI models. Our consultation considers technological and other means that can help to ensure that creators’ wishes are respected in practice. We want to work with industry to develop simple and reliable ways to do this that meet agreed standards, in reference to the point made by the noble Viscount, Lord Camrose.
Technical standards are an important part of this. There are technical standards that will be required to prevent web crawlers accessing certain datasets. Standards will be needed for control at the metadata level and for watermarking. I agree with the noble Viscount, Lord Camrose, that standards on the use of watermarks or metadata could have a number of benefits for those who wish to control or license the use of their content with AI. Standards on the use of web crawlers may also improve the ability of rights holders to prevent the use of their works against their wishes. We will actively support the development of new standards and the application of existing ones. We see this as a key part of what is needed. We do not intend to implement changes in this area until we are confident that they will work in practice and are easy to use.
I also want to stress that our data mining proposals relate only to content that has been lawfully made available, so they will not apply to pirated copies. Existing copyright law will continue to apply to the outputs of AI models, as it does today. People will not be able to use AI as a cover for copyright piracy. With improved transparency and control over inputs, we expect that the likelihood of models generating infringing output will be greatly reduced.
My Lords, Amendment 46 seeks a review of court jurisdiction. As I said in Committee, the current system’s complexity leads to confusion regarding where to bring data protection claims—tribunals or courts? This is exacerbated by contradictory legal precedents from different levels of the judiciary, and it creates barriers for individuals seeking to enforce their rights.
Transferring jurisdiction to tribunals would simplify the process and reduce costs for individuals, and it would align with the approach for statutory appeals against public bodies, which are typically handled by tribunals. In the Killock v Information Commissioner case, Mrs Justice Farbey explicitly called for a “comprehensive strategic review” of the appeal mechanisms for data protection rights. That is effectively what we seek to do with this amendment.
In Committee, the noble Baroness, Lady Jones, raised concerns about transferring jurisdiction and introducing a new appeals regime. She argued that the tribunals lacked the capacity to handle complex data protection cases, but tribunals are, in fact, better suited to handle such matters due to their expertise and lower costs for individuals. Additionally, the volume of applications under Section 166—“Orders to progress complaints”—suggests significant demand for tribunal resolution, despite its current limitations.
The noble Baroness, Lady Jones, also expressed concern about the potential for a new appeal right to encourage “vexatious challenges”, but introducing a tribunal appeal system similar to the Freedom of Information Act could actually help filter out unfounded claims. This is because the tribunal would have the authority to scrutinise cases and potentially dismiss those deemed frivolous.
The noble Baroness, Lady Jones, emphasised the existing judicial review process as a sufficient safeguard against errors by the Information Commissioner. However, judicial review is costly and complex, presenting a significant barrier for individuals. A tribunal system would offer a much more accessible and less expensive avenue for redress.
I very much hope that, in view of the fact that this is a rather different amendment—it calls for a review—the Government will look at this. It is certainly called for by the judiciary, and I very much hope that the Government will take this on board at this stage.
I thank the noble Lord, Lord Clement-Jones, for moving his amendment, which would require the Secretary of State to review the potential impact of transferring to tribunals the jurisdiction of courts that relate to all data protection provisions. As I argued in Committee, courts have a long-standing authority and expertise in resolving complex legal disputes, including data protection cases, and removing the jurisdiction of the courts could risk undermining the depth and breadth of legal oversight required in such critical areas.
That said, as the noble Baroness, Lady Jones of Whitchurch, said in Committee, we have a mixed system of jurisdiction for legal issues relating to data, and tribunals have an important role to play. So, although we agree with the intentions behind the amendment from the noble Lord, Lord Clement-Jones, we do not support the push to transfer all data protection provisions from the courts to tribunals, as we believe that there is still an important role for courts to play. Given the importance of the role of the courts in resolving complex cases, we do not feel that this review is necessary.
My Lords, before the noble Viscount sits down, I wonder whether he has actually read the amendment; it calls for a review, not for transfer. I think that his speech is a carryover from Committee.
I thank the noble Lord, Lord Clement-Jones, for Amendment 46. It would require a review of the impact of transferring all data protection-related cases to the relevant tribunals. Currently there is a mixture of jurisdictions for tribunals and courts for data protection cases, depending on the nature of the proceedings. This is on the basis that certain claims are deemed appropriate for tribunal, while others are appropriate for courts, where stricter rules of evidence and procedure apply—for example, in dealing with claims by data subjects against controllers for compensation due to breaches of data protection legislation. As such, the current system already provides clear and appropriate administrative and judicial redress routes for data subjects seeking to exercise their rights.
Tribunals are in many cases the appropriate venue for data protection proceedings, including appeals by controllers against enforcement action or applications by data subjects for an order that the ICO should progress a complaint. Claims by individuals against businesses or other organisations for damages arising from breach of data protection law fall under the jurisdiction of courts rather than tribunals. This is appropriate, given the likely disparity between the resources of the respective parties, because courts apply stricter rules of evidence and procedures than tribunals. While court proceedings can, of course, be more costly, successful parties can usually recover their costs, which would not always be the case in tribunals.
I hope that the noble Lord agrees that there is a rationale for these different routes and that a review to consider transfer of jurisdictions to tribunals is therefore not necessary at this time.
My Lords, I support Amendments 47 and 48, which I was delighted to see tabled by the noble Lords, Lord Holmes and Lord Arbuthnot. I have long argued for changes to the Computer Misuse Act. I pay tribute to the CyberUp campaign, which has been extremely persistent in advocating these changes.
The CMA was drafted some 35 years ago—an age ago in computer technology—when internet usage was much lower and cybersecurity practices much less developed. This makes the Act in its current form unfit for the modern digital landscape and inhibits security professionals from conducting legitimate research. I will not repeat the arguments made by the two noble Lords. I know that the Minister, because of his digital regulation review, is absolutely apprised of this issue, and if he were able to make a decision this evening, I think he would take them on board. I very much hope that he will express sympathy for the amendments, however he wishes to do so—whether by giving an undertaking to bring something back at Third Reading or by doing something in the Commons. Clearly, he knows what the problem is. This issue has been under consideration for a long time, in the bowels of the Home Office—what worse place is there to be?—so I very much hope that the Minister will extract the issue and deal with it as expeditiously as he can.
I thank my noble friend Lord Holmes for tabling the amendments in this group. I, too, believe these amendments would improve the Bill. The nature of computing and data processing has fundamentally changed since the Computer Misuse Act 1990. Third parties hold and process immense quantities of data, and the means of accessing and interacting with that data have become unrecognisably more sophisticated. Updating the definition of unauthorised computer access through Amendment 48 is a sensible reform, as this new definition takes into account that data controllers and processors now hold substantial quantities of personal data. These entities are responsible for the security of the data they hold, so their provisions on access become legally relevant, and this amendment reflects that.
When updating an offence, it is equally necessary to consider the legal defences, as my noble friend has rightly done in Amendment 47 by protecting individuals accessing information to detect or prevent a crime or whose actions are in the public interest. We on these Benches feel these amendments are wholly sensible. I urge the Minister to listen to the persuasive argument that my noble friend Lord Holmes has made and consider how we can deliver these improvements to our data legislation.
I am grateful to the noble Lord, Lord Holmes, for raising this topic through Amendments 47 and 48. I am very aware of this issue and understand the strength of feeling about reforming the Computer Misuse Act, as we have heard from the noble Lord, Lord Arbuthnot, and the noble Earl, Lord Erroll.
As the noble Lord, Lord Clement-Jones, rightly pointed out, when I was the Government Chief Scientific Adviser I conducted a review making recommendations on pro-innovation regulation of technologies and I made recommendations on the issues these amendments raise. These recommendations were accepted by the previous Government.
The Government are actively taking forward these recommendations as part of the Act’s ongoing review. These issues are, of course, complex and require careful consideration. The introduction of these specific amendments could unintentionally pose more risk to the UK’s cybersecurity, not least by inadvertently creating a loophole for cybercriminals to exploit to defend themselves against a prosecution.
Our engagement with stakeholders has revealed differing views, even among industry. While some industry partners highlight the noble Lord’s view that the Computer Misuse Act may prevent legitimate public interest activity, others have concerns about the unintended consequences. Law enforcement has considerable concerns that allowing unauthorised access to systems under the pretext of identifying vulnerabilities could be exploited by cybercriminals. Without robust safeguards and oversight, this amendment could significantly hinder investigations and place a burden on law enforcement partners to establish whether a person’s actions were in the public interest.
Further work is required to consider the safeguards that would need to accompany any introduction of statutory defences. The Government will continue to work with the cybersecurity industry, the National Cyber Security Centre and law enforcement agencies on this issue. The Home Office will provide an update in due course, once the proposals have been finalised—or, in the words of the noble Lord, Lord Clement-Jones, they will pop out of the bowels of the Home Office in due course. With these reassurances in mind, I hope the noble Lord will feel able to withdraw his amendments.
My Lords, I will speak to Amendment 48B. In our view, cookie paywalls create an unfair choice for users, essentially forcing them to pay for privacy. We tabled an amendment in Committee to ban cookie paywalls, but in the meantime, as the noble Baroness, Lady Jones, heralded at the time, the Information Commissioner’s Office has provided updated guidance on the “consent or pay” model for cookie compliance. It is now available for review. This guidance clarifies how organisations can offer users a choice between accepting personalised ads for free access or paying for an ad-free experience while ensuring compliance with data protection laws. It has confirmed that the “consent or pay” model is acceptable for UK publishers, provided certain conditions are met. Key requirements for valid consent under this model include: users must have a genuinely free choice; the alternative to consent—that is, payment—must be reasonably priced; and users must be fully informed about their options.
The guidance is, however, contradictory. On the one hand, it says that cookie paywalls
“can be compliant with data protection law”
and that providers must document their assessments of how it is compliant with data protection law. On the other, it says that, to be compliant with data protection law, cookie paywalls must allow users to choose freely without detriment. However, users who do not wish to pay the fee to access a website will be subject to detriment, because with a cookie paywall they must pay a fee if they wish to refuse consent. This is described as the “power imbalance”. It is also worth noting that this guidance does not constitute legal advice; it leaves significant latitude for legal interpretation and argument as to the compatibility of cookie paywalls with data protection law.
The core argument against “consent or pay” models is that they undermine the principle of freely given consent. The ICO guidance emphasises that organisations using these models must be able to demonstrate that users have a genuine choice and are not unfairly penalised for refusing to consent to data processing for personalised advertising. Yet in practice, given the power imbalance, on almost every occasion this is not possible. This amendment seeks to ensure that individuals maintain control over their personal data. By banning cookie paywalls, users can freely choose not to consent to cookies without having to pay a fee. I very much hope that the Government will reconsider the ICO’s guidance in particular, and consider banning cookie paywalls altogether.
My Lords, I thank my noble friend Lord Lucas for introducing this group. Amendments 48A and 50A, in his name, would ensure that regulated professionals, including financial services firms, are able to comply with current and future regulatory requirements. The example my noble friend has given—the FCA’s expectation that firms communicate effectively with consumers—is a good one. Clearly, we must avoid a circumstance where regulators expect businesses to take action that is not possible due to limiting legislation governing data use and access. My noble friend has made a forceful case and I hope the Government will be able to give the House appropriate assurance that businesses will not be put in this position as a result of this legislation.
Amendment 48B, in the name of the noble Lord, Lord Clement-Jones, seeks to ban cookie paywalls. I opposed a similar amendment when we debated it in Committee, as it actually seeks to curtail choice. Currently, users have three options: pay money and stay private, share personal data and read for free, or walk away. Faced with these options, for instance, I have sadly chosen to forgo my regular evening reading of the Daily Mail’s excellent sports pages, but I see no reason why that newspaper, or anyone else, should be compelled to provide anything for free. In fact, it has been very persuasively argued by Jaron Lanier, Shoshana Zuboff and many others that a great deal of damage has been caused by the fact that so much of the internet is apparently, but not actually, free, rather than by an open charging model. This approach finally reveals the exact cash value of the individuals’ data that websites are harvesting, and offers users choice. We do not agree with attempts to remove that choice.
My Lords, I will start with Amendments 48A and 50A in the name of the noble Lord, Lord Lucas. The Government are aware that some financial services firms have raised concerns that the direct marketing rules in the privacy and electronic communications regulations prevent them supporting consumers in some instances. I appreciate the importance of the support that financial services firms provide to their customers to help them make informed decisions on matters such as their financial investments. The Government and the FCA are working closely together to improve the support available to consumers.
In December, the FCA launched an initial consultation on a new type of support for consumers with their investments and pensions called “targeted support”. Through this consultation, the FCA will seek feedback on any interactions between the proposals and the direct marketing rules. As my noble friend Lady Jones explained in the debate in Grand Committee, firms can already provide service or regulatory communication messages to their customers without permission, provided these messages are neutral in tone, factual and do not include promotional content. Promotional content can be sent if a consumer consents to receiving direct marketing. Messages which are not directed to a particular individual, such as online adverts shown to everyone who views a website, are also not prevented by the rules. I hope this explanation, and the fact that there is ongoing work, provides some reassurance to the noble Lord, Lord Lucas, that the Government are actively looking into this issue, and that, as such, he is content to withdraw his amendment.
Amendment 48B from the noble Lord, Lord Clement-Jones, is aimed at banning cookie paywalls. These generally work by giving web users the option to pay for a cookie-free browsing experience. Many websites are funded by advertising, and some publishers think that people should pay for a viewing experience without personalised advertising. As he rightly pointed out, the ICO released updated guidance on how organisations can deploy “consent or pay” models while still ensuring that consent is “freely given”. The guidance is detailed and outlines important factors that organisations should consider in order to operate legally. We encourage businesses to read this guidance and respond accordingly.
I note the important points that the noble Lord makes, and the counterpoints made by the noble Viscount, Lord Camrose. The Government will continue to engage with businesses, the ICO and users on these models, and on the guidance, but we do not think there is currently a case for taking action to ban the practice. I therefore hope the noble Lord will not press his amendment.
I thank the noble Baroness, Lady Kidron, for introducing this group, and the noble Lord, Lord Clement-Jones, and the noble Earl, Lord Erroll, for their comments and contributions—particularly the salutary words of the noble Earl, Lord Erroll, on the role of the Executive here, which were very enlightening.
I agree with the noble Baroness, Lady Kidron, that Parliament should have the opportunity to scrutinise this secondary legislation. Online safety research is essential: as our lives become more and more digital, we must assess how it impacts us as people, and especially children, who are particularly vulnerable to online harms. This cannot be achieved unless researchers are able to access the unadulterated raw data. Therefore, I am sure that noble Lords—and our colleagues in the other place—would wish to scrutinise the legislation creating this access to ensure it is fit for purpose. This is why I support the spirit of Amendment 51.
Following on from this point, facilitating online harms research by making access requests enforceable under a pre-existing online safety regime, as per Amendment 52, certainly seems to me like a sensible measure. It would enable this vital research, as would Amendment 54, which removes the need to create a bespoke enforcement system for online safety research access.
Amendment 53 would also enable independent research into how online risks and harms impact different groups. This information would be extremely valuable to a broad range of stakeholders, including social media platforms, data controllers, schools, parents and parliamentarians. It would help us all identify groups who are at heightened risk of online harm, what type of harm they are at risk of, which measures have reduced this risk, which have exacerbated it and what we can all do to reduce this danger.
There are many people undertaking online safety research across the globe and we should look to help these researchers access data for the purposes of safety research, even if their location is outside the UK. Of course, adequate safeguards would need to be in place, which may be dictated to some extent by the location of the researcher. However, online safety research is a benefit for all of us and Amendment 55 would keep barriers to this research to a minimum.
I am sure we would all like to think that all data holders and processors would wish to assist with prevention of online harms. However, where commercial and moral imperatives compete, we sadly cannot always count on the latter winning out. Therefore, Amendment 56 is a sensible addition that would prevent contractual exclusion of research access on online safety grounds, ensuring that online safety risks cannot be hidden or obscured.
I thank the noble Baroness, Lady Kidron, for the amendments on researchers’ access to data for online safety research, an incredibly important topic. It is clear from Committee that the Government’s proposals in this clause are broadly welcomed. They will ensure that researchers can access the vital data they need to undertake an analysis of online safety risks to UK users, informing future online safety interventions and keeping people safe online.
Amendment 51 would compel the Secretary of State to make regulations for a researcher access framework, and to do so within 12 months. While I am sympathetic to the spirit of the noble Baroness’s amendment, a fixed 12-month timescale and requirement to make regulations may risk compressing the time and options available to develop the most effective and appropriate solution, as my noble friend Lady Jones outlined in Committee. Getting this right is clearly important. While we are committed to introducing a framework as quickly as possible, we do not want to compromise its quality. We need adequate time to ensure that the framework is fit for purpose, appropriately safeguarded and future-proofed for a fast-evolving technological environment.
As required by the Online Safety Act, Ofcom is currently preparing a report into the ways in which researchers can access data and the barriers that they face, as well as exploring how additional access might be achieved. This report will be published in July of this year. We are also committed to conducting a thorough consultation on the issue prior to any enforceable requirements coming into force. The Government intend to consult on the framework as soon as practicable after the publication of Ofcom’s report this summer.
Sufficient time is required for a thorough consultation with the wide range of interested stakeholders in this area, including the research community, civil society and industry. I know that the noble Baroness raised a concern in Committee that the Government would rely on Ofcom’s report to set the framework for the regime, but I can assure her that a robust evidence-gathering process is already under way. The framework will be informed by collaboration with key stakeholders and formal consultation, as well as being guided by evidence from Ofcom’s report on the matter. Once all interested parties have had their say and the consultation is completed, the Government expect to make regulations to install the framework. It is right that the Government commit to a full consultation process and do not seek to prejudge the outcomes of that process by including a mandatory requirement for regulations now.
Amendment 53 would seek to expand the list of examples of the types of provision that the regulations might make. Clause 123 gives non-exhaustive examples of what may be included in future regulations; it certainly does not limit those regulations to the examples given. Given the central importance of protecting children and vulnerable users online, a key aim of any future regulations would be to support researchers to conduct research into the different ways that various groups of people experience online safety, without the need for this amendment. Indeed, a significant driving force for establishing this framework in the first place is to improve the quality of the research that can be done to understand the risks to users online, particularly those faced by children. I acknowledge the point that the noble Baroness made about people of all ages. We would be keen to discuss this further with her as we consult on specific requirements as part of developing regulations.
I will touch on the point about legal privilege. We believe that routinely copying a lawyer on to all emails and documents is not likely to attract legal privilege. Legal privilege protects communication specifically between legal advisers and their clients being created for the purpose of giving or receiving legal advice, or for the sole or dominant purpose of litigation. It would not be satisfactory just to copy everyone on everything.
We are confident that we can draft regulations that will make it entirely clear that the legal right to data for research purposes cannot be avoided by tech companies seeking to rely on contractual provisions that purport to prevent the sharing of data for research purposes. Therefore, there is no need for a specific requirement in the Bill to override terms of service.
(1 month ago)
Lords Chamber
My Lords, this is clearly box-office material, as ever.
I support Amendment 1 tabled by the noble Baroness, Lady Kidron, on inferred data. Like her, I regret that we do not have this Bill flying in tandem with an AI Bill. As she said, data and AI go together, and we need to see the two together in context. However, inferred data has its own dangers: inaccuracy and what are called junk inferences; discrimination and unfair treatment; invasions of privacy; a lack of transparency; security risks; predatory targeting; and a loss of anonymity. These dangers highlight the need for strong data privacy protection for consumers in smart data schemes and more transparent data collection practices.
Noble Lords will remember that Cambridge Analytica dealt extensively with inferred data. That company used various data sources to create detailed psychological profiles of individuals going far beyond the information that users explicitly provided. I will not go into the complete history, but, frankly, we do not want to repeat that. Without safeguards, the development of AI technologies could lead to a lack of public trust, as the noble Baroness said, and indeed to a backlash against the use of AI, which could hinder the Government’s ambitions to make the UK an AI superpower. I do not like that kind of boosterish language—some of the Government’s statements perhaps could have been written by Boris Johnson—nevertheless the ambition to put the UK on the AI map, and to keep it there, is a worthy one. This kind of safeguard is therefore extremely important in that context.
I start by thanking the noble Baroness, Lady Kidron, for introducing this group. I will speak particularly to the amendment in my name but before I do so, I want to say how much I agree with the noble Baroness and with the noble Lord, Lord Clement-Jones, that it is a matter of regret that we are not simultaneously looking at an AI Bill. I worry that this Bill has to take a lot of the weight that an AI Bill would otherwise take, but we will come to that in a great deal more detail in later groups.
I will address the two amendments in this group in reverse order. Amendment 5 in my name and that of my noble friend Lord Markham would remove Clause 13, which makes provision for the Secretary of State or the Treasury to give financial assistance to decision-makers and enforcers—that is, in essence, to act as a financial backstop. While I appreciate the necessity of guaranteeing the stability of enforcers who are public authorities and therefore branches of state, I am concerned that this has been extended to decision-makers. The Bill does not make the identity of a decision-maker clear. Therefore, I wonder who exactly we are protecting here. Unless those individuals or bodies or organisations can be clearly defined, how can we know whether we should extend financial assistance to them?
I raised these concerns in Committee and the Minister assured us at that time that smart data schemes should be self-financing through fees and levies as set out in Clauses 11 and 12 and that this provision is therefore a back-up plan. If that is indeed the case and we are assured of the self-funding nature of smart data schemes, then what exactly makes this necessary? Why must the statutory spending authority act as a backstop if we do not believe there is a risk it will be needed? If we do think there is such a risk, can the Minister elaborate on what it is?
I turn now to the amendment tabled by the noble Baroness, Lady Kidron, which would require data traders to supply customers with information that has been used by AI to build a profile on them. While transparency and explainability are hugely important, I worry that the mechanism proposed here will be too burdensome. The burden would grow linearly with the scale of the models used. Collating and supplying this information would, I fear, increase the cost of doing business for traders. Given AI’s potential to be an immense asset to business, helping generate billions of pounds for the UK economy—and, by the way, I rather approve of the boosterish tone and think we should strive for a great deal more growth in the economy—we should not seek to make its use more administratively burdensome for business. Furthermore, since the information is AI-generated, it is going to be a guess or an assumption or an inference. Therefore, should we require companies to disclose not just the input data but the intermediate and final outputs? Speaking as a consumer, I am not sure that I personally would welcome this. I look forward to hearing the Minister’s responses.
I thank the noble Baroness, Lady Kidron, and the noble Viscount, Lord Camrose, for their proposed amendments and continued interest in Part 1 of this Bill. I hope I can reassure the noble Baroness that the definition of customer data is purposefully broad. It encompasses information relating to a customer or a trader and the Government consider that this would indeed include inferred data. The specific data to be disclosed under a smart data scheme will be determined in the context of that scheme and I reassure the noble Baroness that there will be appropriate consultation before a smart data scheme is introduced.
I turn to Amendment 5. Clause 13 provides statutory authority for the Secretary of State or the Treasury to give financial assistance to decision-makers, enforcers and others for the purpose of meeting any expense in the exercise of their functions in the smart data schemes. Existing and trusted bodies such as sector regulators will likely be in the lead in the delivery of new schemes. These bodies will act as decision-makers and enforcers. It is intended that smart data schemes will be self-financing through the fees and levies provided for by Clauses 11 and 12. However, because of the nature of the bodies that are involved, it is deemed appropriate for there to be a statutory spending authority as a backstop provision if that is necessary. Any spending commitment of resources will, of course, be subject to the usual estimates process and to existing public sector spending controls and transparency requirements.
I hope that with this brief explanation of the types of bodies involved, and the other explanations, the noble Baroness will be content to withdraw Amendment 1 and that noble Lords will not press Amendment 5.
My Lords, the noble Baroness, Lady Kidron, is setting a cracking pace this afternoon, and I am delighted to support her amendments and speak to them. Citizens should have the clear right to assign their data to data communities or trusts, which act as intermediaries between those who hold data and those who wish to use it, and are designed to ensure that data is shared in a fair, safe and equitable manner.
A great range of bodies have explored and support data communities and data trusts. There is considerable pedigree behind the proposals that the noble Baroness has put forward today, starting with a recommendation of the Hall-Pesenti review. We then had the Royal Society and the British Academy talking about data stewardship; the Ada Lovelace Institute has explored legal mechanisms for data stewardship, including data trusts; the Open Data Institute has been actively researching and piloting data trusts in the real world; the Alan Turing Institute has co-hosted a workshop exploring data trusts; and the Royal Society of Arts has conducted citizens’ juries on AI explainability and explored the use of data trusts for community engagement and outreach.
There are many reasons why data communities are so important. They can help empower individuals, give them more control over their data and ensure that it is used responsibly; they can increase bargaining power, reduce transaction costs, address data law complexity and protect individual rights; they can promote innovation by facilitating data-sharing; and they can promote innovation in the development of new products and services. We need to ensure responsible operation and build trust in data communities. As proposed by Amendment 43 in particular, we should establish a register of data communities overseen by the ICO, along with a code of conduct and complaint mechanisms, as proposed by Amendment 42.
It is high time we move forward on this; we need positive steps. In the words of the noble Baroness, Lady Kidron, we do not just seek assurance that there is nothing to prevent these data communities; we need to take positive steps and install mechanisms to make sure that we can set them up and benefit from that.
I thank the noble Baroness, Lady Kidron, for leading on this group, and the noble Lord, Lord Clement-Jones, for his valuable comments on these important structures of data communities. Amendments 2, 3, 4 and 25 work in tandem and are designed to enable data communities, meaning associations of individuals who have come together and wish to designate a third party, to act on the group’s behalf in their data use.
There is no doubt that the concept of a data community is a powerful idea that can drive innovation and a great deal of value. I thank the noble Lord, Lord Clement-Jones, for cataloguing the many groups that have driven powerful thinking in this area, the value of which is very clear. However—and I keep coming back to this when we discuss this idea—what prevents this being done already? I realise that this may be a comparatively trivial example, but if I wanted to organise a community today to oppose a local development, could I not do so with an existing lawful basis for data processing? It is still not clear in what way these amendments would improve my ability to do so, or would reduce my administrative burden or the risks of data misuse.
I look forward to hearing more about this from the Minister today and, ideally, as the noble Baroness, Lady Kidron, said, in a briefing on the Government’s plan to drive this forward. However, I remain concerned that we do not necessarily need to drive forward this mechanism by passing new legislation. I look forward to the Minister’s comments.
Amendment 42 would require the Information Commissioner to draw up a code of practice setting out how data communities must operate and how data controllers and processors should engage with these communities. Amendment 43 would create a register of data communities and additional responsibilities for the data community controller. I appreciate the intent of the noble Baroness, Lady Kidron, in trying to ensure data security and transparency in the operation of data communities. If we on these Benches supported the idea of their creation in this Bill, we would surely have to implement mechanisms of the type proposed in these amendments. However, this observation confirms us in our view that the administration required to operate these communities is starting to look rather burdensome. We should be looking to encourage the use of data to generate economic growth and to make people’s lives easier. I am concerned that the regulation of data communities, were it to proceed as envisaged by these amendments, might risk doing just the opposite. That said, I will listen with interest to the response of noble Lords and the Minister.
My Lords, I rise to speak to Amendments 2, 3, 4, 25, 42 and 43. I thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, for these amendments on data communities, which were previously tabled in Committee, and for the new clauses linking these with the Bill’s clauses on smart data.
As my noble friend Lady Jones noted in Committee, the Government support giving individuals greater agency over their data. The Government are strongly supportive of a robust regime of data subject rights and believe strongly in the opportunity presented by data for innovation and economic growth. UK GDPR does not prevent data subjects authorising third parties to exercise certain rights on their behalf. Stakeholders have, however, said that there may be barriers to this in practice.
I reassure noble Lords that the Government are actively exploring how we can support data intermediaries while maintaining the highest data protection standards. It is our intention to publish a call for evidence in the coming weeks on the activities of data intermediaries and the exercise of data subject rights by third parties. This will enable us to ensure that the policy settings on this topic are right.
In the context of smart data specifically, Part 1 of the Bill does not limit who the regulations may allow customers to authorise. Bearing in mind the IT and security-related requirements inherent in smart data schemes, provisions on who a customer may authorise are best determined in the context of a specific scheme, when the regulations are made following appropriate consultation. I hope to provide some additional reassurance that exercise of the smart data powers is subject to data protection legislation and does not displace data rights under that legislation.
There will be appropriate consultation, including with the Information Commissioner’s Office, before smart data schemes are introduced. This year, the Department for Business and Trade will be publishing a strategy on future uses of these powers.
The smart data schemes and digital verification services are initial examples of government action to facilitate data portability and innovative uses of data. My noble friend Lady Jones previously offered a meeting between officials and the noble Baroness, Lady Kidron, to discuss these proposals, which I know my officials have arranged for next week, as the noble Baroness indicated earlier. I hope she is therefore content to withdraw her amendment.
My Lords, I very much support the amendments from the noble Lords, Lord Lucas and Lord Arbuthnot, particularly Amendment 6, about accuracy. It has become apparent—and Committee stage was interesting—that there is a challenge with having gender and sex as interchangeable. The problem becomes physical, because you cannot avoid the fact that you will react differently medically to certain things according to the sex you were born with and to your DNA.
That can be very dangerous in two cases. The first case is where drugs or cures are being administered by someone who thinks they are treating a patient of one sex but they are actually a different sex. That could kill someone, quite happily. The second case is if you are doing medical research and relying on something, but then find that half the research is invalid because a person is not actually that sex but has decided to choose another gender. Therefore, all the research on that person could be invalid. That could lead to cures being missed, other things being diagnosed as being all right, and a lot of dangers.
As a society, we have decided that it will be all right for people to change gender—let us say that, as I think it is probably the easiest way to describe it. I do not see any problem with that, but we need critical things to be kept on records that are clearly separate. Maybe we can make decisions in Parliament, or wherever, about what you are allowed to declare on identity documents such as a passport. We need to have two things: one is sex, which is immutable, and therefore can help with all the other things behind the scenes, including research and treatments; the other is gender, which can be what you wish to declare, and society accepts that you can declare yourself as being of another gender. I cannot see any way round that. I have had discussions with people about this, and as one who would have said that this is quite wrong and unnecessary, I was convinced by the end of those discussions that it was right. Keeping the two separate in our minds would solve a lot of problems. These two amendments are vital for that.
I agree in many ways with the points from the noble Lord, Lord Clement-Jones. Just allowing some of these changes to be made by the stroke of a pen—a bit like someone is doing across the Atlantic—without coming to Parliament is perhaps unwise sometimes. The combined wisdom of Parliament, looking at things from a different point of view, and possibly with a more societal point of view than the people who are trying to make systems work on a governmental basis, can be sensible and would avoid other mistakes being made. I certainly support his amendments, but I disagree entirely with his last statement where he did not support the noble Lords, Lord Lucas and Lord Arbuthnot.
I thank my noble friend Lord Lucas for introducing this group and for bringing these important and sometimes very difficult matters to the attention of the House. I will address the amendments slightly out of order, if I may.
For digital verification services to work, the information they have access to and use to verify documents must be accurate; this is, needless to say, critical to the success of the entire scheme. Therefore, it is highly sensible for Amendment 8 to require public authorities, when they disclose information via the information gateway, to ensure that it is accurate and reliable and that they can prove it. By the same measure, Amendment 6, which requires the Secretary of State to assess whether the public authorities listed are collecting accurate information, is equally sensible. These amendments as a pair will ensure the reliability of DVS services and encourage the industry to flourish.
I would like to consider the nature of accurate information, especially regarding an individual’s biological sex. It is possible for an individual to change their recorded sex on their driving licence or passport, for example, without going through the process of obtaining a gender recognition certificate. Indeed, a person can change the sex on their birth certificate if they obtain a GRC, but many would argue that changing some words on a document does not change the reality of a person’s genome, physical presentation and, in some cases, medical needs, meaning that the information recorded does not accurately relate to their sex. I urge the Minister to consider how best to navigate this situation, and to acknowledge that it is crucially important, as we have heard so persuasively from the noble Earl, Lord Erroll, and my noble friends Lord Arbuthnot and Lord Lucas, that a person’s sex is recorded accurately to facilitate a fully functioning DVS system.
The DVS trust framework has the potential to rapidly transform the way identities and information are verified. It should standardise digital verification services, ensure reliability and build trust in the concept of a digital verification service. It could seriously improve existing, cumbersome methods of verifying information, saving companies, employers, employees, landlords and tenants time and money. Personally, I have high hopes of its potential to revolutionise the practices of recruitment. I certainly do not know many people who would say no to less admin. If noble Lords are minded to test the opinion of the House, we will certainly support them with respect to Amendments 6 and 8.
With the greatest respect to the noble Lord, Lord Clement-Jones, I think it is a mistake to regard this as part of some culture war struggle. As I understand it, this is about accuracy of data and the importance, for medical and other reasons, of maintaining accurate data.
All the benefits of DVS cannot be to the detriment of data privacy and data minimisation. Parliament is well-practised at balancing multiple competing concepts and doing so with due regard to public opinion. Therefore, Amendment 7 is indeed a sensible idea.
Finally, Amendment 9 would require the Secretary of State to review whether an offence of false use of identity documents created or verified by a DVS provider is needed. This is certainly worth consideration. I have no doubt that the Secretary of State will require DVS providers to take care that their services are not being used with criminal intent, and I am quite sure that DVS service providers do not want to facilitate crimes. However, the history of technology is surely one of high-minded purposes corrupted by cynical practices. Therefore, it seems prudent for the Secretary of State to conduct a review into whether creating this offence is necessary and, if it is, the best way that it can be laid out in law. I look forward to hearing the Minister’s comments on this and other matters.
I thank the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Arbuthnot, for their amendments and interest in the important area of digital verification services. I thank the noble Viscount, Lord Camrose, for his support for this being such an important thing to make life easier for people.
I will go in reverse order and start with Amendment 9. I thank the noble Lord, Lord Clement-Jones, for reconsidering his stance since Committee on the outright creation of these offences. Amendment 9 would create an obligation for the Secretary of State to review the need for digital identity theft offences. We believe this would be unnecessary, as existing legislation—for example, the Fraud Act 2006, the Computer Misuse Act 1990 and the Data Protection Act 2018—already addresses the behaviour targeted by this amendment.
However, we note the concerns raised and confirm that the Government are taking steps to tackle the issue. First, the Action Fraud service, which allows individuals to report fraud enabled by identity theft, is being upgraded with improved reporting tools, increased intelligence flows to police forces and better support services for victims. Secondly, the Home Office is reviewing the training offered to police officers who have to respond to fraud incidents, and identifying the improvements needed.
My Lords, I thank the Minister for these two technical amendments. I take this opportunity to thank him also for responding to correspondence about LinesearchbeforeUdig and its wish to meet government and work with existing services to deliver what it describes as the safe digging elements of the NUAR. The Minister has confirmed that the heavy lifting on this—not heavy digging—will be carried out by the noble Baroness, Lady Jones, on her return, which I am sure she will look forward to. As I understand it, officials will meet LinesearchbeforeUdig this week, and they will look at the survey carried out by the service. We have made some progress since Committee, and I am grateful to the Minister for that.
My Lords, given that these are technical amendments, correcting wording errors, I have little to add to the remarks already made. We have no concerns about these amendments and will not seek to oppose the Government in making these changes.
My Lords, I will speak to Amendments 11 and 13 in my name and that of my noble friend Lord Markham. The national underground asset register contains the details of all underground assets and apparatus in England, Wales and Northern Ireland, or at any rate it will do as it goes forward. This includes water pipes, electricity cables, internet cables and fibres—details of the critical infrastructure necessary to sustain the UK as we know it.
Needless to say, there are many hostile actors who, if they got their hands on this information, would or could use it to commit appalling acts of terror. I am mindful of and grateful for the Government’s assurances given in Committee that it is and will be subject to rigorous security measures. However, the weakest link in cyber defence is often third-party suppliers and other partners who do not recognise the same level of risk. We should take every possible measure to ensure that the vital data in NUAR is kept safe and shared only with stakeholders who have the necessary security provisions in place.
For this reason, I have tabled Amendment 11, which would require the Secretary of State to provide guidance to relevant stakeholders on the cybersecurity measures which should be in place before they receive information from NUAR. I do not believe this would place a great burden on government departments, as appropriate cybersecurity standards already exist. The key is to ensure that they are duly observed.
I cannot overstate the importance of keeping this information secure, but I doubt noble Lords need much convincing on that score. Given how frighteningly high the stakes are, I strongly urge the most proactive possible approach to cybersecurity, advising stakeholders and taking every possible step to keep us all safe.
Amendment 13, also tabled in my name, requires the Registrar-General to make provisions to ensure the cybersecurity of the newly digitised registers of births, still-births, and deaths. There are a great many benefits in moving from a paper-based register of births and deaths to a digitised version. People no longer have to make the trip to sign the register in person, saving time and simplifying the necessary admin at very busy or very difficult points in people’s lives. It also reduces the number of physical documents that need to be maintained and kept secure. However, in digitising vast quantities of personal, valuable information, we are creating a larger attack surface which will appeal to malign actors looking to steal personal data.
I know we discussed this matter in Committee, when the noble Baroness the Minister made the point that this legislation is more about a digitisation drive, in that all records will now be digital rather than paper and digital. While I appreciate her summary, I am not sure it addresses my concerns about the security risks of shifting to a purely digital model. We present a large and tempting attack surface, and the absence of paper back-ups increases the value of digital information even more, as it is the only register. Of course, there are already security measures in place for the digital copies of these registers. I have no doubt we have back-ups and a range of other fallback opportunities. But the same argument applies.
Proactive cybersecurity provisions are required, taking into account the added value of these registers and the ever-evolving threat we face from cybercriminals. I will listen with great interest to the thoughts of other noble Lords and the Minister.
My Lords, I thank the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, for these amendments. Clause 56 forms part of NUAR provisions. The security of NUAR remains of the utmost importance. Because of this, the Government have closely involved a wide range of security stakeholders in the development of NUAR, including the National Protective Security Authority and security teams from the asset owners themselves. Providing clear acceptable-use and user policies for any digital service is important. As such, we intend to establish clear guidance on the appropriate usage of NUAR, including what conditions end users must fulfil before gaining access to the service. This may include cybersecurity arrangements, as well as personal vetting. However, we do not feel it appropriate to include this in the Bill.
Care must be taken when disclosing platform-specific cybersecurity information, as this could provide bad actors with greater information to enable them to counter these measures, ultimately making NUAR less secure. Furthermore, regulations made in relation to access to information from NUAR would be subject to the affirmative procedure. As such, there will be future opportunities for relevant committees to consider in full these access arrangements, including, on an individual basis, any security impacts. I therefore reassure noble Lords that these measures will ensure that access to NUAR data is subject to appropriate safeguards.
I thank the Minister for his considered reply. It is clear that the Government and the department are treating the issue of security with all due seriousness. However, I remain concerned, particularly about the move to NUAR as a highly tempting attack surface for malign actors. In light of this, I am minded to test the opinion of the House.
My Lords, we have heard some of the really consistent advocates for children’s online protection today. I must say that I had not realised that the opportunity of signing the amendments of the noble Baroness, Lady Kidron, was rather like getting hold of Taylor Swift tickets—clearly, there was massive competition and rightly so. I pay tribute not only to the speakers today but in particular to the noble Baroness for all her campaigning, particularly with 5Rights, on online child protection.
All these amendments are important for protecting children’s data, because they address concerns about data misuse and the need for heightened protection for children in the digital environment, with enhanced oversight and accountability in the processing of children’s data. I shall not say very much. If the noble Baroness pushes Amendment 20 to a vote, I want to make sure that we have time before the dinner hour to do so, which means going through the next group very quickly. I very much hope that we will get a satisfactory answer from the Minister. The sage advice from the noble Lord, Lord Stevenson, hit the button exactly.
Amendment 20 is particularly important in this context. It seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A. As the noble Baroness explains, that means that personal data originally collected from a child with consent for a specific purpose could not be reused for a different, incompatible purpose without obtaining fresh consent, even if the child is now an adult. In my view, that is core. I hope the Minister will come back in the way that has been requested by the noble Lord, Lord Stevenson, so we do not have to have a vote. However, we will support the noble Baroness if she wishes to test the opinion of the House.
My Lords, I too thank the noble Baroness, Lady Kidron, for all her amendments in this group, and I thank the Minister for his amendment.
Amendment 15 seeks to maintain the high level of legal protection for children’s data even where protections for adults may be eased in the context of scientific research. I acknowledge the concerns raised about the potential implications that this amendment could have for medical research and safeguarding work. It is important to recognise that young people aged 16 and over are entitled to control their medical information under existing legal frameworks, reflecting their ability to understand and consent in specific contexts.
There is a legitimate concern that by excluding all children categorically, including those aged 16 and 17, we risk impeding critical medical research that could benefit young people themselves. Research into safeguarding may also be impacted by such an amendment. Studies that aim to improve systems for identifying and preventing abuse or neglect rely on the careful processing of children’s data. If this amendment were to inadvertently create a barrier to such vital work, we could find ourselves undermining some of the protections that it seeks to reinforce.
That said, the amendment highlights an important issue: the need to ensure that ethical safeguards for children remain robust and proportionate. There is no question that the rights and welfare of children should remain paramount in research contexts, but we must find the right balance—one that allows valuable, ethically conducted research to continue without eroding the legal protections that exist for children’s data. So I welcome the intent of the amendment in seeking to protect children, of course, and I urge us, as the noble Lord, Lord Stevenson, put it, to continue working collaboratively to achieve a framework that upholds their rights without hindering progress in areas that ultimately serve their best interests.
As with the previous amendment, I recognise the intent of Amendment 16, which seeks to protect children’s data by excluding them from the scope of recognised legitimate interests. Ensuring that children continue to benefit from the highest level of legal protection is a goal that, needless to say, we all share. However, I remain concerned that this could have less desirable consequences too, particularly in cases requiring urgent safeguarding action. There are scenarios where swift and proportionate data processing is critical to protecting a child at risk, and it is vital that the framework that we establish does not inadvertently create barriers to such essential work.
I am absolutely in support of Amendment 20. It provides an important safeguard by ensuring that children’s data is not used for purposes beyond those for which it was originally collected, unless it is fully compatible with the original purpose. Children are particularly vulnerable when it comes to data processing and their understanding of consent is limited. The amendment would strengthen protection for children by preventing the use of their data in ways that were not made clear to them or their guardians at the time of collection. It would ensure that children’s data remained secure and was not exploited for unrelated purposes.
On Amendment 22, the overarching duty proposed in this new clause—to prioritise children’s best interests and ensure that their data is handled with due care and attention—aligns with the objective that we all share of safeguarding children in the digital age. We also agree with the principle that the protections afforded to children’s data should not be undermined or reduced, and that those protections should remain consistent with existing standards under the UK GDPR.
However, although we support the intent of the amendment, we have concerns about the reference to the UN Convention on the Rights of the Child and general comment 25. Although these international frameworks are important, we do not believe they should be explicitly tied into this legislation. Our preference would be for a redraft of this provision that focused more directly on UK law and principles, ensuring that the protections for children’s data were robust and tailored to our legal context, rather than linking it to international standards in a way that could create potential ambiguities.
(1 month ago)
Lords Chamber
My Lords, I thank the noble Lord, Lord Clement-Jones, for raising these significant issues. While I share some of the concerns expressed, I find myself unable—at least for the moment—to offer support for the amendments in their current form.
Amendment 17 seeks to remove the powers granted to the Secretary of State to override primary legislation and to modify aspects of UK data protection law via statutory instrument. I agree with the principle underpinning this amendment: that any changes to data protection law must be subject to appropriate scrutiny. It is essential that parliamentary oversight remains robust and meaningful, particularly when it comes to matters as sensitive and far-reaching as data protection.
However, my hesitation lies in the practical implications of the amendment. While I sympathise with the call for greater transparency, I would welcome more detail on how this oversight mechanism might work in practice. Would it involve enhanced scrutiny procedures or a stronger role for relevant parliamentary committees? I fear that, without this clarity, we risk creating uncertainty in an area that requires, above all, precision and confidence.
The Minister’s Amendment 18 inserts specific protections for children’s personal data into the UK GDPR framework. The Government have rightly emphasised the importance of safeguarding children in the digital age. I commend the intention behind the amendment and agree wholeheartedly that children deserve special protections when it comes to the processing of their personal data.
It is worth noting that this is a government amendment to their own Bill. While Governments amending their own legislation is not unprecedented—the previous Government may have indulged in the practice from time to time—it is a practice that can give rise to questions. I will leave my comments there; obviously it is not ideal, but these things happen.
Finally, Amendment 21, also tabled by the noble Lord, Lord Clement-Jones, mirrors Amendment 17 in seeking to curtail the Secretary of State’s powers to amend primary legislation via statutory instrument. My earlier comments on the importance of parliamentary oversight apply here. As with Amendment 17, I am of course supportive of the principle. The delegation of such significant powers to the Executive should not proceed without robust scrutiny. However, I would appreciate greater clarity on how this proposed mechanism would function in practice. As it stands, I fear that the amendment raises too many questions. If these concerns could be addressed, I would be most grateful.
In conclusion, these amendments raise important points about the balance of power between the Executive and Parliament, as well as the protection of vulnerable individuals in the digital sphere. I look forward to hearing more detail and clarity, so that we can move forward with confidence.
My Lords, government Amendment 18 is similar to government Amendment 40 in the previous group, which added an express reference to children meriting specific protection to the new ICO duty. This amendment will give further emphasis to the need for the Secretary of State to consider the fact that children merit specific protection when deciding whether to use powers to amend the list of recognised legitimate interests.
Turning to Amendment 17 from the noble Lord, Lord Clement-Jones, I understand the concerns that have been raised about the Secretary of State’s power to add or vary the list of recognised legitimate interests. This amendment seeks to remove the power from the Bill.
In response to some of the earlier comments, including from the committees, I want to make it clear that we have constrained these powers more tightly than they were in the previous data Bill. Before making any changes, the Secretary of State must consider the rights and freedoms of individuals, paying particular attention to children, who may be less aware of the risks associated with data processing. Furthermore, any addition to the list must meet strict criteria, ensuring that it serves a clear and necessary public interest objective as described in Article 23(1) of the UK GDPR.
The Secretary of State is required to consult the Information Commissioner and other stakeholders before making any changes, and any regulations must then undergo the affirmative resolution procedure, guaranteeing parliamentary scrutiny through debates in both Houses. Retaining this regulation-making power would allow the Government to respond quickly if future public interest activities are identified that should be added to the list of recognised legitimate interests. However, the robust safeguards and limitations in Clause 70 will ensure that these powers are used both sparingly and responsibly.
I turn now to Amendment 21. As was set out in Committee, there is already a relevant power in the current Data Protection Act to provide exceptions. We are relocating the existing exemptions, so the current power, so far as it relates to the purpose limitation principle, will no longer be relevant. The power in Clause 71 is intended to take its place. In seeking to reassure noble Lords, I want to reiterate that the power cannot be used for purposes other than the public interest objectives listed in Article 23(1) of the UK GDPR. It is vital that the Government can act quickly to ensure that public interest processing is not blocked. If an exemption is misused, the power will also ensure that action can be swiftly taken to protect data subjects by placing extra safeguards or limitations on it.
My Lords, as we reach the end of this important group, I thank particularly my noble friend Lady Harding for her contribution and detailed account of some of the issues being faced, which I found both interesting and valuable. I thought the example about the jazz concert requiring the combination of those different types of data was very illuminating. These proposed changes provide us with the opportunity to carefully balance economic growth with the fundamental right to data privacy, ensuring that the Bill serves all stakeholders fairly.
Amendment 24 introduces a significant consideration regarding the use of the open electoral register for direct marketing purposes. The proposal to include data from the OER, combined with personal data from other sources, to build marketing profiles creates a range of issues that require careful consideration.
Amendment 24 stipulates that transparency obligations must be fulfilled when individuals provide additional data to a data provider, and that this transparency should be reflected both in the privacy policy and via a data notification in a direct mail pack. While there is certainly potential to use the OER to enhance marketing efforts and support economic activity, we have to remain vigilant to the privacy implications. We need to make sure that individuals are informed of how and where their OER data is being processed, especially when it is combined with other data sources to build profiles.
The requirement for transparency is a positive step, but it is essential that these obligations are fully enforced and that individuals are not left in the dark about how their personal information is being used. I hope the Minister will explain a little more about how these transparency obligations will be implemented in practice and whether additional safeguards are proposed.
Amendment 49 introduces a change to Regulation 22, creating an exception for charities to use electronic mail for direct marketing in specific circumstances. This amendment enables charities to send direct marketing emails when the sole purpose is to further one or more of their charitable purposes, provided that certain conditions are met. These conditions include that the charity obtained the recipient’s contact details when the individual expressed interest in, or offered previous support for, the charity. This provision recognises the role of charities in fundraising and that their need to communicate with volunteers, supporters or potential donors is vital for their work.
However, I understand the argument that we must ensure that the use of email marketing does not become intrusive or exploitative. The amendment requires that recipients are clearly informed about their right to refuse future marketing communications and that this option is available both when the data is first collected and with every subsequent communication. This helps strike the right balance between enabling charities to raise funds for their causes and protecting individuals from unwanted marketing.
I welcome the Government’s commitment to ensuring that charities continue to engage with their supporters while respecting individuals’ right to privacy. However, it is essential that these safeguards are robustly enforced to prevent exploitation. Again, I look forward to hearing from the Minister on how the Government plan to ensure that their provisions will be properly implemented and monitored.
Amendment 50 introduces the concept of soft opt-ins for email marketing by charities, allowing them to connect with individuals who have previously expressed interest in their charitable causes. This can help charities maintain and grow their supporter base but, again, we must strike the right balance with the broader impact this could have on people in receipt of this correspondence. It is crucial that any system put in place respects individuals’ right to privacy and their ability to opt out easily. We must ensure that charities provide a clear, simple and accessible way for individuals to refuse future communications, and that this option is consistently available.
Finally, we should also consider the rules governing the use of personal data by political parties. This is, of course, an area where we must ensure that transparency, accountability and privacy are paramount. Political parties, like any other organisation, must be held to the highest standards in their handling of personal data. I hope the Government can offer some clear guidance on improving and strengthening the rules surrounding data use by political parties to ensure that individuals’ rights are fully respected and protected.
My Lords, I rise to speak to Amendments 26, 31 and 32 tabled in my name and that of my noble friend Lord Markham. I will address the amendments in reverse order.
Amendment 32 would ensure that, where a significant decision is taken by ADM, the data subject is able to request intervention by a human with sufficient competency and authority. While that is clearly the existing intent of the ADM provisions in the Bill, this amendment brings further clarity. I am concerned that, where data processors update their ADM procedures in the light of this Bill, it should be abundantly clear to them at every stage what the requirements are and that, as currently written, there may be a risk of misunderstanding. Given the significance of decisions that may be made by ADM, we should make sure this does not happen. Data subjects must have recourse to a person who both understands their problem and is able to do something about it. I look forward to hearing the Minister’s views on this.
Amendment 31 would require the Secretary of State to provide guidance on how consent should be obtained for ADM involving special category data. It would also ensure that this guidance was readily available and reviewed frequently. The amendment would provide guidance for data controllers who wish to use ADM, helping them to set clear processes for obtaining consent, thus avoiding complaints and potential litigation.
We all know that litigation can be slow, disruptive and sometimes prohibitively expensive. If we want to encourage the use of ADM so that customers and businesses can save both time and money, we should seek to ensure that the sector does not become a hotbed of litigation. The risk can be mitigated by providing ample guidance for the sector. For relatively minimal effort on the part of the Secretary of State, we may be able to facilitate substantial growth in the use and benefits of ADM. I would be most curious to hear the Minister’s opinions on this matter and, indeed, the opinions of noble Lords more broadly.
Amendment 26 would insert the five principles set out in the AI White Paper published by the previous Government, requiring all data controllers and processors who partake in AI-driven ADM to have due regard for them. In the event that noble Lords are not familiar with these principles, they are: safety, security and robustness; appropriate transparency and explainability; fairness; accountability and governance; and contestability and redress.
These principles for safe AI are based on those originally developed with the OECD and have been the subject of extensive consultation. They have been refined and very positively received by developers, public sector organisations, private sector organisations and civil society. They offer real, and popular, safeguards against the risks of AI while continuing to foster innovation.
There is a requirement. Going back to the issue of principles, which was discussed earlier on, one of the existing principles—which I am now trying to locate and cannot—is transparency. I expect that we would make as much of the information public as we can in order to ensure good decision-making and assure people as to how the decisions have been reached.
I thank all noble Lords and the Minister for their comments and contributions to what has been a fascinating debate. I will start by commenting on the other amendments in this group before turning to those in my name.
First, on Amendments 28 and 29, I am rather more comfortable with the arrangements for meaningful human intervention set out in the Bill than the noble Lord, Lord Clement-Jones. For me, either a decision has meaningful human intervention or it does not. In the latter case, certain additional rights kick in. To me, that binary model is clear and straightforward, and could only be damaged by introducing some of the more analogue concepts such as “predominantly”, “principally”, “mainly” or “wholly”, so I am perfectly comfortable with that as it is.
However, I recognise that puts a lot of weight on to the precise meaning of “meaningful human involvement”. Amendment 36 in the name of the noble Lord, Lord Clement-Jones, which would require the Secretary of State to produce a definition of “meaningful human involvement” in ADM in collaboration with the ICO, seems to take on some value in those circumstances, so I am certainly more supportive of that one.
As for Amendments 34 and 35 in the names of the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Freeman, I absolutely recognise the value and potential of efficacy; I agree it is a very valuable term. I have more faith in the rollout and use of the ATRS but on a non-statutory basis, believing, as I do, that this would allow it to continue to develop in an agile and adaptive manner. I welcome the Minister’s words on this subject, and for now I remain comfortable that the ATRS is the direction forward for that.
I turn to the amendments in my name. I thank all noble Lords and, indeed, the Minister for their comments and contributions regarding Amendments 31 and 32. I very much take the Minister’s point that definitions of consent feature elsewhere in the Bill. That reduces my concern somewhat.
However, I continue to strongly commend Amendment 26 to the House. I believe it will foster innovation while protecting data rights. It is popular with the public and with private sector stakeholders. It will bring about outcomes that we all want to see in AI safety without stifling this new and exciting technology. In the absence of an AI Bill—and possibly even in the presence of one—it is the only AI-specific legislation that will be around. It is important somehow to get those AI principles in the Bill, at least until an AI Bill comes along. With this in mind, I wish to test the opinion of the House.
My Lords, I will speak very briefly, given the hour, just to reinforce three things that I have said as the wingman to the noble Baroness, Lady Kidron, many times, sadly, in this Chamber in child safety debates. The age-appropriate design code that we worked on together and which she championed a decade ago has driven real change. So we have evidence that setting in place codes of conduct that require technology companies to think in advance about the potential harms of their technologies genuinely drives change. That is point one.
Point two is that we all know that AI is a foundational technology which is already transforming the services that our children use. So we should be applying that same principle that was so hard fought 10 years ago for non-AI digital to this foundational technology. We know that, however well meaning, technology companies’ development stacks are always contended. They always have more good things that they think they can do to improve their products for their consumers, that will make them money, than they have the resources to do. However much money they have, they are just contended. That is the nature of technology businesses. This means that they never get to the safety-by-design issues unless they are required to. It was no different 150 or 200 years ago as electricity was rolling through the factories of the mill towns in the north of England. It required health and safety legislation. AI requires health and safety legislation. You start with codes of conduct and then you move forward, and I really do not think that we can wait.
My Lords, Amendment 41 aims to establish a code of practice for the use of children’s data in the development of AI technologies. In the face of rapidly advancing AI, it is, of course, crucial that we ensure children’s data is handled with the utmost care, prioritising their best interests and fundamental rights. We agree that AI systems that are likely to impact children should be designed to be safe and ethical by default. This code of practice will be instrumental in guiding data controllers to ensure that AI development and deployment reflect the specific needs and vulnerabilities of children.
However, although we support the intent behind the amendment, we have concerns, which echo concerns on amendments in a previous group, about the explicit reference to the UN Convention on the Rights of the Child and general comment 25. I will not rehearse my comments from earlier groups, except to say that it is so important that we do not have these explicit links to international frameworks, important as they are, in UK legislation.
In the light of this, although we firmly support the overall aim of safeguarding children’s data in AI, we believe this can be achieved more effectively by focusing on UK legal principles and ensuring that the code of practice is rooted in our domestic context.
I thank the noble Lord, Lord Clement-Jones, for Amendment 33, and the noble Baroness, Lady Kidron, for Amendment 41, and for their thoughtful comments on AI and automated decision-making throughout this Bill’s passage.
The Government have carefully considered these issues and agree that there is a need for greater guidance. I am pleased to say that we are committing to use our powers under the Data Protection Act to require the ICO to produce a code of practice on AI and solely automated decision-making through secondary legislation. This code will support controllers in complying with their data protection obligations through practical guidance. I reiterate that the Government are committed to this work as an early priority, following the Bill receiving Royal Assent. The secondary legislation will have to be approved by both Houses of Parliament, which means it will be scrutinised by Peers and parliamentarians.
I can also reassure the noble Baroness that the code of practice will include guidance about protecting data subjects, including children. The new ICO duties set out in the Bill will ensure that where children’s interests are relevant to any activity the ICO is carrying out, it should consider the specific protection of children. This includes when preparing codes of practice, such as the one the Government are committing to in this area.
I understand that noble Lords will be keen to discuss the specific contents of the code. The ICO, as the independent data protection regulator, will have views as to the scope of the code and the topics it should cover. We should allow it time to develop those thoughts. The Government are also committed to engaging with noble Lords and other stakeholders after Royal Assent to make sure that we get this right. I hope noble Lords will agree that working closely together to prepare the secondary legislation to request this code is the right approach instead of pre-empting the exact scope.
The noble Lord, Lord Clement-Jones, mentioned edtech. I should add—I am getting into a habit now—that it is discussed in a future group.
I have added my name to this amendment, about which the noble Lord, Lord Clement-Jones, has spoken so eloquently, because of the importance to our economic growth of maintaining data adequacy with the EU. I have two points to add to what he said.
First, as I said and observed on some occasions in Committee, this is legislation of unbelievable complexity. It is a bad read, except if you want a cure for insomnia. Secondly, it has the technique of amending and reamending earlier legislation. Thirdly, this is not the time to go into detail of the legal problems that arise, some of which we canvassed in Committee, as to whether this legislation has no holes in it. I do not think I would be doing any favours either to the position of the United Kingdom or to those who have been patient enough to stay and listen to this part of the debate by going into any of those in any detail, particularly those involving the European Convention on Human Rights and the fundamental charter. That is my first point, on the inherent nature of the legislative structure that we have created. As I said earlier, I very much hope we will never have such legislation again.
Secondly, in my experience, there is a tendency among lawyers steeped in an area or department often to feel, “Well, we know it’s all right; we built it. The legislation’s fine”. Therefore, there is an additional and important safeguard that I think we should adopt, which is for a fresh pair of eyes, someone outside the department or outside those who have created the legislation, to look at it again to see whether there are any holes in it. We cannot afford to go into this most important assessment of data adequacy without ensuring that our tackle is in order. I appreciate what the Minister said on the last occasion in Committee—it is for the EU to pick holes in it—but the only prudent course when dealing with anything of this complexity in a legal dispute or potential dispute is to ensure that your own tackle is in order and not to go into a debate about something without being sure of that, allowing the other side to make all the running. We should be on top of this and that is why I very much support this amendment.
My Lords, I thank the noble Lord, Lord Clement-Jones—as ever—and the noble and learned Lord, Lord Thomas, for tabling Amendment 37 in their names. It would introduce a new clause that would require the Secretary of State to carry out an impact assessment of this Act and other changes to the UK’s domestic and international frameworks relating to data adequacy before the European Union’s reassessment of data adequacy in June this year.
I completely understand the concerns behind tabling this amendment. In the very worst-case scenario, of a complete loss of data adequacy in the assessment by the EU, the effect on many businesses and industries in this country would be knocking at the door of catastrophic. It cannot be allowed to happen.
However, introducing a requirement to assess the impact of the Bill on the European Union data adequacy decision requires us to speculate on EU intentions in a public document, which runs the risk of prompting changes on its part or revealing our hand to it in ways that we would rather not do. It is important that we do two things: understand our risk, without necessarily publishing it publicly; and continue to engage at ministerial and official level, as I know we are doing intensively. I think the approach set out in this amendment runs the risk of being counterproductive.
I thank the noble Lord, Lord Clement-Jones, for his amendment, and the noble and learned Lord, Lord Thomas, for his contribution. I agree with them on the value and importance placed on maintaining our data adequacy decisions from the EU this year. That is a priority for the Government, and I reassure those here that we carefully considered all measures in the light of the EU’s review of our adequacy status when designing the Bill.
The Secretary of State wrote to the House of Lords European Affairs Committee on 20 November 2024 on this very point and I would be happy to share this letter with noble Lords if that would be helpful. The letter sets out the importance this Government place on renewal of our EU adequacy decisions and the action we are taking to support this process.
It is important to recognise that the EU undertakes its review of its decisions for the UK in a unilateral, objective and independent way. As the DSIT Secretary of State referenced in his appearance before the Select Committee on 3 December, it is important that we acknowledge the technical nature of the assessments. For that reason, we respect the EU’s discretion about how it manages its adequacy processes. I echo some of the points made by the noble Viscount, Lord Camrose.
That being said, I reassure noble Lords that the UK Government are doing all they can to support a swift renewal of our adequacy status in both technical preparations and active engagement. The Secretary of State met the previous EU Commissioner twice last year to discuss the importance of personal data sharing between the UK and EU. He has also written to the new Commissioner for Justice responsible for the EU’s review and looks forward to meeting Commissioner McGrath soon.
I also reassure noble Lords that DSIT and the Home Office have dedicated teams that have been undertaking preparations ahead of this review, working across government as needed. Those teams are supporting European Commission officials with the technical assessment as required. UK officials have met with the European Commission four times since the introduction of the Bill, with future meetings already in the pipeline.
(2 months, 2 weeks ago)
Lords Chamber
The detection of breaks is done from land, but the ability to repair them is through an agreement with the commercial companies, which pay into a fund that allows a ship to be on 24/7 standby to provide protection. That is paid for by the companies that put the cables in place.
My Lords, we of course recognise and share the Government’s and House’s concern about increased Russian military activity around these undersea cables. I was pleased that the Minister a couple of times referenced the risk assessments going on, but can he tell the House a little more and expand on his earlier answers about those risk assessments? How do they take place and how often do they occur?
The national risk assessment is undertaken regularly and led by the Cabinet Office. In this instance, DSIT is the department responsible for the risk to the cables overall, but it is in collaboration with the MoD, the Cabinet Office and others, particularly in relation to assessing risks other than those that I have outlined.
(3 months ago)
Lords Chamber
My Lords, what a pleasure it is to address this compelling, balanced and, in my opinion, excellent report on large language models and generative AI. I thank not just my noble friend Lady Stowell but all noble Lords who were involved in its creation. Indeed, it was my pleasure at one point to appear before the committee in my former ministerial role. As ever, we are having an excellent debate today. I note the view of the noble Lord, Lord Knight, that it tends to be the usual suspects in these things, but very good they are too.
We have heard, particularly from my noble friend Lady Stowell and the noble Baroness, Lady Featherstone, about the need to foster competition. We have also heard about the copyright issue from a number of noble Lords, including the noble Baronesses, Lady Featherstone, Lady Wheatcroft and Lady Healy, and I will devote some more specific remarks to that shortly.
A number of speakers, and I agree with them, regretted the cancellation of the exascale project and got more deeply into the matter of compute and the investment and energy required for it. I hope the Minister will address that without rehearsing all the arguments about the black hole, which we can all probably recite for ourselves.
We had a very good corrective from the noble Lords, Lord Strasburger and Lord Griffiths of Bury Port, and my noble friend Lord Kamall, that the risks are far-reaching and too serious to treat lightly. In particular, I note the risk of deliberate misuse by powers out of our control. We heard about the need going forward for, if possible, greater clarity about regulatory plans and comparisons with the EU AI Act from my noble friend Lord Ranger. I very much enjoyed and responded to the remarks by the noble Lord, Lord Tarassenko, about data as a sovereign asset for the UK, whether in healthcare or anything else.
These points and all the points raised in the report underscore the immense potential of AI to revolutionise key sectors of our economy and our society, while also highlighting critical risks that must be addressed. I think we all recognise at heart the essential trade-off in AI policy. How do we foster the extraordinary innovation and growth that AI promises while ensuring it is deployed in ways that keep us safe?
However, today I shall focus more deeply on two areas. The first is copyright offshoring and the second is regulation strategy overall.
The issue of copyright and AI is deeply complex for many reasons. Many of them were very ably set out by my noble friend Lord Kamall. I am concerned that any solution that does not address the offshoring problem is not very far from pointless. Put simply, we could create between us the most exquisitely balanced, perfectly formed and simply explained AI regulation, but any AI lab that did not like it could, in many cases, scrape the same copyrighted content in another jurisdiction with regulations more to its liking. The EU’s AI Act addresses this problem by forbidding the use in the EU of AI tools that have infringed copyright during their training.
Even if this is workable in the EU—frankly, I have my doubts about that—there is a key ingredient missing that would make it workable anywhere. That ingredient is an internationally recognised technical standard to indicate copyright status, ownership and licence terms. Such a standard would allow content owners to watermark copyrighted materials. Whether the correct answer is pursuing an opt in or opt out of TDM is a topic for another day, but it would at least enable that to go forward technically. Crucially, it would allow national regulators to identify copyright infringements globally. Will the Minister say whether he accepts this premise and, if so, what progress he is aware of towards the development of an international technical standard of this kind?
I turn now to the topic of AI regulation strategy. I shall make two brief points. First, as a number of noble Lords put it very well, AI regulation has to adapt to fast-moving technology changes. That means that it has to target principles, rather than specific use cases where possible. Prescriptive regulation of technology does not just face early obsolescence, but relies fatally on necessarily rigid definitions of highly dynamic concepts.
Secondly, the application of AI is completely different across sectors. That means that the bulk of regulatory heavy lifting needs to be done by existing sector regulators. As set out in the previous Government’s White Paper, this work needs to be supported by central functions. Those include horizon scanning for future developments, co-ordination where AI cuts across sectors, supporting AI skills development, the provision of regulatory sandboxes and the development of data and other standards such as the ATRS. If these and other functions were to end up as the work of a single AI regulatory body, then so much the better, but I do not believe that such an incorporation is mission critical at this stage.
I was pleased that the committee’s report was generally supportive of this position and, indeed, refined it to great effect. Do the Government remain broadly aligned to this approach? If not, where will the differences lie?
While many of us may disagree to one degree or another on AI policy, I do not believe there is really any disagreement about what we are trying to achieve. We must seize this moment to champion a forward-looking AI strategy—one that places the UK at the forefront of global innovation while preserving our values of fairness, security, and opportunity for all.
Like the committee—or as we have heard from the noble Lord, Lord Griffiths, like many members of the committee—I remain at heart deeply optimistic. We can together ensure that AI serves as a tool to enhance lives, strengthen our economy, and secure our national interests. This is a hugely important policy area, so let me close by asking the Minister if he can update this House as regularly and frequently as possible on the regulation of AI and LLMs.
(3 months ago)
Lords Chamber
This is a critical question. The Royal Institute of Navigation has recently—in fact, today—launched a paper on how to prepare for this. It is something that all critical national infrastructure will be urged to look at, to have a plan for what would happen in the event of GPS failure. There is a longer-term question about the alternatives to space-based navigation and there is active work going on in the UK on terrestrial approaches, including the use of quantum systems to try to get a robust secondary approach to PNT.
My Lords, now that over 70 nations have their own space agency, how will the Government pursue the widest and most effective possible international co-operation in support of Astra Carta’s aim,
“to care for the infinite wonders of the universe”?
There is a series of international collaborations in place. We are a member of the European Space Agency. A large proportion of the £1.9 billion of the UK Space Agency money goes to the European Space Agency and our collaborators there. We also spend through the MoD and through UKRI. We are members of the UN bodies that deal with the question of a sustainable space sector and space environment. The space environment is increasingly important and needs attention. We will continue to raise this question at the UN bodies.
(3 months, 3 weeks ago)
Lords Chamber
My Lords, it has been an absolutely brilliant debate, and I join others in thanking the noble Viscount, Lord Stansgate, for bringing it forward. I also join others in congratulating the noble Baroness, Lady Freeman. Many years from now, eventually “Walking with Dinosaurs” will be a fantastic title for her memoir, but we are not there yet. I have been asked to slightly curtail my remarks and I am very happy to do that. I hope noble Lords will forgive me if I do not reflect on everything that has been said in the debate, but rather offer, just to begin with, some of my personal highlights from what I heard.
As a theme, it is clear that we are as one in deeply recognising and valuing the contribution that science and technology can and will make to our economy. Sadly, and frustratingly, many different approaches have been advanced as to how we can best finance that. I hope that we can be on the path of constant improvement to get more investment into this crucial space. I noted a sense of ruefulness from my noble friend Lord Willetts as he said that the role of the Science Minister was to extract money from the Treasury; I am pleased to say that we have somewhat moved on from this position.
I was very struck by the noble Baroness, Lady Neville-Jones, reminding us of the growing importance of international rivalry in this space. I think that is going to play an increasing part in our deliberations here.
The noble Lords, Lord St John of Bletso, Lord Tarassenko and Lord Drayson, asked, one way or another: where are our Metas or Alphabets? It is a question that certainly bugs me. Let us hope that, between us, we can move towards more of an answer. The noble Baroness, Lady Bowles, spoke powerfully about the issue of IP retention in universities, and that is clearly something we need to continue to look at.
The noble Lord, Lord Lucas, raised the issue of standards and regulations. There are not many silver bullets in technology regulation, but standards will be one of them. International global standards, particularly for instance with the copyright issue in AI, are going to be a big part of that solution.
I absolutely share the wish of the right reverend Prelate the Bishop of Newcastle to foster a faster-growing tech community in the north-east of England. If I may, I commend to her the work of the brilliant organisation CyberNorth; she may know it already.
Innovation is not merely an advantage; it is the foundation of economic growth and global competitiveness. Science and tech are no longer confined to laboratories or research institutions; they are part of the fabric of almost all the work we are doing of any kind across this country.
As of last year, we are one of three countries in the world with a trillion-dollar tech sector. Today, that sector contributes £150 billion annually to the UK economy, a figure that reflects not only the sector’s rapid growth to this point but its remarkable potential for expansion. With emerging fields that have been mentioned many times—quantum, AI, engineering biology, and so on—we have the opportunity to cement the UK’s status as a global leader in scientific and technological innovation.
Of course, the contributions of science and tech, as I enjoyed hearing from the noble Baroness, Lady Bennett of Manor Castle, are not limited to economic growth. They enhance our resilience in the face of global challenges. I frequently argue that for all the amazing scientific advances we have seen over recent years, perhaps the most impactful was the development of the Covid vaccine, which I think we can all agree underscored, among other things, the power of UK-led scientific innovation, saving lives and demonstrating the critical impact of robust scientific infrastructure.
Investment in science and technology is also an investment in the workforce of tomorrow. The noble Lord, Lord Mair, and others raised this point very powerfully, as did my noble friend Lord Willetts and the noble Lord, Lord Taylor of Warwick. By prioritising education in STEM fields and by fostering partnerships between industry and academia, we are equipping future generations with the skills and knowledge required to thrive in a rapidly evolving landscape. It is not only essential for individual opportunity but vital to our ongoing economic competitiveness.
I want to address some pressing concerns raised by yesterday’s Budget. The Chancellor announced a significant allocation of £20.4 billion for research and development, including £6.1 billion aimed specifically at protecting core research funding. There is no doubt that this funding is crucial for advancing the core of our scientific curriculum. However, the research community has expressed some apprehensions regarding the implications of this. The Budget allocates an increased £2.7 billion for association with EU research programmes and covers the cost of the old Horizon Europe guarantee scheme. This means we are committing with this money not only to new funding but to managing the cost of past obligations. I would welcome some clarity from the Minister on how this is going to break down.
Further, as raised by my noble friend Lord Waldegrave, the abruptness of the decision over the summer to cancel the exascale computing investment—which was, by the way, fully funded through DSIT’s budget, contrary, I am afraid, to statements from the Government that I have heard from time to time—must stand as a significant red flag to AI investors, if only for its unexpectedness and suddenness. When we take this together with the additional costs and risks of hiring staff, the reduction of incentives to invest in technology and the—in my view, rather aggressive—treatment of non-domiciled investors, I think we have grounds for concern. I wonder whether, when the Minister rises, he could tell us to what he attributes our leadership today in science and tech. Is he concerned that these decisions may diminish that leadership and, if so, what do the Government propose to do about it?
That said, I am keen to close on a note of excitement and positivity. Ray Kurzweil, of “singularity” fame, argues that the time between major advances in science and technology diminishes exponentially. If he is right, the technologies available to us at the end of this Parliament will be truly staggering. So let us all be working together to make sure that as many of those breakthroughs as possible are delivered and safely exploited in this science and tech superpower, the United Kingdom.