Lords Chamber

My Lords, I declare an interest as chair of the Authors’ Licensing and Collecting Society. I offer the unequivocal and steadfast support from the Liberal Democrat Benches for Motion A1 in the name of the noble Baroness, Lady Kidron, which introduces Amendment 49F in lieu of Amendment 49D.
It is absolutely clear that the noble Baroness’s speeches become better and more convincing the more we go on. Indeed, the arguments being made today for these amendments become better and more convincing as time goes on. I believe we should stand firm, as the noble Lord, Lord Berkeley, said.
Time and time again, we have all had to address the narrative, stated in the consultation paper and repeated by Ministers, suggesting there is uncertainty or a lack of clarity in existing UK copyright law regarding AI training. We have heard that the Secretary of State has recently acknowledged that the existing copyright law is “very certain”, but, as I said to the noble Lord, Lord Liddle, he has also stated that
“it is not fit for purpose”.—[Official Report, Commons, 22/5/25; col. 1234.]
That makes the narrative even worse than saying that copyright law is uncertain.
As the noble Baroness, Lady Kidron, has rightly asserted, we do not need to change copyright law. It is the view of many that existing law is clear and applies to the commercial use of copyrighted works for AI training. The issue is not a deficient law but rather the ability to enforce it in the current AI landscape. As the noble Baroness has also profoundly put it—I have got a number of speeches to draw on, as you can see—what you cannot see, you cannot enforce. The core problem is a lack of transparency from AI developers: without knowing what copyrighted material has been used to train models and how it was accessed, creators and rights holders are unable to identify potential infringements and pursue appropriate licensing or legal action.
In striking down previous Lords amendments, the Government have suggested that this House was at fault for using the wrong Bill. They have repeatedly claimed that it is too soon for transparency and too late to prevent stealing, and they have asserted that accepting the Lords transparency amendment would prioritise one sector over another. But that is exactly what the Government are doing. They have suggested an expert working group, an economic impact assessment, a report on the use of copyright, and then, I think, a report on progress in what the noble Baroness the Minister had to say. But, as many noble Lords have said today, none of that gives us the legislative assurance—the certainty, as the noble Lord, Lord Brennan, put it—that we need in these circumstances.
The Government have objected to being asked to introduce regulations because of financial privilege, and now, it seems—I can anticipate what the noble Baroness the Minister is going to say—are objecting to the requirement to bring forward a draft Bill with this amendment. But the Government are perfectly at liberty to bring forward their own amendment allowing for transparency via regulations, a much more expeditious and effective route that the House has already overwhelmingly supported. Transparency is the necessary foundation for a functioning licensing market, promotes trust between the AI sector and the creative industries, and allows creators to be fairly compensated when their work contributes value to AI models.
The Government have asked for a degree of trust for their plans. This amendment, while perhaps less than creators deserve—I think the noble Baroness, Lady Kidron, described it as the bare minimum—is a step that would help earn that trust. It is this Government who can do that, and I urge them to heed the words of their own Back-Benchers: the noble Lords, Lord Cashman, Lord Rooker and Lord Brennan, all asked the Government to find a compromise.
I urge all noble Lords, in the face of a lack of compromise by the Government, to support Motion A1.
My Lords, as this is the third round of ping-pong, as many noble Lords have observed, I will speak very briefly. If the noble Baroness the Minister has not by now understood how strongly noble Lords on all sides of the House feel about this issue, it may be too late anyway.
The noble Baroness, Lady Kidron, has made an increasingly powerful case for the Government to act in defence of the rights of copyright owners, and we continue to call on the Government to listen. We have of course discussed this at great length. The noble Baroness has tabled a new Motion which would require Ministers to make a Statement and bring forward a draft Bill. Given that the Minister has expressed her sympathy for the concerns of your Lordships’ House previously, surely this new Motion would be acceptable to the Government as a pathway toward resolving the problem, and we again urge the Government to accept it.
However, whatever choice the Government make—I do not think anyone could claim that any part of this is an easy problem, as my noble friend Lord Vaizey pointed out—many of us are frustrated by the absence of agility, boldness and imagination in their approach. That said, speaking at least from the Front Bench of a responsible Opposition, we take the view that we cannot engage further in protracted ping-pong. We are a revising Chamber, and, although it is right to ask the Government to think again when we believe they have got it wrong, we feel we must ultimately respect the will of the elected Chamber.
My Lords, I must once again thank all noble Lords who have spoken during this debate, and of course I continue to recognise the passion and the depth of feeling on this issue.
I did not think I needed to reiterate this, but we absolutely believe in the importance of the creative sector, and of course we want it to have a flourishing future. In previous debates, I have spelled out all the work that we are doing with the creative sector and how fundamental it is to our economic planning going forward. I do not intend to go over that, but I have said it time and again from this Dispatch Box. Our intention is to find a substantial and workable solution to this challenge that we are all facing.
I also reassure the noble Lord, Lord Forsyth, and others that we have had numerous discussions with the noble Baroness, Lady Kidron, and others and have of course taken those discussions seriously. As a result, we have come today with an honest and committed plan to work together to resolve the contentious issue of AI and copyright both quickly and effectively.
The noble Viscount will know that schools already have a policy, or are expected by the Department for Education to have one, to ensure that children do not have access to phones in schools. That is a clear policy that the Government are keen to reiterate. What we are talking about here is what children do outside the school environment. From July, the children’s code of practice will provide much greater reassurance and protection for children. Services will be expected to provide age-appropriate experiences online by protecting children from bullying, violent content, abuse and misogynistic content. In other words, there will be much more forceful regulation to specifically protect children. Obviously, we will continue to monitor the codes of practice, but there are specific new powers under the code that come into effect in July and we want to see their impact.
My Lords, I very much hope the Government are actively tracking and measuring the effects of schools’ own policies on mobile phone use during the school day. If so, what conclusions can be drawn about the wisdom of an outright ban? If they are not tracking that information, why not?
My Lords, as I said, the Department for Education’s mobile phones in schools guidance is clear that schools should prohibit the use of devices with smart technology throughout the school day, including during lessons, transitions and breaks. The Government expect all schools to take steps in line with that. Beyond that, my own department, DSIT, has commissioned a piece of research to look at young people’s use of social media and their access to it throughout the day. The outcome of the research is due very soon and we will learn the lessons from that. Up until now, the evidence has not been as clear-cut as we would like. We hope to learn on an international basis how to protect young people throughout the day, and will apply those lessons once the evidence has been assessed.
My Lords, I thank the Minister for her introduction. In view of the remarks made a week ago by the Minister, the noble Lord, Lord Vallance, who referred to government datasets from the past 15 years which mixed up sex and gender as “accurate”—or perhaps “sort of accurate”, because the exchange in the report varied slightly—do the Government defend the accuracy of those datasets, even though they were, and continue to be, muddled because no one knew what “sex” meant? Are we expected to rely on the accuracy of data which mixed up sex and gender—that is, male and female—or do the Government mean that we cannot defend those data because they were only sort of accurate? I am not entirely clear what the Government are telling us about relying on historic data.
I am also concerned about what insight this gives into what the Government intend to regard as accurate from now on. I continue to think that the Government are on quite a sticky wicket in regard to data accuracy on sex and gender and their refusal to enshrine true sex accuracy in this Bill. We continue to have a bit of a fudge, which shakes confidence in their intentions. This is a huge missed opportunity, but I realise we are not having a further vote.
I shall ask just one question. Clause 29 allows for the Secretary of State to publish supplementary codes for DVS providers. Will the Government commit to publishing a supplementary code to ensure that DVS providers understand how to verify sex accurately and avoid what has been described by the Government Benches as the “muddle” of the last 15 years?
My Lords, I thank all noble Lords who have contributed to this important debate. I will first speak to the issues around accurate recording of sex data before coming on to talk about scientific research.
Throughout the passage of the Bill, we have been clear that digital verification services will be a significant driver of data reliability and productivity. They are absolutely dependent on accurate recording and rigorous management of data. We supported my noble friend Lord Lucas in his original amendments on Report, and we tabled our own amendments from the Front Bench for Lords consideration of Commons amendments last week.
I am grateful to the Minister for her engagement on this issue, and I know she has taken our concerns seriously. That said, we remain concerned about the accurate recording and management of sex data, especially in light of the recent judgment of the Supreme Court. The Government must continue to remain vigilant and to take steps to ensure datasets held by the Government and arm’s-length bodies are, and continue to be, accurate.
My Lords, I declare an interest as chair of the Authors’ Licensing and Collecting Society.
I express the extremely strong support of all on these Benches for Motion C1, proposed by the noble Baroness, Lady Kidron. I agree with every speech that we have heard so far in today’s debate—I did not hear a single dissenting voice to the noble Baroness’s Motion. Once again, I pay tribute to her; she has fought a tireless campaign for the cause of creators and the creative industries throughout the passage of the Bill.
I will be extremely brief, given that we want to move to a vote as soon as possible. The House has already sent a clear message by supporting previous amendments put forward by the noble Baroness, and I hope that the House will be as decisive today. As we have heard this afternoon, transparency is crucial. This would enable the dynamic licensing market that is needed, as we have also heard. How AI is developed and who it benefits are two of the most important questions of our time—and the Government must get the answer right. As so many noble Lords have said, the Government must listen and must think again.
My Lords, it is probably redundant to pay tribute to the noble Baroness, Lady Kidron, for her tenacity and determination to get to a workable solution on this, because it speaks for itself. It has been equally compelling to hear such strong arguments from all sides of the House and all Benches—including the Government Benches—that we need to find a solution to this complex but critical issue.
Noble Lords will recall that, on these Benches, we have consistently argued for a pragmatic, technology-based solution to this complex problem, having made the case for digital watermarking both in Committee and on Report. When we considered the Commons amendments last week, we worked closely with the noble Baroness, Lady Kidron, to find a wording for her amendment which we could support, and were pleased to be able to do so and to vote with her.
It is important that the Government listen and take action to protect the rights of creatives in the UK. We will not stop making the case for our flourishing and important creative sector. We have put that case to Ministers, both in your Lordships’ House and at meetings throughout the passage of the Bill. As a responsible Opposition, though, it is our view that we must be careful about our approach to amendments made by the elected House. We have, I hope, made a clear case to the Government here in your Lordships’ House and the Government have, I deeply regret to say, intransigently refused to act. I am afraid that they will regret their failure to take this opportunity to protect our creative industries. Sadly, there comes a point where we have to accept that His Majesty’s Government must be carried on and the Government will get their Bill.
Before concluding, I make two final pleas to the Minister. First, as others have asked, can she listen with great care to the many artists, musicians, news organisations, publishers and performers who have called on the Government to help them more to protect their intellectual property?
Secondly, can she find ways to create regulatory clarity faster? The process that the Government envisage to resolve this issue is long—too long. Actors on all sides of the debate will be challenged by such a long period of uncertainty. I understand that the Minister is working at pace to find a solution, but not necessarily with agility. I echo the brilliant point made by my noble friend Lady Harding that agility and delivering parts of the solution are so important to pick up the pace of this, because perfect is the enemy of good in this instance. When she gets up to speak, I hope that the Minister will tell us more about the timeline that she envisages, particularly for the collaboration of DSIT and DCMS.
This is a serious problem. It continues to grow and is not going away. Ministers must grip it with urgency and agility.
My Lords, once again, I acknowledge the passion and depth of feeling from those noble Lords who have spoken and, again, I emphasise that we are all on the same side here. We all want to see a way forward that protects our creative industries, while supporting everyone in the UK to develop and benefit from AI.
Of course, we have listened, and are continuing to listen, to the views that have been expressed. We are still going through the 11,500 responses to our consultation, and I have to tell noble Lords that people have proposed some incredibly creative solutions to this debate which also have a right to be heard.
This is not about Silicon Valley; it is about finding a solution for the UK creative and AI tech sectors that protects both. I am pleased that the noble Baroness, Lady Kidron, now endorses the Government’s reports as the right way to identify the right solutions; however, I will address some of her other points directly.
First, she talked about her amendment providing certainty to the creative industries. I can provide that certainty now, as Minister Bryant did in the other place last week. Copyright law in the UK is unchanged by this Bill. Works are protected unless one of the exemptions, which have existed for some time, such as those for teaching and research, applies, or the rights holders have guaranteed permission for their work to be used. That is the law now and it will be the law tomorrow.
I also want to reassure my noble friend Lord Cashman and the noble Baroness, Lady Benjamin, who talked about us stripping away rights today. I want to be clear that the Government have proposed no legislation on this issue; the Bill does no such thing. The amendment from the noble Baroness, Lady Kidron, would provide no certainty other than that of more uncertainty—of continuous regulations, stacked one upon another in a pile of instruments. This cannot be what anyone desires, and it is why the Government do not agree to it.
The noble Baronesses, Lady Kidron and Lady Harding, suggested that her amendment, requiring regulations on only one issue ahead of all others and via a different process, would somehow leave Parliament free to consider all the other issues independently. I am afraid that this is not the case; this is a policy decision with many moving parts. Jumping the gun on one issue will hamstring us in reaching the best outcome on all the others, especially because, as I said earlier, this is a global issue, and we cannot ring-fence the UK from the rest of the world.
We reject the suggestion that we are being complacent on this. I say to my noble friend Lord Brennan that I of course agree that the UK should be a global leader, but we need to make sure that we have the right approach before we plant our flag on that. There is a reason that no other territory has cracked this either. The EU, for example, is still struggling to find a workable solution. It is not easy, but we are working quickly.
The noble Baroness once again raised enforcement, and she has left the mechanism to the discretion of the Government in her new amendment. While we are pleased that the noble Baroness has changed her approach on enforcement in light of the Commons reasons, we all agree that for new transparency requirements to work, enforcement mechanisms will be needed and must be effective.
The noble Baroness said she has tried everything to persuade the Government, and I would have welcomed a further meeting with her to discuss this and other aspects of her revised proposals. Unfortunately, however, that invitation was not accepted. To reiterate, in spite of all our different positions on this Bill, we are all working towards the same goal.
Following proper consideration of consultation responses and publication of our technical reports, we will bring forward comprehensive and workable proposals that will give certainty to all sides. If the House has strong views when the proposals come forward, there will of course be the opportunity for us to debate them. We have made it clear that our reports will be delivered within 12 months and earlier if we can. I remind noble Lords that the amendments in the name of the noble Baroness, Lady Kidron, will not take effect for 18 months. There is no instant solution, however much noble Lords may wish to hear of one today. Neither the noble Baroness’s amendment nor ours is an instant solution; it will take time, and we have to recognise that.
We do not believe, in the meantime, that protracted ping-pong on this one remaining issue in the Bill is in anyone’s interest. The elected House has spoken twice and through legislative and non-legislative commitments, the Government have shown they are committed to regulating quickly and effectively. Therefore, I hope the noble Baroness and your Lordships’ House will accept these assurances and continue working with the Government to make progress on this important issue.
A lot has been said in this debate about the importance of transparency. To my noble friend Lord Brennan, I say that the Government have said from the very beginning that we will prioritise the issue of transparency in all the work we do. Transparency is essential to licensing; licensing is essential to the question of remuneration; and remuneration is essential to AI being high quality, effective and able to be deployed in the UK. These are the challenges we are facing, but all these things have to be addressed in the round and together, not in a piecemeal fashion. However, noble Lords are absolutely right to say that, without transparency, it is, of course, worth nothing.
On enforcement, the Government are sympathetic to the argument that it is a different matter for individuals to enforce their rights via the courts as opposed to large creative agencies. This is the kind of thing that the working groups I have mentioned will explore. As Minister Bryant said last week, we want to make the new regime effective for everybody, large and small.
I will finish with some things I am sure we can all agree on: the urgency of the problem; the need to be evidence-based; that solutions will require collaboration between the creative and the AI sectors; and the solutions must work for everyone. I assure the noble Baroness, Lady Kidron, that everybody will have a seat at the table in the discussions. I hope noble Lords will agree with me and truly support the innovators and creators in the UK by voting with the Government on this Motion, which will deliver a full, comprehensive package that will make a difference to the creative sector for years to come in this country.
My Lords, I will speak to some of the amendments made in the other place, starting with Amendments 1 to 31. These will ensure that smart data schemes can function optimally and that Part 1 is as clear as possible. Similarly, Amendments 35 to 42 from the other place reflect discussions on the national underground asset register with the devolved Governments. Finally, Amendments 70 to 79 make necessary consequential updates to the final provisions of the Bill and some updates to Schedules 11 and 15.
I will now speak to the amendments tabled by noble Lords, starting with those relating to sex data. Motion 32A disagrees with the amendment to remove Clause 28(3) and (4), and instead proposes changes to the initial drafting of those subsections. These would require the Secretary of State, when preparing the trust framework, to assess whether the 15 specified public authorities can reliably ascertain the data they collect, record and share. Amendment 32B limits this assessment to sex data, as defined through Amendment 32C; that definition limits sex to biological sex only and provides a definition of acquired gender.
It is also relevant to speak now to Motion 52A, which disagrees with the amendment to remove Clause 140 and, instead, suggests changes to the drafting. Clause 140, as amended by Amendment 52B, seeks, through a regulation-making power, to give the Secretary of State the ability to define sex as being only biological sex in certain areas or across public sector data processing more widely. Let me be clear that this Government accept the recent Supreme Court judgment on the definition of sex for the purposes of equality legislation. We need to work through the effects of this ruling holistically and with care, sensitivity and—dare I say it—kindness. In line with the law, we need to take care not to inappropriately extend its reach. This is not best done by giving the Secretary of State the power to define sex as biological in all cases through secondary legislation without appropriate scrutiny, given the potential impact on people’s human rights, privacy and dignity, and the potential to create legal uncertainty. Likewise, giving the Secretary of State a role in reviewing how other public authorities process sex data in all circumstances based on that definition would be inappropriate and disproportionate, and I note that the Supreme Court’s ruling relates specifically to the meaning of sex in equalities legislation.
The driver behind these amendments has been the importance of sex data being accurate when processed by public authorities. I strongly agree with that aim: accurate data is essential. This Government take data accuracy—including the existing legislation that requires personal data to be accurate—and data standards seriously. That is why we are addressing the question of sex information in public sector data. First, the EHRC is updating its statutory code of practice to support service providers in light of the Supreme Court judgment. Secondly, the Data Standards Authority is developing data standards on the monitoring of diversity information, including sex and gender data, and the effect of the Supreme Court judgment will be considered as part of that work.
Thirdly, the Office for Statistics Regulation published updated guidance on collecting and reporting data and statistics about sex and gender identity data last year. Fourthly, the Office for National Statistics published a work plan in December 2024 for developing harmonised standards on data more generally. Finally, the department is currently considering the implementation of the Sullivan review, published this year, which I welcome.
On digital verification services, I reassure noble Lords that these measures do not change the evidence that individuals rely on to prove things about themselves. The measures simply enable that to be done digitally. This Government are clear that data must be accurate for the purpose for which it is being used and must not be misleading. It should be clear to digital verification services what the information public authorities are sharing with them means. I will give an important example. If an organisation needs to know a person’s biological sex, this Government are clear that a check cannot be made against passport data, as it does not capture biological sex. DVS could only verify biological sex using data that records that attribute specifically, not data that records sex or gender more widely.
I know this is a concern of the noble Lord, Lord Arbuthnot, and I hope this provides some reassurance. The data accuracy principle of GDPR is part of existing law. That includes where data is misleading—this is a point I will return to. I hope that noble Lords find this commitment reassuring and, as such, will agree with Commons Amendment 32.
Motion 34A, with Amendments 34B and 34C, addresses the security of the national underground asset register. Security has always been at the heart of the national underground asset register. We have therefore listened to the well-thought-through concerns that prompted the amendment previously tabled by the noble Viscount, Lord Camrose, regarding cybersecurity. Following consideration, the Government are instead proposing an amendment we have drafted with the support of colleagues in the security services. We believe this addresses the intention of ensuring the security of the national underground asset register data, with three key improvements.
First, it broadens the scope from cybersecurity only to the general security of information kept in or obtained from the national underground asset register. This will ensure that front-end users have guidance on a range of measures for security good practice—for example, personnel vetting, which should be considered for implementation—while avoiding the need to publish NUAR-specific cybersecurity features that should not be in the public domain. Secondly, it specifies the audience for this guidance; namely, users accessing NUAR. Finally, it broadens the scope of the amendment to include Northern Ireland alongside England and Wales, consistent with the NUAR measures overall. Clearly, it remains the case that access to NUAR data can be approved for purposes only by eligible users, with all access controlled and auditable. As such, I hope that noble Lords will be content to support government Motion 34A and Amendments 34B and 34C.
Commons Amendment 43, made in the other place, on scientific research removes the public interest test inserted in the definition of scientific research by the noble Viscount, Lord Colville. While recognising the concern the noble Lord raises, I want to be clear that anything that does not count as scientific research now would not do so under the Bill. Indeed, we have tightened the requirement and added a reasonableness test. The Bill contains strong safeguards. Adding precise definitions in the Bill would not strengthen these protections but impose a significant, new legal obligation on our research community at a time when, in line with the good work of the previous Government, we are trying to reduce bureaucracy for researchers, not increase it with new processes. The test proposed will lead to burgeoning bureaucracy and damage our world-leading research. This disproportionate step would chill basic and curiosity-driven research, and is not one we can support.
I beg to move that the House agree with the Commons in their Amendment 1. I have spoken to the other amendments.
My Lords, I first thank the Minister for his—as ever—clear and compelling remarks. I thank all noble Lords who have been working in a collegiate, collaborative fashion to find a way forward on the few but important remaining points of disagreement with the Government.
Before I come to the issue of accurate recording of personal data, I also thank the Minister, the noble Baroness, Lady Jones, for tabling the government amendments on the national underground asset register and her constructive engagement throughout the progress of the Bill.
As noble Lords will recall, I set out our case for stronger statutory measures to require the Secretary of State to provide guidance to relevant stakeholders on the cybersecurity measures that should be in place before they receive information from the national underground asset register. I am of course delighted that the Government have responded to the arguments that we and others made and have now tabled their own version of my amendment which would require the Secretary of State to provide guidance on the security of this data. We are happy to support them in that.
I turn to Motions 32A and 52A standing in my name, which seek to ensure that data is recorded accurately. They amend the original amendment, which my noble friends Lord Lucas and Lord Arbuthnot took through your Lordships’ House. My noble friend Lord Lucas is sadly unable to attend the House today, but I am delighted to bring these Motions forward from the Opposition Front Bench. In the other place, the Conservative Front Bench tabled new Clause 21, which would, we feel, have delivered a conclusive resolution to the problem. Sadly, the Government resisted that amendment, and we are now limited by the scope of the amendments of my noble friend Lord Lucas, so we were unable to retable the, in my view, excellent amendment in your Lordships’ House.
Moved by
32A: Leave out from “House” to end and insert “do disagree with the Commons in their Amendment 32, and do propose Amendments 32B and 32C to the words so restored to the Bill—
I thank the Minister for his very able summing up of his position, but I am afraid I cannot get past the question in my mind of how existing legacy data, even if it is managed by a DVS system going forward, will suddenly be of high quality when it is currently, as we know from the Sullivan report, in a muddle. As a result, for all his eloquence, I beg leave to test the opinion of the House.
My Lords, I thank the Minister for setting out the Government’s case so clearly. I will speak to my Amendment 46A, which seeks to improve the report that the Government brought forward in the other place. This issue is causing real concern for copyright owners and so many others in the creative industries. Let us remind ourselves that the creative industries contributed £124 billion in gross value added to the UK economy in 2023 and outperformed the UK economy between 2010 and 2023 in terms of growth. The Government are, wisely and rightly, prioritising growth over other concerns, and the creative industries will have to be an essential part of this—but only to the extent that they have a trusted and efficient marketplace for intellectual property.
Our amendment would improve the Government’s proposed report by adding consideration of extraterritorial use of creators’ copyright works by operators of web crawlers and AI systems, as well as consideration of establishing a digital watermark for the purposes of identifying licensed content. I very much take on board the Minister’s point that this must be international to work, but few countries, if any, would have better or greater convening power to initiate the process of creating such digital standards. I urge the Government to pursue that avenue.
I pay tribute to all noble Lords who have raised the issue of copyright during the passage of this Bill. I am sure that I will be joining many others in thanking the noble Baroness, Lady Kidron, who has led such a powerful and successful campaign on this issue. Throughout the passage of the Bill, we have recognised the serious concerns raised by the creative sector and, on Report, we tabled an amendment seeking to create a digital watermark to identify this content and to protect copyright owners. I am very pleased that the Government have taken the first step by amending the Bill in the other place to put a report in it. That being said, the report needs to go further. If the Government are unwilling to accept our changes, I will test the opinion of the House when my amendment is called.
I turn briefly to Motion 49A, in the name of the noble Baroness, Lady Kidron. I once again pay tribute to the work that she has done to make progress on this. While we had concerns about the drafting of her amendment on Report, I am very pleased that she has tabled her Amendment 49B today. With the additional parts of it targeted at supporting small businesses and micro-entities, we are delighted to support it. It is increasingly clear that the Government must do the right thing for our creative industries, and we are delighted to offer our support to Motion 49A. I intend to test the opinion of the House on Amendment 46A when it is called.
My Lords, I will speak to my Motion 49A and offer my support to Amendment 46A in the name of the noble Viscount, Lord Camrose. It is a sensible amendment and I hope that the Government find a way to accept it without challenge.
I start by rebutting three assertions that have been circling over the past few weeks. First, I reject the notion that those of us who have raised our voices against government plans are against technology. I quote the Secretary of State, Peter Kyle, who I am delighted to see is below the Bar this afternoon. He said to the FT that:
“Just as in every other time there is change in society, there will be some people who will either resist change or try to make change too difficult to deliver”.
Well, creative people are early adopters of technology. Their minds are curious and their practices innovative. In my former career as a film director, I watched the UK film industry transform from working on celluloid to being a world-leading centre of digital production. For the past five years at Oxford’s Institute for Ethics in AI, where I am an advisor, I have been delighted to watch the leaps and bounds of AI development. Those at the frontier of AI development are creative thinkers, and creative people are natural innovators. The Government’s attempt to divide us is wrong.
The transformational impact of technology is something that all the signatories of this weekend’s letter to the Prime Minister understand. Creators do not deny the creative and economic value of AI, but we do deny the assertion that we should have to build AI for free with our work and then rent it back from those who stole it. Ours is not an argument about progress but about value. The AI companies fiercely defend their own IP but deny the value of our work. Not everything new is progress, not everything that already exists is without value, but we, the creative industries, embody both change and tradition, and we reject the assertion that we are standing in the way of change. We are merely asserting our right to continue to exist and play our part in the UK’s future growth.
Secondly, there is no confusion about copyright law in relation to AI, nor does the phenomenal number of submissions to the consultation prove anything other than the widespread outrage of the creative industries that the Government sought to redefine theft rather than uphold their property rights. In our last debate, my noble and learned friend Lady Butler-Sloss made an unequivocal statement to that effect which has been widely supported by other legal opinion. The Government’s spokesman, who has greeted every press inquiry of the last few weeks by saying that the Government are consulting to sort out the confusion in copyright in relation to AI, is, at best, misinformed. Let me be clear: the amendment would not change copyright. We do not need to change copyright law. We need transparency so that we can enforce copyright law, because what you cannot see you cannot enforce.
Thirdly, I rebut the idea that this is the wrong Bill and the wrong time. AI did not exist in the public realm until the early 2020s. The speed and scale at which copyright works are being stolen is eye-watering. Property that people have invested in, have created, have traded and that they rely on for their livelihood is being stolen at all parts of the value chain. It is an assault on the British economy, happening at scale to a sector worth £120 billion to the UK, an industry that is central to the industrial strategy and of enormous cultural import. It is happening now, and we have not even begun to catch up with the devastating consequences. The Government have taken our amendments out of the Bill and replaced them with a couple of toothless reports. Whatever these reports bring forward and whatever the consultation offers, we need the amendment in front of us today now. If this Bill does not protect copyright then, by the time that the Government work out their policy, there will be little to save.
The language of AI—scraping, training, data modules, LLMs—does not evoke the full picture of what is being done. AI corporations, many of which are seeking to entrench their existing information monopolies, are not stealing nameless data. They are stealing some of the UK’s most valuable cultural and economic assets—Harry Potter, the entire back catalogue of every music publisher in the UK, the voice of Hugh Grant, the design of an iconic handbag and the IP of our universities, great museums and library collections. Even the news is stolen in real time, all without payment, with economic benefits being taken offshore. It costs UK corporations and individuals their hard-earned wealth and the Treasury much needed revenue. It also denudes the opportunities of the next generation because, whether you are a corporation or an individual, if work is stolen at every turn, you cannot survive. The time is now, and this Bill is the vehicle.
Motion 49A replaces the previous package of Lords amendments. I pay tribute to the noble Lord, Lord Stevenson, who wishes he could be with us; the noble Lord, Lord Clement-Jones, and his colleagues, who have been uncompromising in their support; and my noble friend Lord Freyberg, who were all co-sponsors of the original amendment.
Amendment 49B would simply provide that a copyright holder be able to see who took their work, what was taken, when and why, allowing them a reasonable route to assert their moral right to determine whether they wish to have their work used, and if so, on what terms. It is a slimmer version of the previous package of amendments, but it covers the same ground and, importantly, it puts a timeline of 12 months on bringing forward these provisions and makes specific provision for SMEs and micro-entities and for UK-headquartered AI companies.
I thank the Minister for her full and detailed answer. Having heard the tone of the debate, I think it is clear that the focus and energy of the House are more on the amendment from the noble Baroness, Lady Kidron, but I am happy to take up the Minister’s offer of a further meeting.
52A: Leave out from “House” to end and insert “do disagree with the Commons in their Amendment 52, and do propose Amendments 52B and 52C to the words so restored to the Bill—
A little time has elapsed since the original debate, but I beg leave to test the opinion of the House.
(2 months, 1 week ago)
Grand Chamber
My Lords, I thank the Minister for her introduction to this draft statutory instrument; it was brief and to the point. These penalties will be able to reach 10% of turnover or £100,000 per day for continuing breaches, so getting the calculations right is crucial. However, I have some concerns about the SI, the first of which is about timing.
I do not understand why we are looking at a three-year gap between the enabling powers and the calculation rules. The Telecommunications (Security) Act 2021, which I worked on, was presented to this House as urgent legislation to protect critical national infrastructure, yet here we are, in 2025, only now establishing how to calculate penalties for breaches in the way set out in this SI. During this period, we have had enforcement powers without the ability to properly determine penalties. As I understand it, tier 1 providers had to comply by March 2024, yet the penalty calculation mechanism will not be in place until this year—no doubt in a few weeks’ time.
Secondly, there is the absence of consultation. The Explanatory Memorandum cites the reason as the SI’s “technical nature”, but these penalties—I mentioned their size—could have major financial implications for providers. The telecoms industry has complex business structures and revenue streams. Technical expertise from the industry could have helped to ensure that these calculations are practical and comprehensive. The technical justification seems remarkably weak, given the impact these rules could have. For example, the current definition of “relevant business” for these calculations focuses on traditional network and service provision, but modern telecoms companies often have diverse revenue streams. There is no clear provision for new business models or technologies. How will we handle integrated service providers? What about international revenues? The treatment of associated services needs clarification.
Thirdly, the implementation sequence is an issue. We are being asked to approve penalty calculations before seeing the enforcement guidelines. There is no impact assessment, so we cannot evaluate potential consequences. I understand that the post-implementation review is not scheduled until 2026, and there is no clear mechanism for adjusting the framework if problems emerge. The interaction with the existing penalty regime needs clarification.
There are also technical concerns that need some attention. The switch from “notified provider” to “person” in the 2003 order, as a result of this SI, needs rather more explanation. The calculation method for continuing breaches is not fully detailed, there is no specific provision for group companies or complex corporate structures and the treatment of joint ventures and partnerships remains unclear.
Finally, I hope that, in broad terms, the Minister can give us an update on progress on the removal of equipment covered by the Telecommunications (Security) Act 2021. That was mandated by the Act; I know it is under way but it is not yet complete.
This is not merely about technical calculations but about creating an effective deterrent for the telecoms industry, while ensuring fair and practical enforcement of important security measures. Getting these rules right is essential for both national security and our telecoms sector. I look forward to the Minister’s response on these points.
My Lords, I thank the Minister for bringing this important SI forward today and for setting it out so clearly and briefly. I also thank the noble Lord, Lord Clement-Jones. He made a range of interesting points: in particular, the point on timing was well made, and I look forward to hearing the Minister’s answers on that. This instrument seeks to implement provisions relating to the enforcement of designated vendor directions—DVDs—which form part of the broader framework established under the Telecommunications (Security) Act 2021. That Act, introduced under the previous Government, was designed to strengthen the security and resilience of the UK’s telecommunications networks, particularly in response to emerging national security risks.
We all know only too well that one of the most prominent issues at the forefront of this framework has been the removal of high-risk vendors, such as Huawei, from UK telecommunications infrastructure. Huawei’s involvement in the UK’s 5G rollout has long been a point of debate, with growing concerns about national security risks tied to its equipment. This SI therefore provides a mechanism for enforcing the penalties that may be applied to public communications providers —PCPs—that fail to comply with the DVDs to ensure that the UK’s telecommunications infrastructure remains secure from undue foreign influence.
The primary change introduced by this SI is the formalisation of the penalties regime for public communications providers that fail to comply with the conditions outlined in DVDs. It establishes a framework for calculating and enforcing penalties that may be imposed by the Secretary of State. The Secretary of State retains discretion in imposing penalties, but they must be applied in a proportionate manner. In considering penalties, the severity of the breach, the culpability of the provider and the broader implications for the sector must all be taken into account. The aim is to ensure compliance with DVDs while protecting the integrity of the UK’s national infrastructure.
However, while the objectives of this instrument are understood, this debate offers a good opportunity to scrutinise some of the specifics a little, particularly with regard to the proportionality of penalties and the potential economic consequences for the sector. It is with that in mind that I shall raise questions in just three areas regarding the provisions set out in this instrument.
First, the SI grants the Secretary of State significant discretion in the imposition of penalties. Of course, we recognise the value of flexibility here, but there is legitimate concern that this discretion may result in inconsistent enforcement across different public communications providers. Can the Minister assure us that transparency and accountability will be maintained throughout this process? How will the Government ensure that the application of penalties is fair and consistent, particularly when considering the varying size and scope of telecoms providers?
Further to this, can the Minister clarify how the penalties will be calculated? I echo the questions asked by the noble Lord, Lord Clement-Jones, particularly in cases where a breach does not pose an immediate or severe national security threat. Do the Government anticipate that penalties will be tiered with lesser fines for breaches that do not substantially compromise national security? Can the Minister further explain how such decisions will be communicated to the public and to industry to ensure transparency?
Secondly, providers are required to remove Huawei equipment from the UK’s 5G networks by 2027. This is, of course, a significant and costly task for telecom providers. Given these financial challenges, will the penalties for non-compliance take into account the costs already incurred by providers in replacing Huawei’s technology? Will the penalties be adjusted to reflect the substantial financial burden that these providers are already facing in removing Huawei equipment from their networks? Thirdly, where PCPs have been issued with a DVD, this can be a long and demanding process. How are the Government going to keep track of progress? What progress reports can be shared with Parliament and the public?
Is the Minister confident that the 2027 deadline will be met; that no vendor, purchaser or telecoms company will be caught by the Act; that no fines will be levied; and that what we are talking about today is, therefore, entirely theoretical?
While the Minister is working on her answer, perhaps she could include in that something about how progress against the delivery of these objectives will be reported to Parliament and, potentially, to the public.
(3 months, 1 week ago)
Lords Chamber
The noble Earl is right, and we are trying to find a way to ensure that those rights are upheld. However, all these sectors need to grow in our economy. As I was just explaining, the creative sector uses AI, so it is not as simple as an “us and them” situation. AI is increasingly being used by all sectors across our economy. We need to find a way through this that rewards creators in the way that the noble Earl has outlined, which I think we all understand.
My Lords, I recognise of course that the task of analysing the results of the consultation still needs to go ahead. That said, does the Minister agree with us that digital watermarking is going to be a key component of the solution to the AI and copyright issue? If so, what does she make of the number of digital watermarking solutions that are now coming to market? In her view, is this to be welcomed or should we be pursuing a single standard for digital watermarks?
The noble Viscount has made an important point about watermarks, and that is certainly one solution that we are considering. The issue of transparency is crucial to the outcome of this issue, and watermarks would certainly help with that. I do not have a view as yet on whether we should have one or many, but I am hoping that the consultation will give us some guidance on that.
(5 months, 2 weeks ago)
Grand Committee
My Lords, I very much support the thrust of these amendments and what the noble Lord, Lord Knight, said in support of and in addition to them. I declare an interest as a current user of the national pupil database.
The proper codification of safeguards would be a huge help. As the noble Baroness, Lady Kidron, said, it would give us a foundation on which to build. I hope that, if they are going to go in this direction, the Government will take an immediate opportunity to do so because what we have here, albeit much more disorganised, is a data resource equivalent to what we have for the National Health Service. If we used all the data on children that these systems generate, we would find it much easier to know what works and in what circumstances, as well as how to keep improving our education system.
The fact that this data is tucked away in little silos—it is not shared and is not something that can be used on a national basis—is a great pity. If we have a national code as to how this data is handled, we enable something like the use of educational data in the way that the NHS proposes to use health data. Safeguards are needed on that level but the Government have a huge opportunity; I very much hope that it is one they will take.
I start by thanking all noble Lords who spoke; I enjoyed the vivid examples that were shared by so many of them. I particularly enjoyed the comment from the noble Lord, Lord Russell, about the huge gulf in difference between guidance, of which there is far too much, and a code that actually drives matters forward.
I will speak much more briefly because this ground has been well covered already. Both the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, seek to introduce codes of practice to protect the data of children in education services. Amendment 138 in the name of the noble Lord seeks to introduce a code on processing personal data in education. This includes consultation for the creation of such a code—a highly important element because the safety of this data, as well as its eventual usage, is of course paramount. Amendment 141 in the name of the noble Baroness, Lady Kidron, also seeks to set out a code of practice to provide heightened protections for children in education.
Those amendments are absolutely right to include consultation. It is a particularly important area of legislation. It is important that it does not restrict what schools can do with their data in order to improve the quality and productivity of their work. I was very appreciative of the words of the noble Lord, Lord Knight, when he sketched out some of the possibilities of what becomes educationally possible when these techs are wisely and safely used. With individual schools often responsible for the selection of technologies and their procurement, the landscape is—at the risk of understatement—often more complex than we would wish.
Alongside that, the importance of the AI Safety Institute’s role in consultation cannot be overstated. The way in which tech and AI have developed in recent years means that its expertise on how safely to provide AI to this particularly vulnerable group is invaluable.
I very much welcome the emphasis that these amendments place on protecting children’s data, particularly in the realm of education services. Schools are a safe place. That safety being jeopardised by the rapid evolution of technology that the law cannot keep pace with would, I think we can all agree, be unthinkable. As such, I hope that the Government will give careful consideration to the points raised as we move on to Report.
My Lords, I rise to make a brief but emphatic comment from the health constituency. We in the NHS have been victims of appalling cyber-hacking. The pathology labs in south London were hacked and that cost many lives. It is an example of where the world is going in the future unless we act promptly. The emphatic call for quick action so that government keeps up with world changes is really well made. I ask the Minister to reflect on that.
My Lords, I, too, shall speak very briefly, which will save valuable minutes in which I can order my CyberUp Christmas mug.
Amendments 156A and 156B add to the definition of unauthorised access, so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for the access, and that person is not empowered to access it by an enactment. Amendment 156B introduces defences to this new charge. Given the amount of valuable personal data held by controllers, as our lives have moved increasingly online—as many speakers in this debate have vividly brought out—there is absolutely clear merit not just in this idea but in the pace implied, which many noble Lords have called for. There is a need for real urgency here, and I look forward to hearing more detail from the Minister.
My Lords, I turn to Amendments 156A and 156B, tabled by the noble Lord, Lord Holmes. I understand the strength of feeling and the need to provide legal protections for legitimate cybersecurity activities. I agree with the noble Lord that the UK should have the right legislative framework to allow us to tackle the harms posed by cybercriminals. We have heard examples of some of those threats this afternoon.
I reassure the noble Lord that this Government are committed to ensuring that the Computer Misuse Act remains up to date and effective in tackling criminality. We will continue to work with the cybersecurity industry, the National Cyber Security Centre and law enforcement agencies to consider whether there are workable proposals on this. The noble Lord will know that this is a complex and ongoing issue being considered as part of the review of the Computer Misuse Act being carried out by the Home Office. We are considering improved defences by engaging extensively with the cybersecurity industry, law enforcement agencies, prosecutors and system owners. However, engagement to date has not produced a consensus on the issue, even within the industry, and that is holding us back at this moment—but we are absolutely determined to move forward with this and to reach a consensus on the way forward.
I think the noble Lord, Lord Clement-Jones, said in the previous debate that the amendments were premature, and here that is certainly the case. The specific amendments that the noble Lord has tabled are premature, because we need a stronger consensus on the way forward, notwithstanding all the good reasons that noble Lords have given for why it is important that we have updated legislation. With these concerns and reasons in mind, I hope that the noble Lord will feel able to withdraw his amendment.
My Lords, the trouble with this House is that some have long memories. The noble Earl, Lord Erroll, reminded us all to look back, with real regret, at the Digital Economy Act and the failure to implement Part 3. I think that that was a misstep by the previous Government.
Like all of us, I warmly welcome the inclusion of data access provisions for researchers studying online safety matters in Clause 123 of the Bill. As we heard from the noble Baroness, Lady Kidron, and the noble Lord, Lord Knight, this was very much unfinished business from the Online Safety Act. However, I believe that, in order for the Bill to be effective and have the desired effect, the Government need to accept the amendments in the names of the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell. In terms of timeframe, the width of research possible, enforceability, contractual elements and location, they cover the bases extremely effectively.
The point was made extremely well by the noble Lords, Lord Bethell and Lord Russell, that we should not have to rely on brave whistleblowers such as Frances Haugen. We should be able to benefit from quality researchers, whether from academia or elsewhere, in order to carry out this important work.
My Amendment 198B is intended as a probing amendment about the definition of researchers under Clause 123, which has to be carefully drawn to allow for legitimate non-governmental organisations, academics and so on, but not so widely that it can be exploited by bad actors. For example, we do not want those who seek to identify potential exploits in a platform to shield themselves simply by describing themselves as “independent researchers”. For instance, could Tommy Robinson seek to protect himself from liabilities in this way? After all, he called himself an “independent journalist” in another context when he clearly was not. I hope that when the Government come to draw up the regulations they will be mindful of the need to be very clear about what constitutes an independent or accredited researcher, or whatever phrase will be used in the context.
My Lords, although I have no amendments in this group, I will comment on some of them. I might jump around the order, so please forgive me for that.
Amendment 197 would change Clause 123 so that the Secretary of State must, as soon as reasonably practicable and no later than 12 months after the Act is passed, make regulations requiring regulated services to provide information for the purposes of research into online safety. This is clearly sensible. It would ensure that valuable research into online safety may commence as soon as possible, which would benefit us all, as speakers have made abundantly clear. To that end, Amendment 198D, which would ensure that researcher access is enforceable in the same way as other requirements under the Online Safety Act, would ensure that researchers can access valuable information and carry out their beneficial research.
I am still left with some curiosity on some of these amendments, so I will indicate where I have specific questions to those who have tabled them and hope they will forgive me if I ask to have a word with them between now and Report, which would be very helpful. In that spirit, I turn to Amendment 198B, which would allow the Secretary of State to define the term “independent researcher”. I ask the noble Lord, Lord Clement-Jones, who tabled the amendment, whether he envisages the Secretary of State taking advice before making such regulations and, if so, from whom and through what mechanism. I recognise that it is a probing amendment, but I would be keen to understand more.
I am also keen to understand further from my noble friend Lord Bethell and the noble Baroness, Lady Kidron, why, under Amendment 198A, the Secretary of State would not be able to make regulations providing for independent research into the “enforcement of requirements” under these regulations. Again, I look forward to discussing that with them.
I have some concerns about Amendment 198, which would require service providers to give information pertaining to age, stage of development, gender, race, ethnicity, disability and sexuality to researchers. I understand the importance of this but my concern is that it would require the disclosure of special category data to those researchers. I express reservations, especially if the data pertains to children. Do we have the right safeguards in place to address the obviously heightened risks here?
Additionally, I have some concerns about the provisions suggested in Amendment 198E. Should we allow researchers from outside the United Kingdom to require access to information from regulated service providers? Could this result in data being transferred into jurisdictions where there are less stringent data protection laws?
My Lords, I thank noble Lords who have welcomed the provisions in the Bill. I very much appreciate that we have taken on board the concerns that were raised in the debates on the previous legislation. I thank the noble Baroness, Lady Kidron, and the noble Lords, Lord Bethell and Lord Clement-Jones, for their amendments.
I will speak first to Amendment 197, tabled by the noble Baroness, Lady Kidron, which would compel the Secretary of State to create a framework and to do so within 12 months of passage. I understand and share her desire to ensure that a framework allowing researchers access is installed and done promptly. This is precisely why we brought forward this provision. I reassure her that the department will consult on the framework as soon as possible after the publication of Ofcom’s report.
Turning to Amendments 198 and 198B, tabled by the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones, respectively, Clause 123 provides the Secretary of State with the power to make regulations relating to researchers’ access to data. I can reassure noble Lords that it does not limit the regulations to the non-exhaustive list of examples provided. I agree that fair and proportionate criteria for who is considered a researcher are critical to the success of the future framework. I reassure noble Lords that in the provision as currently written the Secretary of State can include in the design of the framework the specific requirements that a person must meet to be considered a researcher.
Turning to Amendments 198A and 198D, tabled by the noble Lord, Lord Bethell, while I am sympathetic to his desire to provide a future framework with the robust enforcement powers of the OSA, I assure him that as the provision is written, the Secretary of State can already use the existing enforcement powers of the OSA to support a future framework. Furthermore, should the evidence suggest that additional or different measures would be more effective and appropriate, this provision allows the Secretary of State the flexibility to introduce them.
Turning next to Amendments 198C and 198E, tabled by the noble Lord, Lord Bethell, I understand the spirit of these amendments and note the importance of this issue, given the global nature of the online world. It is entirely reasonable to allow researchers who are not based in the UK to utilise our researcher access framework, as long as the subject of their research is the experience of UK users online. I reassure him that the provisions as drafted already allow the Secretary of State to make regulations permitting non-UK-based researchers to use the framework where appropriate. We plan to use the evidence gathered through our own means and through Ofcom’s report to set out who will be eligible to use the framework in the secondary legislation.
Finally, turning to Amendment 198F, I am aware of the concern that researchers have encountered blockages to conducting research and I am sympathetic to the intentions behind the amendment. We must ensure that researchers can use the future framework without fear of legal action or other consequences. I am conscious that the noble Baroness, Lady Kidron, asked me a specific question about legal exemptions and I will write to her to make that answer much clearer. I reassure noble Lords that the Government are considering the specific issues that the noble Lord raises. For these reasons, I ask that the amendments not be pressed while the Government consider these issues further and I am of course happy to engage with noble Lords in the meantime.
My Lords, the UK is a world leader in genomics research. This research will no doubt result in many benefits, particularly in the healthcare space. However, genomics data can be, and increasingly is, exploited for deeply concerning purposes, including geostrategic ones.
Western intelligence agencies are reportedly becoming increasingly concerned about China using genomic data and biotechnology for military purposes. The Chinese Government have made it clear that genomics plays a key part in their military-civil fusion doctrine. The 13th five-year plan for military-civil fusion calls for the cross-pollination of military and civilian technology such as biotechnology. This statement, taken in conjunction with reports that the Beijing Genomics Institute—the BGI—in collaboration with the People’s Liberation Army, is looking to make ethnically Han Chinese soldiers less susceptible to altitude sickness, makes for worrying reading. Genetically engineered soldiers appear to be moving out of fiction and towards reality.
The global genomics industry has grown substantially as a result of the Covid-19 pandemic, and gene giant BGI Group and its affiliated MGI Tech have acquired large databases of DNA. Further, I note that BGI has widespread links to the Chinese state. It operates the Chinese Government’s key laboratories and national gene bank, itself a vast repository of DNA data drawn from all over the world. A Reuters investigation found that a prenatal test, NIFTY, sold by BGI to expectant mothers, gathered millions of women’s DNA data. This prenatal test was developed in collaboration with the Chinese military.
For these reasons, I think we must become far more protective of genomic data gathered from our population. While many researchers use genomic data to find cures for terrible diseases, many others, I am afraid, would use it to do us harm. To this end, I have tabled Amendment 199 to require the Secretary of State and the Information Commissioner to conduct frequent risk assessments on data privacy associated with genomics and DNA companies headquartered in countries that are systemic competitors or hostile actors. I believe this will go some way towards preventing the transfer of genomic data out of the UK to countries such as China that may use it for military purposes. I beg to move.
My Lords, I strongly support this amendment. As a former Minister, I was at the front line of genomic data and know how powerful it currently is and can be in the future. Having discussed this with the UK Biobank, I know that the issue of who stores and processes genomic data in the UK is a subject of huge and grave concern. I emphasise that the American Government have moved on this issue already and emphatically. There is the possibility that we will be left behind in global standards and will one day be an outlier if we do not close this important and strategically delicate loophole. For that reason, I strongly support this amendment.
My Lords, I thank the noble Viscount, Lord Camrose, for moving this amendment, which raises this important question about our genomics databases, and for the disturbing examples that he has drawn to our attention. He is right that the opportunities from harnessing genomic data come with very real risks. This is why the Government have continued the important work of the UK Biological Security Strategy of 2023, including by conducting a full risk assessment and providing updated guidance to reduce the risks from the misuse of sensitive data. We plan to brief the Joint Committee on the National Security Strategy on the findings of the risk assessment in the new year. Following that, I look forward to engaging with the noble Viscount on its outcome and on how we intend to take these issues forward. As he says, this is a vital issue, but in the meantime I hope he is prepared to withdraw his amendment.
I thank the Minister for her answer, and I very much accept her offer of engagement. I will make a few further brief comments about the importance of this amendment, as we go forward. I hope that other noble Lords will consider it carefully before Report.
I will set out a few reasons why I believe this amendment can benefit both the Bill and this country. The first is its scope. The amendment will allow the Secretary of State and the Information Commissioner to assess data security risks across the entirety of the genomic sector, covering consumers, businesses, citizens and researchers who may be partnering with state-linked genomics companies.
The second reason is urgency. DNA is regularly described as the “new gold” and it represents our most permanent identifier, revealing physical and mental characteristics, family medical history and susceptibility to diseases. Once it has been accessed, the damage from potential misuse cannot be reversed, and this places a premium on proactively scrutinising the potential risks to this data.
Thirdly, there are opportunities for global leadership. This amendment offers the UK an opportunity to take a world-leading role and become the first European country to take authoritative action to scrutinise data vulnerabilities in this area of critical technology. Scrutinising risks to UK genomic data security also provides a foundation to foster domestic genomics companies and solutions.
Fourthly, this amendment would align the UK with key security partners, particularly, as my noble friend Lord Bethell mentioned, the United States, which has already blacklisted certain genomics companies linked to China and taken steps to protect American citizens’ DNA from potential misuse.
The fifth and final reason is protection of citizens and consumers. This amendment would provide greater guidance and transparency to citizens and consumers whose DNA data is exposed to entities linked to systemic competitors. With all of that said, I thank noble Lords for their consideration and beg leave to withdraw my amendment.
My Lords, we have had some powerful speeches in this group, not least from the noble Baronesses, Lady Kidron and Lady Owen, who drafted important amendments that respond to the escalating harms caused by AI-generated sexual abuse material relating to children and adults. The amendment from the noble Baroness, Lady Kidron, would make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material. As she outlined and the noble Lord, Lord Bethell, confirmed, it specifically would become an offence to create, train or distribute generative AI models that enable the creation of computer-generated CSAM or priority illegal content; to train AI models on CSAM or priority illegal content; or to possess AI models that produce CSAM or priority illegal content.
This amendment responds to a growing problem, as we have heard, around computer-generated sexual abuse material and a gap in the law. There is a total lack of safeguards preventing bad actors creating sexual abuse imagery, and it is causing real harm. Sites enabling this abuse are offering tools to harm, humiliate, harass, coerce and cause reputational damage. Without robust legal frameworks, victims are left vulnerable while perpetrators operate with impunity.
The noble Lord, Lord Bethell, mentioned the Internet Watch Foundation. In its July report, One Step Ahead, it reported on the alarming rise of AI-generated CSAM. In October 2023, in How AI is Being Abused to Create Child Sexual Abuse Imagery, it made recommendations to the Government for legislation to strengthen legal frameworks to better address the evolving landscape of AI-generated CSAM and to enhance preventive measures against its creation and distribution. It specifically recommended:
“That the Government legislates to make it an offence to use personal data or digital information to create digital models or files that facilitate the creation of AI or computer-generated child sexual abuse material”.
The noble Baroness, Lady Kidron, tabled such an amendment to the previous Bill. As she said, she was successful in persuading the then Government to accept it; I very much hope that she will be as successful in persuading this Government to accept her amendment.
Amendments 211G and 211H in the name of the noble Baroness, Lady Owen, are a response to the extraordinary fact that one in 14 adults has experienced threats to share intimate images in England and Wales; that rises to one in seven among young women. Research from Internet Matters shows that 49% of young teenagers in the UK aged between 13 and 16—around 750,000 children—said that they were aware of a form of image-based abuse being perpetrated against another young person known to them.
We debated the first of the noble Baroness’s amendments, which is incorporated in her Bill, last Friday. I entirely agree with the noble Lord, Lord Knight; I did not find the Government’s response at all satisfactory. I hope that, in the short passage of time between then and now, they have had time to be at least a little agile, as he requested. UK law clearly does not effectively address non-consensual intimate images. It is currently illegal to share or threaten to share non-consensual intimate images, including deepfakes, but creating them is not yet illegal; this means that someone could create a deepfake image of another person without their consent and not face legal consequences as long as they do not share, or threaten to share, it.
This amendment is extremely welcome. It addresses the gap in the law by criminalising the creation of non-consensual intimate images, including deepfakes. It rightly targets deepfakes due to their rising prevalence and potential for harm, particularly towards women. Research shows that 98% of deepfake videos online are pornographic, with 99% featuring women and girls. This makes it an inherently sexist problem that is a new frontier of violence against women—words that I know the noble Baroness has used.
I also very much welcome the new amendment not contained in her Bill, responding to what the noble Baroness, Lady Gohir, said at its Second Reading last Friday about including audio deepfakes. The words “shut down every avenue”, which I think were used by the noble Baroness, Lady Gohir, are entirely apposite in these circumstances. Despite what the noble Lord, Lord Ponsonby, said on Friday, I hope that the Government will accept both these amendments and redeem their manifesto pledge to ban the creation of sexually explicit deepfakes, whether audio or video.
My Lords, the current law does not sufficiently protect children from AI-driven CSAM because it is simply such a fast-moving issue. It is a sobering thought that, of all the many wonderful developments of AI that many of us have been predicting and speculating on for so long, CSAM is really driving the technology forward. What a depressing reflection that is.
Overall, AI is developing at an extraordinarily rapid pace and has come with a number of concerning consequences that are not all yet fully understood. However, it is understood that child sexual abuse is completely unacceptable in any and all contexts, and it is right that our law should be updated to reflect the dangers that have increased alongside AI development.
Amendment 203 seeks to create a specific offence for using personal data or digital information to create or facilitate the creation of computer-generated child sexual abuse material. Although legislation is in place to address possessing or distributing such horrendous material, we must prioritise the safety of children in this country and take the law a step further to prevent its creation. Our children must be kept safe and, subject to one reservation, which I will come to in a second, I support the amendment from the noble Baroness, Lady Kidron, to further protect them.
That reservation comes in proposed new subsection (1)(c), which includes in the offence the act of collating files that, when combined, enable the creation of sexual abuse material. This is too broad. A great deal of the collation of such material can be conducted by innocent people using innocent materials that are then corrupted or given more poisonous aspects by further training, fine-tuning or combination with other materials by more malign actors. I hope there is a way we can refine this proposed new paragraph on that basis.
Unfortunately, adults can also be the targets of individuals who use AI to digitally generate non-consensual explicit images or audio files of an individual, using their likeness and personal data. I am really pleased that my noble friend Lady Owen tabled Amendments 211G and 211H to create offences for these unacceptable, cruel acts. I support these amendments unambiguously.
My Lords, I thank the noble Baroness, Lady Kidron, for her Amendment 203. It goes without saying that the Government treat all child sexual abuse material with the utmost seriousness. I can therefore confirm to her and the Committee that the Government will bring forward legislative measures to address the issue in this Session and that the Home Office will make an announcement on this early in the new year.
On Amendments 211G and 211H, tabled by the noble Baroness, Lady Owen, the Government share concerns that more needs to be done to protect women from deepfake image abuse. This is why the Government committed in their manifesto to criminalise the creation of sexually explicit deepfake images of adults. I reassure the noble Baroness and the whole Committee that we will deliver on our manifesto commitment in this Session. The Government are fully committed to protecting the victims of tech-enabled sexual abuse. Tackling intimate audio would be a new area of law, but we continue to keep that legislation under review.
I also say to the noble Baroness that there is already a process under Section 153 of the Sentencing Act 2020 for the court to deprive a convicted offender of property, including images that have been used for the purpose of committing or facilitating any criminal offence. As well as images, that includes computers and mobile phones that the offender either used to commit intimate image offences or intended to use for that purpose in future. For those reasons and the reassurances I have given today, I hope that noble Lords will feel able to withdraw or not press their amendments.
(5 months, 2 weeks ago)
Grand Committee
My Lords, the debate on this group emphasises how far behind the curve we are, whether it is by including new provisions in this Bill or by bringing forward an AI Bill—which, after all, was promised in the Government’s manifesto. It shows that we are not moving nearly fast enough in thinking about the implications of AI. Before going further, I need to declare an interest as co-chair of the All-Party Parliamentary Group on AI and a consultant to DLA Piper on AI policy and regulation.
I have followed the progress of AI since 2016 in the capacity of co-chair of the all-party group and chair of the AI Select Committee. We need to move much faster on a whole range of different issues. I very much hope that the noble Lord, Lord Vallance, will be here on Wednesday, when we discuss our crawler amendments, because although the noble Lord, Lord Holmes, has tabled Amendment 211A, which deals with personality rights, there is also extreme concern about the whole area of copyright. I was tipped off by the noble Lord, Lord Stevenson, so I was slightly surprised that he did not draw our attention to it: we are clearly due the consultation on intellectual property at any moment, but there seems to be some proposal within it for personality rights themselves. Whether that is a quid pro quo for a much-weakened situation on text and data mining, I do not know, but something appears to be moving out there which may become clear later this week. It seems a strange time to issue a consultation, but I recognise that it has been somewhat delayed.
In the meantime, we are forced to put forward amendments to this Bill trying to anticipate some of the issues that artificial intelligence is increasingly giving rise to. I strongly support Amendments 92, 93, 101 and 105 put forward by the noble Viscount, Lord Colville, to prevent misuse of Clause 77 by generative AI developers; I very much support the noble Lord, Lord Holmes, in wanting to see protection for image, likeness and personality; and I very much hope that we will get a positive response from the Minister in that respect.
We have heard from the noble Baronesses, Lady Kidron and Lady Harding, and the noble Lords, Lord Russell and Lord Stevenson, all of whom have made powerful speeches on previous Bills—the then Online Safety Bill and the Data Protection and Digital Information Bill—to say that children should have special protection in data protection law. As the noble Baroness, Lady Kidron, says, we need to move on from the AADC. That was a triumph she gained during the passage of the Data Protection Act 2018, but six years later the world looks very different and young people need protection from AI models of the kind she has set out in Amendment 137. I agree with the noble Lord, Lord Stevenson, that we need to talk these things through. If it produces an amendment to this Bill that is agreed, all well and good, but it could mean an amendment or part of a new AI Bill when that comes forward. Either way, we need to think constructively in this area because protection of children in the face of generative AI models, in particular, is extremely important.
This group is extremely important: it looks ahead to further harms that AI could cause and to how we can mitigate them in a number of different ways, despite the fact that these amendments appear to deal with quite a disparate set of issues.
My Lords, I too thank all noble Lords for their insightful contributions to this important group of amendments, even if some of them bemoaned the fact that they have had to repeat themselves over the course of several Bills. I am also very heartened to see how many people have joined us for Committee today. I have been involved in only two of these sittings, but this is certainly a record, and on present trends it is going to be standing room only, which is all to the good.
I have two observations before I start. First, we have to acknowledge that perhaps this area is among the most important we are going to discuss. The rights and protections of data subjects, particularly children, are in many ways the crux of all this and we have to get it right. Secondly, I absolutely take on board that there is a real appetite to get ahead on AI legislation. I have an amendment that I am very excited about later, when we come to ADM in particular, and there will be others as well, but I agree that we need to get going on that.
Amendment 92 in the names of the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, seeks to reduce the likelihood of the misuse of Clause 77 by AI model developers who may claim that they do not need to notify data subjects of reuse for scientific purposes under that clause. This relates to the way that personal data is typically collected and processed for AI development. Amendment 93 similarly seeks to reduce the possibility of such misuse, and Amendments 101 and 105 also address it. I strongly support the intent of the amendments from the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, in seeking to maintain and make provision for the rights and protections of data subjects, and I look forward very much to hearing the views of the Minister.
I turn to Amendment 137 in the names of the noble Lords, Lord Russell and Lord Stevenson, and the noble Baronesses, Lady Kidron and Lady Harding. This amendment would require the commissioner to prepare and produce a code of practice which ensures that data processors prioritise the interests, rights and freedoms of children. It goes without saying that the rights and protection of children are of utmost importance. Certainly, this amendment looks to me not only practical but proportionate, and I support it.
Finally, Amendment 211A in the name of my noble friend Lord Holmes ensures the prohibition of
“the development, deployment, marketing and sale of data related to an individual’s image, likeness or personality for AI training”
without that person’s consent. Like the other amendments in this group, this makes provision to strengthen the rights and protections of data subjects against the potential misuse or sale of data and seems entirely sensible. I am sure the Minister has listened carefully to all the concerns powerfully raised from all sides of the Committee today. It is so important that we do not lose sight of the importance of the rights and protection of data subjects.
My Lords, I welcome the amendments spoken to so well by the noble Baroness, Lady Harding, regarding the open electoral register. They are intended to provide legal certainty around the use of the register, without compromising on any aspect of the data privacy of UK citizens or risking data adequacy. The amendments specify that companies are exempt from the requirement to provide individuals with information in cases where their personal data has not been obtained directly from them if that data was obtained from the open electoral register. They also provide further clarification on what constitutes “disproportionate effort” under new paragraph 5(e) of Article 14 of GDPR.
The noble Baroness covered the ground so effectively that all I need to add is that the precedent established by the current interpretation by the tribunal will affect not only the open electoral register but other public sources of data, including the register of companies, the Registry of Judgments, Orders and Fines, the Land Registry and the Food Standards Agency register. Importantly, it may even prevent the important work being done to create a national data library achieving its objectives of public sector data sharing. It will have far-reaching implications if we do not change the Bill in the way that the noble Baroness has put forward.
I thank the noble Lord, Lord Lucas, for his support for Amendment 160. I reciprocate in supporting—or, at least, hoping that we get clarification as a result of—his Amendments 158 and 161.
Amendment 159B seeks to ban what are colloquially known as cookie paywalls. As can be seen, it is the diametric opposite of Amendment 159A, tabled by the noble Viscount, Lord Camrose. For some unaccountable reason, cookie paywalls require a person who accesses a website or app to pay a fee to refuse consent to cookies being accessed from or stored on their device. Some of these fees can be exorbitant, so I was rather surprised by the noble Viscount’s counter-amendment.
Earlier this year, the Information Commissioner launched a call for views which looked to obtain a range of views on its regulatory approach to consent or pay models under data protection law. The call for views highlighted that organisations that are looking to adopt, or have already adopted, a consent-or-pay model must consider the data protection implications.
Cookie paywalls are a scam and reduce people’s power to control their data. I wonder why someone must pay simply because they do not consent to cookies being stored or accessed. The PEC regulations do not currently prohibit cookie paywalls. The relevant regulation is Regulation 6, which is due to be substituted by Clause 111 and supplemented by new Schedule A1 to the PEC regulations, as inserted by Schedule 12 to the Bill. The regulation, as substituted by Clause 111 and Schedule 12, still does not prohibit cookie paywalls. This comes down to the detail of the regulations, both as they currently are and as they will be if the Bill remains as drafted: they are drafted in terms that do not prevent a person signifying lack of consent to cookies, and a provider may add or set controls—namely, by imposing requirements—for how a person may signify that lack of consent. Cookie paywalls would therefore remain completely legal, and they have certainly proliferated online.
This amendment makes it crystal clear that a provider must not require a person to pay a fee to signify lack of consent to their data being stored or accessed. This would mean that, in effect, cookie paywalls would be banned.
Amendment 160 is sought by the Advertising Association. It seeks to ensure that the technical storage of or access to information is considered necessary under paragraph 5 of the new Schedule A1 to the PEC regulations inserted by Schedule 12 if it would support measurement or verification of the performance of advertising services to allow website owners to charge for their advertising services more accurately. The Bill provides practical amendments to the PEC regulations through listing the types of cookies that no longer require consent.
This is important, as not all cookies should be treated the same and not all carry the same high-level risks to personal privacy. Some are integral to the service and the website itself and are extremely important for subscription-free content offered by publishers, which is principally funded by advertising. Introducing specific and targeted cookie exemptions has the benefit of, first, simplifying the cookie consent banner and, secondly, further increasing legal and economic certainty for online publishers. As I said when we debated the DPDI Bill, audience measurement is an important function for media owners to determine the consumption of content, to be able to price advertising space for advertisers. Such metrics are crucial to assess the effectiveness of a media channel. For sites that carry advertising, cookies are used to verify the delivery and performance of a digital advertisement—ie, confirmation that an ad has been served or presented to a user and whether it has been clicked on. This is essential information to invoice an advertiser accurately for the number of ad impressions in a digital ad campaign.
However, my reading of the Bill suggests that audience measurement cookies would be covered by the list of exemptions from consent under Schedule 12. Can the Government confirm this? Is it the Government’s intention to use secondary legislation in future to exempt ad performance cookies?
Coming to Amendment 162 relating to the soft opt-in, I am grateful to the noble Lord, Lord Black of Brentwood, and the noble Baroness, Lady Harding of Winscombe, for their support. This amendment would enable charities to communicate to donors in the same way that businesses have been able to communicate to customers since 2003. The clause will help to facilitate greater fundraising and support the important work that charities do for society. I can do no better than quote from the letter that was sent to Secretary of State Peter Kyle on 25 November, which was co-ordinated by the DMA and involved nearly 20 major charities, seeking support for reinstating the original Clause 115 of the DPDI Bill into this Bill:
“Clause 115 of the previous DPDI Bill extended the ‘soft opt-in’ for email marketing for charities and non-commercial organisations. The DMA estimates that extending the soft opt-in to charities would increase annual donations in the UK by £290 million”,
based on analysis of 13.1 million donors by the Salocin Group. The letter continues:
“At present, the DUA Bill proposals remove this. The omission of the soft opt-in will prevent charities from being able to communicate to donors in the same way as businesses can. As representatives of both corporate entities and charitable organisations, it is unclear to the DMA why charities should be at a disadvantage in this regard”.
I hope that the Government will listen to the DMA and the charities involved.
I thank noble Lords for their comments and contributions. I shall jump to Amendments 159A and 159B, one of which is in my name and both of which are concerned with cookie paywalls. I am not sure I have properly understood the objection to cookie paywalls. Do they not simply offer users three choices: pay money and stay private; share personal data and read for free; or walk away? So many times, we have all complained about the fact that these websites harvest our data and now, for the first time, this approach sets a clear cash value on the data that they are harvesting and offers us the choice. The other day somebody sent me a link from the Sun. I had those choices. I did not want to pay the money or share my data, so I did not read the article. I feel this is a personal decision, supported by clear data, which it is up to the individual to take, not the Government. I do not think we should take away this choice.
Let me turn to some of the other amendments in this group. Amendment 161 in the name of my noble friend Lord Lucas is, if I may say so, a thoughtful amendment. It would allow pension providers to communicate information on their product. This may mean that the person who will benefit from that pension does not miss out on useful information that would benefit their saving for retirement. Given that pension providers already hold the saver’s personal data, it seems to be merely a question of whether this information is wanted; of course, if it is not, the saver can simply opt out.
Amendment 162 makes an important point: many charities rely on donations from the public. Perhaps we should consider bringing down the barriers to contacting people regarding fundraising activities. At the very least, I am personally not convinced that members of the public have different expectations around what kinds of organisation can and cannot contact them and in what circumstances, so I support any step that simplifies the—to my mind—rather arbitrary differences in the treatment of business and charity communications.
Amendment 104 certainly seems a reasonable addition to the list of what might constitute “disproportionate effort” if the information is already public. However, I have some concerns about Amendments 98 and 100 to 103. For Amendment 98, who would judge the impact on the individual? I suspect that the individual and the data controllers may have different opinions on this. In Amendment 100, the effort and cost of compliance are thorny issues that would surely be dictated by the nature of the data itself and the reason for providing it to data subjects. In short, I am concerned that the controllers’ view may be more subjective than we would want.
On Amendment 102, again, when it comes to providing information to them,
“the damage and distress to the data subjects”
is a phrase on which the subject and the controller will almost inevitably have differing opinions. How will these be balanced? Additionally, one might presume that information that is either damaging or distressing to the data subjects should not necessarily be withheld from them as it is likely to be extremely important.
My Lords, we have covered a range of issues in our debate on this grouping; nevertheless, I will try to address each of them in turn. I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Harding, for their Amendments 95, 96, 98, 100, 102 to 104 and 106 regarding notification requirements.
First, with regard to the amendments in the name of the noble Baroness, Lady Harding, I say that although the Government support the use of public data sources, transparency is a key data protection principle. We do not agree that such use of personal data should remove or undermine the transparency requirements. The ICO considers that the use and sale of open electoral register data alone is likely not to require notification. However, when the data is combined with data from other sources, in order to build an extensive profile to be sold on for direct marketing, notification may be proportionate since the processing may go beyond the individual’s reasonable expectations. When individuals are not notified about processing, it makes it harder for them to exercise their data subject rights, such as the right to object.
Adding other factors to the list of what constitutes a “disproportionate effort” for notification is unnecessary given that the list is already non-exhaustive. The “disproportionate effort” exemption must be applied according to the safeguards of the wider data protection framework. According to the fairness principle, controllers should already account for whether the processing meets the reasonable expectations of a data subject. The data minimisation and purpose limitation principles also act as an important consideration for data controllers. Controllers should continue to assess on a case-by-case basis whether they meet the threshold for the existing exemptions to notify; if not, they should notify. I hope that this helps clarify our position on that.
My Lords, I rise briefly to support my friend, the noble Lord, Lord Clement-Jones, and his string of amendments. He made the case clearly: it is simply about access, the right to redress and a clear pathway to that redress, a more efficient process and clarity and consistency across this part of our data landscape. There is precious little point in having obscure remedies or rights—or even, in some cases, as we have discussed in our debates on previous groups, no right or obvious pathways to redress. I believe that this suite of amendments addresses that issue. Again, I full-throatedly support them.
My Lords, I address the amendments tabled by the noble Lord, Lord Clement-Jones. These proposals aim to transfer jurisdiction from courts to tribunals; to establish a new right of appeal against decisions made by the Information Commissioner; and to grant the Lord Chancellor authority to implement tribunal procedure rules. I understand and recognise the noble Lord’s intent here, of course, but I have reservations about these amendments and urge caution in accepting them.
The suggestion to transfer jurisdiction from courts to tribunals raises substantial concerns. Courts have a long-standing authority and expertise in adjudicating complex legal matters, including data protection cases. By removing these disputes from the purview of the courts, the risk is that we undermine the depth and breadth of legal oversight required in such critical areas. Tribunals, while valuable for specialised and expedited decisions, may not provide the same level of rigorous legal analysis.
Cases such as those cited by the noble Lord, Lord Clement-Jones—Killock and another v the Information Commissioner and Delo v the Information Commissioner—demonstrate to me the intricate interplay between data protection, administrative discretion and broader legal principles. It is questionable whether tribunals, operating under less formal procedures, can consistently handle such complexities without diminishing the quality of justice. Further, I am not sure that the claim that this transfer will streamline the system and reduce burdens on the courts is fully persuasive. Shifting cases to tribunals does not eliminate complexity; it merely reallocates it, potentially at the expense of the detailed scrutiny that these cases demand.
I turn to the right of appeal against the commissioner’s decisions. Although the introduction of a right of appeal against these decisions may seem like a safeguard, it risks creating unnecessary layers of litigation. The ICO already operates within a robust framework of accountability, including judicial review for cases of legal error or improper exercise of discretion. Adding a formal right of appeal risks encouraging vexatious challenges, overwhelming the tribunal system and diverting resources from addressing genuine grievances.
I think we in my party understand the importance of regulatory accountability. However, creating additional mechanisms should not come at the expense of efficiency and proportionality. The existing legal remedies are designed to strike an appropriate balance, and further appeals risk creating a chilling effect on the ICO’s ability to act decisively in protecting data rights.
On tribunal procedure rules and centralised authority, the proposed amendment granting the Lord Chancellor authority to set tribunal procedure rules bypasses the Tribunal Procedure Committee, an independent body designed to ensure that procedural changes are developed with judicial oversight. This move raises concerns about the concentration of power and the erosion of established checks and balances. I am concerned that this is a case of expediency overriding the principles of good governance. While I acknowledge that consultation with the judiciary is included in the amendment, it is not a sufficient substitute for the independent deliberative processes currently in place. The amendment risks undermining the independence of our legal institutions and therefore I have concerns about it.
These amendments overall, while presented as technical fixes, and certainly I recognise the problem and the intent, would have far-reaching consequences for our data protection framework. The vision of my party for governance is one that prioritises stability, legal certainty and the preservation of integrity. We must avoid reforms that, whatever their intent, introduce confusion or inefficiency or undermine public trust in our system. Data protection is, needless to say, a cornerstone of our modern economy and individual rights. As such, any changes to its governance must be approached with the utmost care.
I thank the noble Lord, Lord Clement-Jones, for his Amendments 108, 146 to 153 and 157, and I am grateful for the comments by the noble Lord, Lord Holmes, and the noble Viscount, Lord Camrose.
The effect of this group of amendments would be to make the First-tier Tribunal and the Upper Tribunal responsible for all data protection cases. They would transfer ongoing as well as future cases out of the court system to the relevant tribunals and, as has been alluded to, may cause more confusion in doing so.
As the noble Lord is aware, there is currently a blend of jurisdiction under the data protection legislation for both tribunals and courts according to the nature of the proceedings in question. This is because certain types of cases are appropriate to fall under tribunal jurisdiction while others are more appropriate for court settings. For example, claims by individuals against organisations for breaches of legal requirements can result in awards of compensation for the individuals and financial and reputational damage for the organisations. It is appropriate that such cases are handled by a court in conformance with their strict procedural and evidential rules. Indeed, under the Killock and Delo examples, it was noted that there could be additional confusion in that ability to go between those two possibilities if you went solely to one of the tribunals.
On the transfer of responsibility for making tribunal procedural rules from the Tribunal Procedure Committee to the Lord Chancellor, we think that would be inappropriate. The committee is comprised of legal experts appointed or nominated by senior members of the judiciary or the Lord Chancellor. This committee is best placed to make rules to ensure that tribunals are accessible and fair and that cases are dealt with quickly and efficiently. It keeps the rules under constant review to ensure that they are fit for purpose in line with new appeal rights and the most recent legislative changes.
Amendment 151 would also introduce a statutory appeals procedure for tribunals to determine the merits of decisions made by the Information Commissioner. Data subjects and controllers alike can already challenge the merits of the Information Commissioner’s decisions by way of judicial review in a way that would preserve the discretion and independence of the Information Commissioner’s decision-making, so no statutory procedure is needed. The Government therefore believe that the current jurisdictional framework is well-balanced and equitable, and that it provides effective and practical routes of redress for data subjects and controllers as well as appropriate safeguards to ensure compliance by organisations. For these reasons, I hope the noble Lord will not press his amendments.
My Lords, in speaking to this group of amendments I must apologise to the Committee that, when I spoke last week, I forgot to mention my interests in the register, specifically as an unpaid adviser to the Startup Coalition. For Committee, noble Lords will realise that I have confined myself to amendments that may be relevant to healthcare and to improving it.
I will speak to Amendments 111 and 116 in the names of my noble friends Lord Camrose and Lord Markham, and Amendment 115 from my noble friend Lord Lucas and the noble Lords, Lord Clement-Jones and Lord Knight of Weymouth, as well as other amendments, including from my noble friend Lord Holmes—I will probably touch on most amendments in this group. To illustrate my concerns, I return to two personal experiences that I shared during debate on the Data Protection and Digital Information Bill. I apologise to noble Lords who have heard these examples previously, but they illustrate the points being made in discussing this group of amendments.
A few years ago, when I was supposed to be travelling to Strasbourg, my train to the airport got delayed. My staff picked me up, booked me a new flight and drove me to the airport. I got to the airport with my new boarding pass and scanned it to get into the gate area, but as I was about to get on the flight, I scanned my pass again and was not allowed on the flight. No one there could explain why, having been allowed through security, I was not allowed on the flight. To cut a long story short, after two hours of being gaslighted by four or five staff, none of whom would even admit that they could not explain things to me, I eventually had to return to the check-in desk—this was supposed to be avoided by all the automation—to ask what had happened. The airline claimed that it had sent me an email that day. The next day, it admitted that it had not sent me an email. It then explained what had happened by saying that a flag had gone off in its system. That was the only explanation offered.
This illustrates the point about human intervention, but it is also about telling customers and others what happens when something goes wrong. The company clearly had not trained its staff in how to speak to customers or in transparency. Companies such as that airline get away with this sort of disgraceful behaviour all the time, but imagine if such technology were being used in the NHS. Imagine the same scenario: you turn up for an operation, and you scan your barcode to enter the hospital—possibly even the operating theatre—but you are denied access. There must be accountability, transparency and human intervention, and, in these instances, there has to be human intervention immediately. These things are critical.
I know that this Bill makes some sort of differentiation between more critical and less critical ADM, but let me illustrate my point with another example. A few years ago, I applied for an account with one of those whizzy fintech banks. Its slogan was: “We are here to make money work for everyone”. I downloaded the app and filled out the fields, then a message popped up telling me, “We will get back to you within 48 hours”. Two weeks later, I got a message on the app saying that I had been rejected and that, by law, the bank did not have to explain why. Once again, I ask noble Lords to imagine. Imagine Monzo’s technology being used on the NHS app, which many people currently use for repeat prescriptions or booking appointments. What would happen if you tried to book an appointment but you received a message saying, “Your appointment has been denied and, by law, we do not have to explain why”? I hope that we would have enough common sense to ensure that there is human intervention immediately.
I realise that the noble Lord, Lord Clement-Jones, has a Private Member’s Bill on this issue—I am sorry that I have not been able to take part in those debates—but, for this Bill, I hope that the two examples I have just shared illustrate the point that I know many noble Lords are trying to make in our debate on this group of amendments. I look forward to the response from the Minister.
I thank all noble Lords who have spoken. I must confess that, of all the groups we are looking at today, I have been particularly looking forward to this one. I find this area absolutely fascinating.
Let me begin in that spirit by addressing an amendment in my name and that of my noble friend Lord Markham; I ask the Government and all noble Lords to give it considerable attention. Amendment 111 seeks to insert the five principles set out in the AI White Paper published by the previous Government and to require all those participating in ADM—indeed, all forms of AI—to have due regard for them. They are:
“safety, security and robustness, appropriate transparency and explainability, fairness, accountability and governance, and contestability and redress”.
These principles for safe AI are based on those originally developed with the OECD and have been the subject of extensive consultation. They have been refined and very positively received by developers, public sector organisations, private sector organisations and civil society. They offer real safeguards against the risks of AI while continuing to foster innovation.
I will briefly make three points to commend their inclusion in the Bill, as I have described. First, the Bill team has argued throughout that these principles are already addressed by the principles of data protection and so are covered in the Bill. There is overlap, of course, but I do not agree that they are equivalent. Data protection is a significant concern in AI but the risks and, indeed, the possibilities of AI go far further than data protection. We simply cannot entrust all our AI risks to data protection principles.
Secondly, I think the Government will point to their coming AI Bill and suggest that we should wait for that before we move significantly on AI. However, in practice, all we have to go on about that Bill—I recognise that Ministers cannot describe much of it now—is that it will focus on the largest AI labs and the largest models. I assume it will place existing voluntary agreements on a statutory footing. In other words, we do not know when the Bill is coming, but this approach will allow a great many smaller AI fish to slip through the net. If we want to enshrine principles into law that cover all use of AI here, this may not quite be the only game in town, but it is certainly the only all-encompassing, holistic game in town likely to be positively impactful. I look forward to the Minister’s comments on this point.
The Secretary of State can help describe specific cases in the future but, on the point made by my noble friend Lord Knight, the ICO guidance will clarify some of that. There will be prior consultation with the ICO before that guidance is finalised, but if noble Lords are in any doubt about this, I am happy to write and confirm that in more detail.
Amendment 115 in the names of the noble Lords, Lord Clement-Jones, Lord Lucas and Lord Knight, and Amendment 123A in the name of the noble Lord, Lord Holmes, seek to ensure that individuals are provided with clear and accessible information about solely automated decision-making. The safeguards set out in Clause 80, alongside the wider data protection framework’s safeguards, such as the transparency principle, already achieve this purpose. The UK GDPR requires organisations to notify individuals about the existence of automated decision-making and provide meaningful information about the logic involved in a clear and accessible format. Individuals who have been subject to solely automated decisions must be provided with information about the decisions.
On Amendment 116 in the names of the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, I reassure noble Lords that Clause 69 already provides a definition of consent that applies to all processing under the law enforcement regime.
On Amendment 117 in the names of the noble Viscount, Lord Camrose, the noble Lord, Lord Markham, and my noble friend Lord Knight, I agree with them on the importance of law enforcement agencies protecting the sensitive personal data of children, and there is extensive guidance on this issue. However, consent is rarely used as the basis for processing law enforcement data. Other law enforcement purposes, such as the prevention, detection and investigation of crime, are quite often used instead.
I will address Amendment 118 in the name of the noble Viscount, Lord Camrose, and Amendment 123B in the name of the noble Lord, Lord Holmes, together, as they focus on obtaining human intervention for a solely automated decision. I agree that human intervention should be carried out competently and by a person with the authority to correct a wrongful outcome. However, the Government believe that there is currently no need to specify the qualifications of human reviewers as the ICO’s existing guidance explains how requests for human review should be managed.
Does the Minister agree that the crux of this machinery is solely automated decision-making as a binary thing—it is or it is not—and, therefore, that the absolute key to it is making sure that the humans involved are suitably qualified and finding some way to do so, whether by writing a definition or publishing guidelines?
On the question of qualification, the Minister may wish to reflect on the broad discussions we have had in the past around certification and the role it may play. I gently take her back to what she said on Amendment 123A about notification. Does she see notification as the same as a personalised response to an individual?
My Lords, I had expected the noble Baroness, Lady Owen of Alderley Edge, to be in the Room at this point. She is not, so I wish to draw the Committee’s attention to her Amendment 210. On Friday, many of us were in the Chamber when she made a fantastic case for her Private Member’s Bill. It obviously dealt with a much broader set of issues but, as we have just heard, the overwhelming feeling of the House was to support her. I think we would all like to see the Government wrap it up, put a bow on it and give it to us all for Christmas. But, given that that was not the indication we got, I believe that the noble Baroness’s intention here is to deal with the fact that the police are giving phones and devices back to perpetrators with the images remaining on them. That is an extraordinary revictimisation of people who have been through enough. So, whether or not this is the exact wording or way to do it, I urge the Government to look on this carefully and positively to find a way of allowing the police the legal right to delete data in those circumstances.
My Lords, none of us can be under any illusion about the growing threats of cyberattacks, whether from state actors, state-affiliated actors or criminal gangs. It is pretty unusual nowadays to find someone who has not received a phishing email, had hackers target an account or been promised untold riches by a prince from a faraway country. But, while technology has empowered these criminals, it is also the most powerful tool we have against them. To that end, we must do all we can to assist the police, the NCA, the CPS, the SIS and their overseas counterparts in countries much like our own. That said, we must also balance this assistance with the right of individuals to privacy.
Regarding the Clause 81 stand part notice from the noble Lord, Lord Clement-Jones, I respectfully disagree with this suggestion. If someone within the police were to access police records in an unauthorised capacity or for malign reasons, I simply doubt that they would be foolish enough to enter their true intentions into an access log. They would lie, of course, rendering the log pointless, so I struggle to see—we had this debate on the DPDI Bill—how this logging system would help the police to identify unauthorised access to sensitive data. It would simply eat up hours of valuable police time. I remember from our time working on the DPDI Bill that the police supported this view.
As for Amendment 124, which allows for greater collaboration between the police and the CPS when making charging decisions, there is certainly something to be said for this principle. If being able to share more detailed information would help the police and the CPS come to the best decision for victims, society and justice, then I absolutely support it.
Amendments 126, 128 and 129 seek to keep the UK in close alignment with the EU regarding data sharing. EU alignment or non-alignment is surely a decision for the Government of the day alone. We should not look to bind a future Administration to the EU.
I understand that Amendment 127 looks to allow data transfers to competent authorities—that is, law enforcement bodies in other countries—that may have a legitimate operating need. Is this not already the case? Are there existing provisions in the Bill to facilitate such transfers and, if so, does this not therefore duplicate them? I would very much welcome the thoughts of both the Minister and the noble Lord, Lord Clement-Jones, when he sums up at the end.
Amendment 156A would add to the definition of “unauthorised access” so that it includes instances where a person accesses data in the reasonable knowledge that the controller would not consent if they knew about the access or the reason for the access, and the person is not empowered to access it by an enactment. Given the amount of valuable personal data held by controllers as our lives continue to move online, there is real merit to this idea from my noble friend Lord Holmes, and I look forward to hearing the views of the Minister.
Finally, I feel Amendment 210 from my noble friend Lady Owen—ably supported in her unfortunate absence by the noble Baroness, Lady Kidron—is an excellent amendment as it prevents a person convicted of a sexual offence from retaining the images that breached the law. This will prevent them from continuing to use the images for their own ends and from sharing them further. It would help the victims of these crimes regain control of these images which, I hope, would be of great value to those affected. I hope that the Minister will give this serious consideration, particularly in light of noble Lords’ very positive response to my noble friend’s Private Member’s Bill at the end of last week.
I think the noble Viscount, Lord Camrose, referred to Amendment 156A from the noble Lord, Lord Holmes—I think he will find that is in a future group. I saw the Minister looking askance because I doubt whether she has a note on it at this stage.
I thank the noble Lord, Lord Clement-Jones; let me consider it a marker for future discussion.
I thank the noble Lord, Lord Clement-Jones, for coming to my rescue there.
I turn to the Clause 81 stand part notice tabled by the noble Lord, Lord Clement-Jones, which would remove Clause 81 from the Bill. Section 62 of the Data Protection Act requires law enforcement agencies to record their processing activities, including their reasons for accessing and disclosing personal information. Entering a justification manually was intended to help detect unauthorised access. The noble Lord was right that the police do sometimes abuse their power; however, I agree with the noble Viscount, Lord Camrose, that the reality is that anyone accessing the system unlawfully is highly unlikely to record that, making this an ineffective safeguard.
Meanwhile, the position of the National Police Chiefs’ Council is that this change will not impede any investigation concerning the unlawful processing of personal data. Clause 81 does not remove the strong safeguards that ensure accountability for data use by law enforcement, which include the requirement to record the time and date and, where possible, who has accessed the data; these are far more effective in monitoring potential data misuse. We would argue that the requirement to manually record a justification every time case information is accessed places a considerable burden on policing. I think the noble Lord himself said that we estimate that this clause may save approximately 1.5 million policing hours, equivalent to a saving in the region of £42.8 million a year.
These four technical government amendments do not, we believe, have a material policy effect but will improve the clarity and operation of the Bill text.
Amendment 133 amends Section 199 of the Investigatory Powers Act 2016, which provides a definition of “personal data” for the purposes of bulk personal datasets. This definition cross-refers to Section 82(1) of the Data Protection Act 2018, which is amended by Clauses 88 and 89 of the Bill, providing for joint processing by the intelligence services and competent authorities. This amendment will retain the effect of that cross-reference to ensure that processing referred to in Section 199 of the IPA remains that done by an intelligence service.
Amendment 136 concerns Clause 92 and ICO codes of practice. Clause 92 establishes a new procedure for panels to consider ICO codes of practice before they are finalised. It includes a regulation-making power for the Secretary of State to disapply or modify that procedure for particular codes or amendments to them. Amendment 136 will enable the power to be used to disapply or modify the panel’s procedure for specific amendments or types of amendments to a code, rather than for all amendments to it.
Finally, Amendments 213 and 214 will allow for changes made to certain immigration legislation and the Online Safety Act 2023 by Clauses 55, 122 and 123 to be extended via existing powers in those Acts, exercisable by Orders in Council, to Guernsey and the Isle of Man, should they seek this.
I beg to move.
My Lords, I will keep my comments brief as these are all technical amendments to the Bill. I understand that Amendments 133 and 136 are necessary for the functioning of the law and therefore have no objection. As for Amendment 213, extending immigration legislation amended by Clause 55 of this Bill to the Bailiwick of Guernsey or the Isle of Man, this is a sensible measure. The same can be said for Amendment 214, which extends the provision of the Online Safety Act 2023, amended by this Bill, to the Bailiwick of Guernsey or the Isle of Man.
My Lords, given the hour, I will try to be as brief as possible. I will start by speaking to the amendments tabled in my name.
Amendment 142 seeks to prevent the Information Commissioner’s Office sending official notices via email. Official notices from the ICO will not be trivial: they relate to serious matters of data protection, such as monetary penalty notices or enforcement notices. My concern is that it is all too easy for an email to be missed. An email may be filtered into a spam folder, where it sits for weeks before being picked up. It is also possible that an email may be sent to a compromised email address, meaning one that the holder has lost control of due to a hacker. These concerns led me also to table Amendment 143, which removes the assumption that a notice sent by email had been received within 48 hours of being sent.
Additionally, I suspect I am right in saying that a great many people expect official correspondence to arrive via the post. I wonder, therefore, whether there might be a risk that people ignore an unexpected email from the ICO, concerned that it might well be a scam or a hack of some description. I, for one, am certainly deeply suspicious of unexpected but official-looking messages that arrive. I believe that official correspondence which may have legal ramifications should really be sent by post.
On some of the other amendments tabled, Amendment 135A, which seeks to introduce a measure from the DPDI Bill, makes provision for the introduction of a statement of strategic priorities by the Secretary of State that sets out the Government’s data protection priorities, to which the commissioner must have regard, and the commissioner’s duties in relation to the statement. Although I absolutely accept that this measure would create more alignment and efficiency in the way that data protection is managed, I understand the concerns that it would undermine the independence of the Information Commissioner’s Office. That in itself, of course, would tend to bear on the adequacy risk.
I do not support the stand part notices on Clauses 91 and 92. Clause 91 requires the Information Commissioner to prepare codes of practice for the processing of data, which seems a positive measure. It provides guidance to controllers, helping them to follow best practice when processing data, and is good for data subjects, as it is more likely that their data will be processed in an appropriate manner. As for Clause 92, which would effectively increase expert oversight of codes of practice, surely that would lead to more effective codes, which will benefit both controllers and data subjects.
I have some concerns about Amendment 144, which limits the Information Commissioner to sending only one reprimand to a given controller during a fixed period. If a controller or processor conducts activities that infringe the provisions of the GDPR and does so repeatedly, why should the commissioner be prevented from issuing reprimands? Indeed, what incentives does that give for people to commit a minor sin and then a major one later?
I welcome Amendment 145, in the name of the noble Baroness, Lady Kidron, which would ensure that the ICO’s annual report records activities and action taken by the ICO in relation to children. This would clearly give the commissioner, parliamentarians and the data and tech industry as a whole a better understanding of how policies are affecting children and what changes may be necessary.
Finally, I turn my attention to many of the amendments tabled by the noble Lord, Lord Clement-Jones, which seek to remove the involvement of the Secretary of State from the functions of the commissioner and transfer the responsibility from government to Parliament. I absolutely understand the arguments the noble Lord advances, as persuasively as ever, but I am concerned even so that the Secretary of State for the relevant department is the best person to work with the commissioner to ensure both clarity of purpose and rapidity of decision-making.
I wanted to rise to my feet in time to stop the noble Viscount leaping forward as he gets more and more excited as we reach—I hope—possibly the last few minutes of this debate. I am freezing to death here.
I wish only to add my support to the points of the noble Baroness, Lady Kidron, on Amendment 145. It is a much-overused saw, but if it is not measured, it will not get reported.
(5 months, 3 weeks ago)
Lords Chamber
My Lords, of course I must start by joining others in thanking the noble Lord, Lord Clement-Jones, for bringing forward this timely and important Bill, with whose aims we on these Benches strongly agree. As public bodies take ever more advantage of new technological possibilities, surely nothing is more critical than ensuring that they do so in a way that adheres to principles of fairness, transparency and accountability.
It was also particularly helpful to hear from the noble Lord the wide range of very specific examples of the problems caused by what I will call AADM for brevity. I felt that they really brought it to life. I also take on board the point made by the noble Lord, Lord Knight, about hiring and firing by AADM. The way this is done is incredibly damaging and, frankly, if I may say so, too often simply boneheaded.
The point by the noble Baroness, Lady Lane-Fox, about procurement is absolutely well founded: I could not agree more strongly that this is a crucial area for improvement. That point was well supported by the noble Baroness, Lady Freeman of Steventon, as well. I thought that the argument, powerful as ever, from the noble Lord, Lord Tarassenko, for sovereign AI capabilities was also particularly useful, and I hope that the Government will consider how to take that forward. Finally, I really welcomed the point made so eloquently by the noble Baroness, Lady Hamwee, in reminding us that just the existence of a human in the loop is a completely insufficient condition for making these things effective.
We strongly support the goal of this Bill: to ensure trustworthy AI that deserves public confidence, fosters innovation and contributes to economic growth. However, the approach proposed raises—for me, anyway—several concerns that I worry could hinder its effectiveness.
First, definition is a problem. Clause 2(1) refers to “any algorithmic … systems” but, of course, “algorithmic” can have a very broad definition: it can encompass any process, even processes that are unrelated to digital or computational systems. While the exemptions in subsections (2) and (4) are noted, did the noble Lord give consideration to adopting or incorporating the AI White Paper’s definition around autonomy and adaptiveness, or perhaps just the definition around AADM used in the DUA Bill, which we will no doubt be discussing much more on Monday? We feel that improving the definition would provide some clarity and better align the scope with the Bill’s purpose.
I also worry that the Bill fails to address the rapid pace of AI development. For instance, I worry that requiring ongoing assessments for every update under Clause 3(3) is impractical, given that systems often change daily. This obligation should be restricted to significant changes, thereby ensuring that resources are spent where they matter most.
I worry, too, about the administrative burden that the Bill may create. For example, Clause 2(1) demands a detailed assessment even before a system is purchased. I feel that that is unrealistic, particularly with pilot projects that may operate in a controlled way but in a production environment, not in a test environment as described in Clause 2(2)(b). Would that potentially risk stifling exploration and innovation, and, indeed, slowing procurement within the public sector?
Another area of concern is communication. It is so important that AI gains public trust and that people come to understand the systems and the safeguards in place around them. I feel that the Bill should place greater emphasis on explaining decisions to the general public in ways that they can understand rapidly, so that we can ensure that transparency is not only achieved but perceived.
Finally, the Bill is very prescriptive in nature, and I worry that such prescriptiveness ends up being ineffective. Would it be a more effective approach, I wonder, to require public bodies to have due regard for the five principles of AI outlined in the White Paper, allowing them the flexibility to determine how best to meet those standards, but in ways that take account of the wildly differing needs, approaches and staffing of the public bodies themselves? Tools such as the ATRS could obviously be made available to assist, but I feel that public bodies should have the agency to find the most effective solutions for their own circumstances.
Let me finish with three questions for the Minister. First, given the rapid pace of tech change, what consideration will be given to ensure that public authorities can remain agile and responsive, while continuing to meet the Bill's requirements? Secondly, the five principles of AI set out in the White Paper by the previous Government offer a strong foundation for guiding public bodies. Will the Minister consider whether allowing flexibility in how these principles are observed might achieve the Bill’s goals, while reducing the administrative burdens and encouraging innovation? Thirdly, what measures will be considered to build public trust in AI systems, ensuring that the public understand both the decisions made and the safeguards in place around them?
(5 months, 3 weeks ago)
Grand Committee
I start by thanking all noble Lords who spoke for their comments and fascinating contributions. We on these Benches share the concern of many noble Lords about the Bill allowing the use of data for research purposes, especially scientific research purposes.
Amendment 59 has, to my mind, the entirely right and important intention of preventing misuse of the scientific research exemption for data reuse by ensuring that the only purpose for which the reuse is permissible is scientific research. Clearly, there is merit in this idea, and I look forward to hearing the Minister give it due consideration.
However, there are two problems with the concept and definition of scientific research in the Bill overall, and, again, I very much look forward to hearing the Government’s view. First, I echo the important points raised by my noble friend Lord Markham. Almost nothing in research or, frankly, life more broadly, is done with only one intention. Even the most high-minded, curiosity-driven researcher will have at the back of their mind the possibility of commercialisation. Alongside protecting ourselves from the cynical misuse of science as a cover story for commercial pursuit, we have to be equally wary of creating law that pushes for the complete absence of the profit motive in research, because to the extent that we succeed in doing that, we will see less research. Secondly—the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, made this point very powerfully—I am concerned that the broad definition of scientific research in the Bill might muddy the waters further. I worry that, if the terminology itself is not tightened, restricting the exemption might serve little purpose.
On Amendment 62, to which I have put my name, the same arguments very much apply. I accept that it is very challenging to find a form of words that both encourages research and innovation and does not do so at the expense of data protection. Again, I look forward to hearing the Government’s view. I am also pleased to have signed Amendment 63, which seeks to ensure that personal data can be reused only if doing so is in the public interest. Having listened carefully to some of the arguments, I feel that the public interest test may be more fertile ground than a kind of research motivation purity test to achieve that very difficult balance.
On Amendment 64, I share the curiosity to hear how the Minister defines research and statistical processes —again, not easy but I look forward to her response.
Amendment 65 aims to ensure that research seeking to use the scientific research exemption to obtaining consent meets the minimum levels of scientific rigour. The aim of the amendment is, needless to say, excellent. We should seek to avoid creating opportunities which would allow companies—especially but not uniquely AI labs—to cloak their commercial research as scientific, thus reducing the hoops they must jump through to reuse data in their research without explicit consent. However, Amendment 66, tabled in my name, which inserts the words:
“Research considered scientific research that is carried out as a commercial activity must be subject to the approval of an independent ethics committee”,
may be a more adaptive solution.
Many of these amendments show that we are all quite aligned in what we want but that it is really challenging to codify that in writing. Therefore, the use of an ethics committee to conduct these judgments may be the more agile, adaptive solution.
I confess that I am not sure I have fully understood the mechanism behind Amendments 68 and 69, but I of course look forward to the Minister’s response. I understand that they would essentially mean consent by failing to opt out. If so, I am not sure I could get behind that.
Amendment 130 would prevent the processing of personal data for research, archiving and statistical purposes if it permits the identification of a living individual. This is a sensible precaution. It would prevent the sharing of unnecessary or irrelevant information and protect people’s privacy in the event of a data breach.
Amendment 132 appears to uphold existing patient consent for the use of their data for research, archiving and statistical purposes. I just wonder whether this is necessary. Is that not already the case?
Finally, I turn to the Clause 85 stand part notice. I listened carefully to the noble Lord, Lord Clement-Jones, but I am not, I am afraid, at a point where I can support this. There need to be safeguards on the use of data for this purpose; I feel that Clause 85 is our way of having them.
My Lords, it is a great pleasure to be here this afternoon. I look forward to what I am sure will be some excellent debates.
We have a number of debates on scientific research; it is just the way the groupings have fallen. This is just one of several groupings that will, in different ways and from different directions, probe some of these issues. I look forward to drilling down into all the implications of scientific research in the round. I should say at the beginning—the noble Lord, Lord Markham, is absolutely right about this—that we have a fantastic history of and reputation for doing R&D and scientific research in this country. We are hugely respected throughout the world. We must be careful that we do not somehow begin to demonise some of those people by casting aspersions on a lot of the very good research that is taking place.
A number of noble Lords said that they are struggling to know what the definition of “scientific research” is. A lot of scientific research is curiosity driven; it does not necessarily have an obvious outcome. People start a piece of research, either in a university or on a commercial basis, and they do not quite know where it will lead them. Then—it may be 10 or 20 years later—we begin to realise that the outcome of their research has more applications than we had ever considered in the past. That is the wonderful thing about human knowledge: as we build and we learn, we find new applications for it. So I hope that whatever we decide and agree on in this Bill does not put a dampener on that great aspect of human knowledge and the drive for further exploration, which we have seen in the UK in life sciences in particular but also in other areas such as space exploration and quantum. Noble Lords could probably identify many more areas where we are increasingly getting a reputation for being at the global forefront of this thinking. We have to take the public with us, of course, and get the balance right, but I hope we do not lose sight of the prize we could have if we get the regulations and legislation right.
Let me turn to the specifics that have been raised today. Amendments 59 and 62 to 65 relate to scientific provisions, and the noble Lord, Lord Clement-Jones, the noble Viscount, Lord Colville, and others have commented on them. I should make it clear that this Bill is not expanding the meaning of “scientific research”. If anything, it is restricting it, because the reasonableness test that has been added to the legislation—along with clarification of the requirement for research to have a lawful basis—will constrain the misuse of the existing definition. The definition is tighter, and we have attempted to do that in order to make sure that some of the new developments and technologies coming on stream will fall clearly within the constraints we are putting forward in the Bill today.
Amendments 59 and 62 seek to prevent misuse of the exceptions for data reuse. I assure the noble Viscount, Lord Colville, that the existing provisions for research purposes already prevent the controller taking advantage of them for any other purpose they may have in mind. That is controlled.
My Lords, I have to admit that I am slightly confused by the groupings at this point. It is very easy to have this debate in the medical space, to talk about the future of disease, fixing diseases and longevity, but my rather mundane questions have now gone unanswered twice. Perhaps the Minister will write to me about where the Government see scientific research on product development in some of these other spaces.
We will come back to the question of scraping and intellectual copyright, but I want to add my support to my noble friend Lord Freyberg’s amendment. I also want to add my voice to the question of the AI Bill that is coming. Data is fundamental to the AI infrastructure; data is infrastructure. I do not understand how we can have a data Bill that does not have one eye on AI, looking towards it, or how we are supposed to understand the intersection between the AI Bill and the data Bill if the Government are not more forthcoming about their intentions. At the moment, we are seeing a reduction in data protection that looks as though it is anticipating, or creating a runway for, certain sorts of companies.
Finally, I am sorry that the noble Lord is no longer in his place, but later amendments look at creating sovereign data assets around the NHS and so on, and I do not think that those of us who are arguing to make sure that it is not a free-for-all are unwilling to create, or are not interested in creating, ways in which the huge investment in the NHS and other datasets can be realised for UK plc. I do not want that to appear to be where we are starting just because we are unhappy about the roadway that Clause 67 appears to create.
Many thanks to the noble Lords who have spoken in this debate and to the noble Lord, Lord Freyberg, for his Amendment 60. Before I start, let me endorse and add my name to the request for something of a briefing about the AI Bill. I am concerned that we will put a lot of weight of expectation on that Bill. When it comes, if I understand this right, it will focus on the very largest AI labs and may not necessarily get to all the risks that we are talking about here.
Amendment 60 seeks to ensure that the Bill does not allow privately funded or commercial activities to be considered scientific research in order
“to avert the possibility that such ventures might benefit from exemptions in copyright law relating to data mining”.
This is a sensible, proportionate measure to achieve an important end, but I have some concerns about the underlying assumption, as it strikes me. There is a filtering criterion of whether or not the research is taxpayer funded; that feels like a slightly crude means of predicting the propensity to infringe copyright. I do not know where to take that so I shall leave it there for the moment.
Amendment 61 in my name would ensure that data companies cannot justify data scraping for AI training as scientific research. As many of us said in our debate on the previous group, as well as in our debate on this group, the definition of “scientific research” in the Bill is extremely broad. I very much take on board the Minister’s helpful response on that but, I must say, I continue to have some concerns about the breadth of the definition. The development of AI programs, funded privately and as part of a commercial enterprise, could be considered scientific, so I believe that this definition is far too broad, given that Article 8A(3), to be inserted by Clause 71(5), states:
“Processing of personal data for a new purpose is to be treated as processing in a manner compatible with the original purpose where … the processing is carried out … for the purposes of scientific research”.
By tightening up the definition of “scientific research” to exclude activities that are primarily commercial, we prevent companies from creating a scientific pretence for research that is wholly driven by commercial gain rather than furthering our collective knowledge. I would argue that, if we wish to allow these companies to build and train AI—we must, or others will—we must put in proper safeguards for people’s data. Data subjects should have the right to consent to their data being used in such a manner.
Amendment 65A in the name of my noble friend Lord Holmes would also take steps to remedy this concern. I believe that this amendment would work well in tandem with Amendment 61. It makes it absolutely clear that we expect AI developers to obtain consent from data subjects before they use or reuse their data for training purposes. For now, though, I shall not press my amendment.
My Lords, I share the confusion of the noble Baroness, Lady Kidron, about the groupings. If we are not careful, we are going to keep returning to this issue again and again over four or five groups.
With the possible exception of the noble Lord, Lord Lucas, I think that we are all very much on the same page here. The suggestion from the noble Viscount, Lord Colville, that we meet to discuss the precise issue of the definition of “scientific research” would be extremely helpful; the noble Baroness and I do not need to repeat the concerns.
I should declare an interest in two respects: first, my interests as regards AI, which are set out on the register; and, secondly—I very much took account of what the noble Viscount, Lord Camrose, and the noble Lord, Lord Markham, had to say—I chair the council of a university that has a strong health faculty. It does a great deal of health research and a lot of that research relies on NHS datasets.
This is not some sort of Luddism we are displaying here. This is caution about the expansion of the definition of scientific research, so that it does not turn into something else: that it does not deprive copyright holders of compensation, and that it does not allow personal data to be scraped off the internet without consent. There are very legitimate issues being addressed here, despite the fact that many of us believe that this valuable data should of course be used for the public benefit.
One of the key themes—this is perhaps where we come back on to the same page as the noble Lord, Lord Lucas—may be public benefit, which we need to reintroduce so that we really understand that scientific research for public benefit is the purpose we want this data used for.
I do not think I need to say much more: this issue is already permeating our discussions. It is interesting that we did not get on to it in a major way during the DPDI Bill, yet this time we have focused much more heavily on it. Clearly, in opposition, the noble Viscount has seen the light. What is not to like about that? Further discussion, not least of the amendment of the noble Baroness, Lady Kidron, further down the track will be extremely useful.
My Lords, Amendments 66, 67 and 80 in this group are all tabled in my name. Amendment 66 requires scientific research carried out for commercial purposes to
“be subject to the approval of an independent ethics committee”.
Commercial research is, perhaps counterintuitively, generally subjected to fewer ethical safeguards than research carried out purely for scientific endeavour by educational institutions. Given the current broad definition of scientific research in the Bill—I am sorry to repeat this—which includes research for commercial purposes, and the lower bar for obtaining consent for data reuse should the research be considered scientific, I think it would be fair to require more substantial ethical safeguards on such activities.
We do not want to create a scenario where unscrupulous tech developers use the Bill to harvest significant quantities of personal data under the guise of scientific endeavour to develop their products, without having to obtain consent from data subjects or even without them knowing. An independent ethics committee would be an excellent way to monitor scientific research that would be part of commercial activities, without capping data access for scientific research, which aims more purely to expand the horizon of our knowledge and benefit society. Let us be clear: commercial research makes a huge and critically important contribution to scientific research, but it is also surely fair to subject it to the same safeguards and scrutiny required of non-commercial scientific research.
Amendment 67 would ensure that data controllers cannot gain consent for research purposes that cannot be defined at the time of data collection. As the Bill stands, consent will be considered obtained for the purposes of scientific research if, at the time consent is sought, it is not possible to identify fully the purposes for which the personal data is to be processed. I fully understand that there needs to be some scope to take advantage of research opportunities that are not always foreseeable at the start of studies, particularly multi-year longitudinal studies, but which emerge as such studies continue. I am concerned, however, that the current provisions are a little too broad. In other words: is consent not actually being given at the start of the process for, effectively, any future purpose?
Amendment 80 would prevent the data reuse test being automatically passed if the reuse is for scientific purposes. Again, I have tabled this amendment due to my concerns that research which is part of commercial activities could be artificially classed as scientific, and that other clauses in the Bill would therefore allow too broad a scope for data harvesting. I beg to move.
My Lords, it seems very strange indeed that Amendment 66 is in a different group from group 1, which we have already discussed. Of course, I support Amendment 66 from the noble Viscount, Lord Camrose, but in response to my suggestion for a similar ethical threshold, the Minister said she was concerned that scientific research would find this to be too bureaucratic a hurdle. She and many of us here sat through debates on the Online Safety Bill, now an Act. I was also on the Communications Committee when it looked at digital regulations and came forward with one of the original reports on this. The dynamic and impetus which drove us to worry about this was the lack of ethics within the tech companies and social media. Why on earth would we want to unleash some of the most powerful companies in the world on reusing people’s data for scientific purposes if we were not going to have an ethical threshold involved in such an Act? It is important that we consider that extremely seriously.
I am not quite sure about the groupings, either, but let us go with what we have. I thank noble Lords who have spoken, and the noble Viscount, Lord Camrose, for his amendments. I hope I am able to provide some reassurance for him on the points he raised.
As I said when considering the previous group, the Bill does not expand the definition of scientific research. The reasonableness test, along with clarifying the requirement for researchers to have a lawful basis, will significantly reduce the misuse of the existing definition. The amendment seeks to reduce the potential for misuse of the definition of scientific research by commercial companies using AI by requiring scientific researchers for a commercial company to submit their research to an ethics committee. As I said on the previous group, making it a mandatory requirement for all research may impede studies in areas that might have their own bespoke ethical procedures. This may well be the case in a whole range of different research areas, particularly in the university sector, and in sectors more widely. Some of this research may be very small to begin with but might grow in size. The idea that a small piece of start-up research has to be cleared by an ethics committee at an early stage is expecting too much and will put off a lot of the new innovations that might otherwise come forward.
Amendment 80 relates to Clause 71 and the reuse of personal data. This would put at risk valuable research that relies on data originally generated from diverse contexts, since the difference between the purposes may not always be compatible.
Turning to Amendment 67, I can reassure noble Lords that the concept of broad consent is not new. Clause 68 reproduces the text from the current UK GDPR recitals because the precise definition of scientific research may become clear only during later analysis of the data. Obtaining broad consent for an area of research from the outset allows scientists to focus on potentially life-saving research. Clause 68 has important limitations. It cannot be used if the researcher already knows the specific purpose—an important safeguard that should not be removed. It also includes a requirement to give the data subject the choice to consent to only part of the research processing, if possible. Most importantly, the data subject can revoke their consent at any point. I hope this reassures the noble Viscount, Lord Camrose, and he feels content to withdraw his amendment on this basis.
I thank the noble Viscount, Lord Colville, and the noble Lord, Lord Clement-Jones, for their remarks and support, and the Minister for her helpful response. Just over 70% of scientific research in the UK is privately funded, 28% is taxpayer funded and around 1% comes through the charity sector. Perhaps the two most consequential scientific breakthroughs of the last five years, Covid vaccines and large language models, have come principally from private funding.
My Lords, I support these amendments in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones. It is a pleasure to follow the second ex-Health Minister this afternoon. In many ways, the arguments are just the same for health data as they are for all data. It is just that, understandably, it is at the sharpest end of this debate. Probably the most important point for everybody to realise, although it is espoused so often, is that there is no such thing as NHS data. It is a collection of the data of every citizen in this country, and it matters. Public trust matters significantly for all data but for health data in particular, because it goes so close to our identity—our very being.
Yet we know how to do public trust in this country. We know how to engage and have had significant success in public engagement decades ago. What we could do now with human-led technology-supported public engagement could be on such a positive and transformational scale. But, so far, there has been so little on this front. Let us not talk of NHS data; let us always come back to the fundamental principle encapsulated in this group of amendments and across so many of our discussions on the Bill. Does the Minister agree that it is about not NHS data but our data—our decisions—and, through that, if we get it right, our human-led digital futures?
Many thanks to all noble Lords who have proposed and supported these amendments. I will speak to just a few of them.
Amendment 70 looks to mitigate the lowering of the consent threshold for scientific research. As I have set out on previous groups, I too have concerns about that consent threshold. However, for me the issue is more with the definition of scientific research than with the consent threshold, so I am not yet confident that the amendment is the right way to achieve those desirable aims.
Amendment 71 would require that no NHS personal data can be made available for scientific research without the explicit consent of the patient. I thank the noble Lords, Lord Stevenson of Balmacara and Lord Clement-Jones, for raising this because it is such an important matter. While we will discuss this under other levels, as the noble Baroness, Lady Kidron, points out, it is such an important thing and we need to get it right.
I regret to advise my noble friend Lord Holmes that I was going to start my next sentence with the words “Our NHS data”, but I will not. The data previously referred to is a very significant and globally unique national asset, comprising many decades of population-wide, cradle-to-grave medical data. No equivalent at anything like the same scale or richness exists anywhere, which makes it incredibly valuable. I thank my noble friend Lord Kamall for stressing this point with, as ever, the help of Jimi Hendrix.
However, that data is valuable only to the extent that it can be safely exploited for research and development purposes. The data can collectively help us develop new medicines or improve the administration and productivity of the NHS, but we need to allow it to do so properly. I am concerned that this amendment, if enacted, would create too high an operational and administrative barrier to the safe exploitation of this data. I have no interest in compromising on the safety, but we have to find a more efficient and effective way of doing it.
Amendments 79, 81 and 131 all look to clarify that the definition of consent to be used is in line with the definition in Article 4.11 of the UK GDPR:
“‘consent’ of the data subject means any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her”.
This amendment would continue the use of a definition that is well understood. However, paragraph 3(a) of new Article 8A appears sufficient, in that the purpose for which a data subject consents is “specified, explicit and legitimate”.
Finally, with respect to Clause 77 stand part, I take the point and believe that we will be spending a lot of time on these matters going forward. But, on balance and for the time being, I feel that this clause needs to remain, as there must be clear rules on what information should be provided to data subjects. We should leave it in for now, although we will no doubt be looking to polish it considerably.
My Lords, I want to ask the Minister and the noble Lord, Lord Clement-Jones, in very general terms for their views on retrospectivity. Do they believe that the changes to data protection law in the Bill are intended to be applied to data already held at this time, or will the new regime apply only to personal data collected going forward from this point? I ask that specifically of data pertaining to children, from whom sensitive data has already been collected. Will the forthcoming changes to data protection law apply to such data that controllers and processors already hold, or will they apply only to data collected going forward?
I thank in particular the noble Lord, Lord Clement-Jones, who has clearly had his Weetabix this morning. I will comment on some of the many amendments tabled.
On Amendments 73, 75, 76, 77, 83 and 90, I agree it is concerning that the Secretary of State can amend such important legislation via secondary legislation. However, these amendments are subject to the affirmative procedure and, therefore, to parliamentary scrutiny. Since the DPDI Bill proposed the same, I have not changed my views; I remain content that this is the right level of oversight and that these changes do not need to be made via primary legislation.
As for Amendment 74, preventing personal health data from being considered a legitimate interest seems wise. It is best to err on the side of caution when it comes to sharing personal health data.
Amendment 77 makes an interesting suggestion, allowing businesses affiliated by contract to be treated in the same way as large businesses that handle data from multiple companies in a group. This would certainly be beneficial for SMEs collaborating on a larger project. However, each such business may have different data protection structures and terms of use. Therefore, while this idea certainly has merit, I am a little concerned that it may benefit from some refining to ensure that the data flows between businesses in a way to which the data subject has consented.
On Amendment 78A and Schedule 4 standing part, there are many good, legitimate interest reasons why data must be quickly shared and processed, many of which are set out in Schedule 4: for example, national security, emergencies, crimes and safeguarding. This schedule should therefore be included in the Bill to set out the details on these important areas of legitimate interest processing. Amendment 84 feels rather like the central theme of all our deliberations thus far today, so I will listen with great interest, as ever, to the Minister’s response.
I have some concerns about Amendment 85, especially the use of the word “publicly”. The information that may be processed for the purposes of safeguarding vulnerable individuals is likely to be deeply sensitive and should not be publicly available. Following on from this point, I am curious to hear the Minister’s response to Amendment 86. It certainly seems logical that provisions should be in place so that individuals can regain control of their personal data should the reason for their vulnerability be resolved. As for the remaining stand part notices in this group, I do not feel that these schedules should be removed because they set out important detail on which we will come to rely.
My Lords, I think we sometimes forget, because the results are often so spectacular, the hard work that has had to happen over the years to get us to where we are, particularly in relation to the Online Safety Act. It is well exemplified by the previous speaker. He put his finger on the right spot in saying that we all owe considerable respect for the work of the noble Baroness, Lady Kidron, and others. I helped a little along the way. It is extraordinary to feel that so much of this could be washed away if the Bill goes forward in its present form. I give notice that I intend to work with my colleagues on this issue because this Bill is in serious need of revision. These amendments are part of that and may need to be amplified in later stages.
I managed to sign only two of the amendments in this group. I am sorry that I did not sign the others, because they are also important. I apologise to the noble Lord, Lord Clement-Jones, for not spotting them early enough to be able to do so. I will speak to the ones I have signed, Amendments 88 and 135. I hope that the Minister will give us some hope that we will be able to see some movement on this.
The noble Lord, Lord Russell, mentioned the way in which the wording on page 113 seems not only to miss the point but to devalue the possibility of seeing protections for children well placed in the legislation. New Clause 120B(e), which talks of
“the fact that children may be less aware of the risks and consequences associated with processing of personal data and of their rights in relation to such processing”,
almost says it all for me. I do not understand how that could possibly have got through the process by which this came forward, but it seems to speak to a lack of communication between parts of government that I hoped this new Government, with their energy, would have been able to overcome. It speaks to the fact that we need to keep an eye on both sides of the equation: what is happening in the online safety world and how data that is under the control of others, not necessarily those same companies, will be processed in support or otherwise of those who might wish to behave in an improper or illegal way towards children.
At the very least, what is in these amendments needs to be brought into the Bill. In fact, other additions may need to be made. I shall certainly keep my eye on it.
My Lords, I thank the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Kidron, for bringing forward amendments in what is a profoundly important group. For all that data is a cornerstone of innovation and development, as we have often argued in this Committee, we cannot lose sight of our responsibility to safeguard the rights and welfare of our children.
I start by speaking to two amendments tabled in my name.
Amendment 91 seeks to change
“the definition of request by data subjects to data controllers”
that can be declined or
“for which a fee can be charged from ‘manifestly unfounded or excessive’ to ‘vexatious or excessive’”.
I am sure that many of us will remember, without a great deal of fondness, our debates on these terms in the DPDI Bill. When we debated this issue at that time, it was, rather to my regret, often presented as a way to reduce protections and make it easier to decline or charge a fee for a subject access request. In fact, the purpose was to try to filter out cynical or time-wasting requests, such as attempts to bypass legal due process or to bombard organisations with vast quantities of essentially meaningless access requests. Such requests are not unfounded but they are harmful; by reducing them, we would give organisations more time and capacity to respond to well-founded requests. I realise that I am probably on a loser on this one but let me encourage noble Lords one last time to reconsider their objections and take a walk on the vexatious side.
Amendment 97 would ensure that
“AI companies who process data not directly obtained from data subjects are required to provide information to data subjects where possible. Without this amendment, data subjects may not know their data is being held”.
If a subject does not even know that their data is being held, they cannot enforce their data rights.
Amendment 99 follows on from that point, seeking to ensure that AI companies using large datasets cannot avoid providing information to data subjects on the basis that their datasets are too large. Again, if a subject does not know that their data is being held, they cannot enforce their rights. Therefore, it is really important that companies cannot avoid telling individuals about their personal data and the way in which it is being used because of sheer weight of information. These organisations are specialists in such processing of huge volumes of data, of course, so I struggle to accept that this would be too technically demanding for them.
Let me make just a few comments on other amendments tabled by noble Lords. Under Amendment 107, the Secretary of State would have
“to publish guidance within six months of the Act’s passing to clarify what constitutes ‘reasonable and proportionate’ in protection of personal data”.
I feel that this information should be published at the same time as this Bill comes into effect. It serves no purpose to have six months of uncertainty.
I do not believe that Amendment 125 is necessary. The degree to which the Government wish to align—or not—with the EU is surely a matter for the Government and their priorities.
Finally, I was struck by the interesting point that the noble and learned Lord, Lord Thomas, made when he deplored the Bill’s incomprehensibility. I have extremely high levels of personal sympathy with that view. To me, the Bill is the source code. There is a challenge in making it comprehensible and communicating it in a much more accessible way once it goes live. Perhaps the Minister can give some thought to how that implementation phase could include strong elements of communication. While that does not make the Bill any easier to understand for us, it might help the public at large.
My Lords, the problem is that I have a 10-minute speech and there are five minutes left before Hansard leaves us, so is it sensible to draw stumps at this point? I have not counted how many amendments I have, but I also wish to speak to the amendment by the noble and learned Lord, Lord Thomas. I would have thought it sensible to break at this point.