(1 year, 3 months ago)
Grand Committee
My Lords, I have tabled Amendment 60 to add to our discussion and establish some further clarity from the Minister on the impact of widening the scope of the interpretation of scientific research to include commercial and private activities. I thank her for her letter of 27 November to all noble Lords who spoke at Second Reading, a copy of which was placed in the Lords Library; it provides some reassurance that scientific research activities must still pass a reasonableness test. However, I move this probing amendment out of concern that the change in definition may have unintended consequences for copyright law. It is vital that we do not look at this Bill in isolation but consider the wider impact that changing definitions and interpretations will have on other aspects of legislation.
Research activities are identified under the Copyright, Designs and Patents Act 1988. Some researchers require access to and reproduction of data and copyright-protected material for research purposes. Under Section 29A, researchers can avail themselves of an exemption from copyright which allows data mining and analysis of copyright-protected works for non-commercial research only, without permission from the copyright holder. The UK copyright framework is widely regarded internationally as the “gold standard”, as it carefully balances the rights of copyright holders with the need for certain uses to take place, such as non-commercial research, educational uses and those that protect free speech. That balance is fragile, and we must be very careful not to disrupt it unintentionally.
The previous Government sought to widen Section 29A of the Act by allowing text and data mining of copyright-protected works for commercial purposes, but that proposal was quickly withdrawn when the Government concluded that the decision had been made without appropriate evidence. That was a sensible move. The current Government are still due to consult stakeholders on the exemption to the law, against the backdrop of AI companies using copyright-protected works for training large language models without permission or fair pay. Given the global presence of AI, it is expected that this consultation will consider how UK policy on copyright works within an international context. Therefore, while the Government are carefully considering this, we must ensure that we do not fast-forward to a conclusion before that important work has taken place.
If the Minister can confirm that this definition has no impact on existing copyright law, I will happily withdraw this amendment. However, if there are potential implications for the Copyright, Designs and Patents Act 1988, I urge the Minister to table her own amendment to explicitly preserve the current definition of “scientific research” within that Act. This would ensure that we maintain legal clarity while the broader international considerations are fully examined. I beg to move.
I advise the Committee that, if this amendment is agreed, I cannot call Amendment 61 by reason of pre-emption.
Let me put it this way: other things may be coming before it. I think I promised at the last debate that we would have something on copyright in the very, very, very near future. This may not be as very, very, very near future as that. We will tie ourselves in knots if we carry on pursuing this discussion.
On that basis, I hope that this provides noble Lords with sufficient reassurance not to press their amendments.
I thank your Lordships for this interesting debate. I apologise to the Committee for degrouping the amendment on copyright, but I thought it was important to establish from the Minister that there really was no effect on the copyright Act. I am very reassured that she has said that. It is also reassuring to hear that there will be more of an opportunity to look at this issue in greater detail. On that basis, I beg leave to withdraw the amendment.
(1 year, 4 months ago)
Lords Chamber
My Lords, it is a great pleasure to follow the noble Lord, Lord Davies, and what he had to say on health data, with much of which I entirely agree. The public demand that we get this right, and we really must endeavour to do all we can to reassure the public in this area.
I speak as someone deeply rooted in the visual arts and as an artist member of DACS—the Design and Artists Copyright Society. In declaring my interests, I also express gratitude for the helpful briefing provided by DACS.
The former Data Protection and Digital Information Bill returns to this House after its journey was interrupted by July’s general election. While this renewed Bill’s core provisions remain largely unchanged, the context in which we examine them has shifted significantly. The rapid advancements in artificial intelligence compel us to scrutinise this legislation not just for its immediate impact but for its long-term consequences. Our choices today will shape how effectively we safeguard the rights and interests of our citizens in an increasingly digital society. For this reason, the Bill demands meticulous and thorough examination to ensure that it establishes a robust governance framework capable of meeting present and future challenges.
Over the past year, Members of this House have carefully considered the opportunities and risks of large language models which power artificial intelligence applications—work that is still ongoing. I note that even today, the Lords Communications and Digital Committee, chaired by the noble Baroness, Lady Stowell of Beeston, is holding an evidence session on the role of AI in creative tech.
The committee’s previous inquiry into large language models stressed a need for cautious action. Drawing on expert testimony, its recommendations highlighted critical gaps in our current approach, particularly in addressing immediate risks in areas such as cybersecurity, counterterrorism, child protection, and disinformation. The committee rightly stressed the need for stronger assessments and guardrails to mitigate these harms, including in the area of data protection.
Regrettably, however, this Bill moves in the opposite direction, and instead seeks to lighten the regulatory governance of data processing and relaxes rules around automated decision-making, as other noble Lords have referred to. Such an approach risks leaving our legislative framework ill prepared to address the potential risks that our own committee has so carefully documented.
The creative industries, which contribute £126 billion annually to the UK economy, stand particularly exposed. Evidence submitted to the committee documented systematic unauthorised use of copyrighted works by large language models, which harvest content across the internet while circumventing established licensing frameworks and creator permissions.
This threat particularly impacts visual artists—photographers, illustrators, designers, et cetera—many of whom already earn far below the minimum wage, as others, including the noble Baroness, Lady Kidron, and the noble Lords, Lord Bassam and Lord Holmes, have already highlighted. These creators now confront a stark reality: AI systems can instantaneously generate derivative works that mimic their distinctive styles and techniques, all without attribution or compensation. This is not merely a theoretical concern; this technological displacement is actively eroding creative professionals’ livelihoods, with documented impacts on commission rates and licensing revenues.
Furthermore, the unauthorised use of reliable, trusted data, whether from reputable news outlets or authoritative individuals, fuels the spread of disinformation. These challenges require a solution that enables individuals and entities, such as news publishers, to meaningfully authorise and license their works for a fair fee.
This Bill not only fails to address these fundamental challenges but actively weakens existing protections. Most alarmingly, it removes vital transparency requirements for personal data, including data relating to individual creators, when that data is used for research, archival and statistical purposes. Simultaneously, it broadens the definition of research to encompass “commercial” activities, effectively creating a loophole ripe for exploitation by profit-driven entities at the expense of individual privacy and creative rights.
Finally, a particularly troubling aspect of the Bill is its proposal to dissolve the Information Commissioner’s Office in favour of an information commission—a change that goes far beyond mere restructuring. Although I heard what the Minister said on this, by vesting the Secretary of State with sweeping powers to appoint key commission members, the Bill threatens to compromise the fundamental independence that has long characterised our data protection oversight. Such centralised political influence could severely undermine the commission’s ability to make impartial, evidence-based decisions, particularly when regulating AI companies with close government ties or addressing sensitive matters of national interest. This erosion of regulatory independence should concern us all.
In summary, the cumulative effect of this Bill’s provisions exposes a profound mismatch between the protections our society urgently needs and those this legislation would actually deliver. At a time when artificial intelligence poses unprecedented challenges to personal privacy and creative rights, this legislation, although positive on many fronts, appears worryingly inadequate.