Data (Use and Access) Bill [HL] Debate
Lord Freyberg (Crossbench - Excepted Hereditary)
Lords Chamber

My Lords, it is a great pleasure to follow the noble Lord, Lord Davies, and what he had to say on health data, with much of which I agree entirely. The public demand that we get this right, and we really must endeavour to do all we can to reassure them in this area.
I speak as someone deeply rooted in the visual arts and as an artist member of DACS—the Design and Artists Copyright Society. In declaring my interests, I also express gratitude for the helpful briefing provided by DACS.
The former Data Protection and Digital Information Bill returns to this House after its journey was interrupted by July’s general election. While this renewed Bill’s core provisions remain largely unchanged, the context in which we examine them has shifted significantly. The rapid advancements in artificial intelligence compel us to scrutinise this legislation not just for its immediate impact but for its long-term consequences. Our choices today will shape how effectively we safeguard the rights and interests of our citizens in an increasingly digital society. For this reason, the Bill demands meticulous and thorough examination to ensure that it establishes a robust governance framework capable of meeting present and future challenges.
Over the past year, Members of this House have carefully considered the opportunities and risks of large language models which power artificial intelligence applications—work that is still ongoing. I note that even today, the Lords Communications and Digital Committee, chaired by the noble Baroness, Lady Stowell of Beeston, is holding an evidence session on the role of AI in creative tech.
The committee’s previous inquiry into large language models emphasised the need for cautious action. Drawing on expert testimony, its recommendations highlighted critical gaps in our current approach, particularly in addressing immediate risks in areas such as cybersecurity, counterterrorism, child protection and disinformation. The committee rightly stressed the need for stronger assessments and guardrails to mitigate these harms, including in the area of data protection.
Regrettably, however, this Bill moves in the opposite direction: it seeks to lighten the regulatory governance of data processing and to relax the rules around automated decision-making, to which other noble Lords have referred. Such an approach risks leaving our legislative framework ill prepared to address the potential risks that our own committee has so carefully documented.
The creative industries, which contribute £126 billion annually to the UK economy, stand particularly exposed. Evidence submitted to the committee documented systematic unauthorised use of copyrighted works by large language models, which harvest content across the internet while circumventing established licensing frameworks and creator permissions.
This threat particularly impacts visual artists—photographers, illustrators, designers, et cetera—many of whom already earn far below the minimum wage, as others, including the noble Baroness, Lady Kidron, and the noble Lords, Lord Bassam and Lord Holmes, have already highlighted. These creators now confront a stark reality: AI systems can instantaneously generate derivative works that mimic their distinctive styles and techniques, all without attribution or compensation. This is not merely a theoretical concern; this technological displacement is actively eroding creative professionals’ livelihoods, with documented impacts on commission rates and licensing revenues.
Furthermore, the unauthorised use of reliable, trusted data, whether from reputable news outlets or authoritative individuals, fuels the spread of disinformation. These challenges require a solution that enables individuals and entities, such as news publishers, to meaningfully authorise and license their works for a fair fee.
This Bill not only fails to address these fundamental challenges but actively weakens existing protections. Most alarmingly, it removes vital transparency requirements for personal data, including data relating to individual creators, when used for research, archival and statistical purposes. Simultaneously, it broadens the definition of research to encompass “commercial” activities, effectively creating a loophole ripe for exploitation by profit-driven entities at the expense of individual privacy and creative rights.
Finally, a particularly troubling aspect of the Bill is its proposal to dissolve the Information Commissioner’s Office in favour of an information commission—a change that goes far beyond mere restructuring. Although I heard what the Minister said on this, by vesting the Secretary of State with sweeping powers to appoint key commission members, the Bill threatens to compromise the fundamental independence that has long characterised our data protection oversight. Such centralised political influence could severely undermine the commission’s ability to make impartial, evidence-based decisions, particularly when regulating AI companies with close government ties or addressing sensitive matters of national interest. This erosion of regulatory independence should concern us all.
In summary, the cumulative effect of this Bill’s provisions exposes a profound mismatch between the protections our society urgently needs and those this legislation would actually deliver. At a time when artificial intelligence poses unprecedented challenges to personal privacy and creative rights, this legislation, although positive on many fronts, appears worryingly inadequate.