Creative Industries: Rights Reservation Model

Thursday 30th January 2025

Grand Committee
Question for Short Debate
14:00
Asked by
Lord Foster of Bath

To ask His Majesty’s Government what assessment they have made of the impact on creators and the creative industries of the rights reservation model proposed in their consultation paper “Copyright and Artificial Intelligence” published on 17 December 2024.

Lord Foster of Bath (LD)

My Lords, I begin by thanking the Minister for meeting me earlier in the week. I accept that he wants genuine consultation, although, as I will come to in a moment, the Government appear, in some aspects of the consultation, to have predetermined the direction of travel.

I accept that it seems somewhat odd to be returning to this issue so soon after Tuesday’s debate and the passing of the amendments of the noble Baroness, Lady Kidron, who deserves great praise for the work she has been doing. In my defence, I point out that I entered the ballot before the date had been set for Report stage of the Bill. Anyway, I am absolutely confident that we will need to keep returning to this issue many times, given the pace of development in AI. It is worth reflecting that on Tuesday evening, we were only just learning of the allegations by large AI firms that DeepSeek had been freeloading off their models to train its own model—an infringement, they claim, of their IP. How they have the gall to say that is beyond me, frankly, given that they have been responsible for the theft on a grand scale of the IP of UK creators.

I am not trying to join the debate retrospectively, but I must make my own position clear: the theft I have described must stop. I supported the noble Baroness’s amendments. The enormous success of our creative industries, much lauded on Tuesday, is in no small measure due to our gold standard IP regime, which has been the bedrock of growth, investment and innovation. We weaken it at our peril.

I want to make three things clear. First, this is not an IP v AI debate. The creative industries have been early adopters of AI and see the real benefits of its development. Indeed, they have been working with AI developers as fellow travellers. But if UK creators, many already poorly paid, are to earn even less because their IP is not remunerated, they will stop creating and so stop the flow of the high-class data needed for AI development. As Sir Paul McCartney said over the weekend,

“make sure you protect the creative thinkers, the creative artists, or you’re not going to have them”.

Secondly, I am unconvinced by arguments of legal uncertainty. Rather than sowing seeds of doubt, I would prefer to see the Government supporting the creative industries in upholding the law against unprecedented theft of their IP. As the noble Baroness, Lady Cavendish, said on Tuesday:

“This is not about balance”

between AI and IP,

“it is about implementing and upholding the rule of law”.—[Official Report, 28/1/25; col. 167.]

But I also accept the need, as has frequently happened in the light of technological development, to update legislation, not least in terms of transparency and enforcement.

Thirdly, any updating should be based on detailed assessment of the implications. The question for this debate is:

“To ask His Majesty’s Government what assessment they have made of the impact on creators and the creative industries of the rights reservation model proposed in their”

AI consultation paper. Sadly, Tuesday’s debate made it clear that the answer is: little or none.

The creative industries’ own assessment argues that the Government’s proposed option of a text and data-mining exception will weaken our gold-standard IP regime. They argue that it could mean that AI companies, most of which are large US tech firms, can effectively take British creators’ work to train their models, profit from it and, in many cases, not repay the creator. Bizarrely, having circumvented the IP protection of others, the AI companies can get IP protection for their own creations.

But instead of rehashing the debate, I want to offer the Minister an opportunity to give reassurances to this Committee and the creative community that the Government are listening to the concerns, and to offer further comments that Members in another place will read before the Bill is debated there—where, incidentally, I hope we might see a shift in the Official Opposition’s position.

The Minister and his colleagues in the other place have been keen to reassure us that any new TDM exception with opt-out or rights reservation would be introduced only once a workable opt-out was found. He and his ministerial colleagues must therefore have some confidence that these systems are at least emerging, so what examples can he provide? To get to a stage of actively promoting a particular option for reform, one must assume that the Government have received assurances that, if that option is in place, AI developers will proactively enter licence agreements for content. Can the Minister say whether such assurances have been received?

Ministers have also accepted that different types of work will need different systems of opt-out. Is the thinking that there will be a phased approach to the introduction, as each different system is agreed? How could that possibly work? Will the Minister offer reassurance that this will not lead to different works having different levels of copyright protection?

The Government have said that any system must be workable. How will that be assessed? On Tuesday, the Minister in the other place said before the DCMS Committee that it would not be a decision just for Ministers; rather, it would be one for them and industry. Can the Minister shed some light on how such a decision on workability might be agreed, and, in particular, give a categorical assurance that rights holders will have a formal role in approval?

The Minister in the other place also talked about the need for ease and accessibility in any new system. The creative industries have argued that the Government’s preferred option would create huge bureaucratic burdens for artists, particularly independent artists and small music labels, who would end up wasting hundreds of hours on paperwork and translating legal jargon rather than, for example, making music or writing books. Can the Minister explain what “easy” and “accessible” look like?

On other areas of the consultation there is more widespread agreement about the need for updating legislation. For example, some AI developers have publicly claimed that they can use temporary copying exemptions as a legal basis for using data for model training without paying. Will the Minister confirm that this is not intended and will be clarified in law? There are strong arguments in favour of changes around metadata, with legislation prohibiting the stripping of rights reservation protocols to help better protect so-called floating content. Again, will such prohibitions be included in any changes to the law?

I know that the Minister agrees on the need for far greater transparency, and the consultation contains proposals to implement some form of transparency mechanism for AI developers to follow, but does he agree that, to be effective, it will need to be transparency that provides a granular level of detail of the works that have been ingested? Without it, there will be no way for those developers to prove compliance with any opt-out. Does he also agree that developers should be required to provide details of the crawlers they have used, coupled with an assurance that the crawlers have been designed to interpret and respect machine-readable rights reservation notices?

On enforcement, transparency will only help to provide evidence of compliance—or non-compliance—with the law. It will not offer a route for creators to receive any form of compensation for the misuse of their works. If those rights holders have to go to court to receive any compensation, how will that move us on from where we are today? Again, the Minister in the other place told the DCMS Committee that he did not think that accessing justice through the courts should be the preserve of deep-pocketed rights holders. I agree. Can the Minister suggest how the Government foresee rights holders being able to access justice, if not through the courts?

Finally, little has been said about how any new law will co-exist with the laws we have now. Will the Minister confirm that any existing infringement would have to be dealt with under existing law? There are understandable concerns among our talented UK creators. I hope that when he responds the Minister will acknowledge those concerns and, in some areas at least, provide some assurances, not least a willingness to reconsider the potentially hugely damaging proposal for a new text and data-mining exception. Without it, we risk sacrificing a known success story—the UK’s £124 billion creative industries—for a leap in the dark.

14:10
Lord Black of Brentwood (Con)

My Lords, I want to address the impact of the Government’s proposed rights reservation model on the media. I declare my interest as deputy chairman of the Telegraph Media Group and note my other interests in the register. I congratulate the noble Lord, Lord Foster of Bath, on securing this debate and on his powerful speech. Hard on the heels of Tuesday’s vote on the data Bill, it presents another opportunity to send a powerful signal that the Government’s preferred option for an opt-out model is deeply flawed and would profoundly damage the whole creative economy.

The crushing onslaught of digital media has impacted every aspect of life but nowhere more acutely than on the media, as advertising revenues, which support quality journalism, have haemorrhaged to the giant, unaccountable tech platforms. The UK’s advertising market was worth more than £36 billion in 2023, but £14 billion of that went direct to Google’s search service alone. By contrast, less than 4% of the value of the entire ad market—yes, 4%—went to news publishers. Media businesses have therefore been in a race against time to find a new business model, but just when many are so successfully doing so, the exponential growth of AI has brought huge new challenges with it, and this proposal will turbocharge that.

The reality of the current media landscape was set out recently by the Economist, which noted that social media had transformed the market by reducing the cost of the distribution of news to zero, and now AI is going to do the same by potentially reducing the cost of generating so-called news to zero. Between the two of them, we are being led into TS Eliot’s “wilderness of mirrors”, where it is impossible to tell the difference between truth and illusion, with profound ramifications for our democracy.

In some ways, it is impossible to reach conclusions about the Government’s opt-out model, as we know so little about it. There has been no impact assessment, and there must be. It is entirely untested and unevidenced, and we cannot learn from other jurisdictions because a working rights reservation regime does not exist anywhere else on the planet. Given the enormous repercussions of this, there must be clarity—and none exists, but of some things we can be certain.

One is that were it even possible to produce a practical and effective opt-out mechanism, and I have severe doubts about that, it would place an immense administrative burden and therefore unsustainable cost on even the largest news publishers. Already, more than 40% of the top 100 English-language news websites do not block any AI crawlers, and they are the ones that have the knowledge and resources to do so. Smaller news publishers, including hard-pressed local media or a freelancer writing on their Substack, simply would not stand a chance. One other point on which we can be certain is that while these proposals may seem attractive to big tech in the short term, over the long term they could end up significantly weakening AI and, as the noble Lord said, we are all pro AI. It has enormous potential but it must be done and dealt with properly.

The problems for AI will spring from the fact that it is totally reliant on large volumes of high-quality data. It needs a sustainable and fresh supply to function—something that is especially true for search engines such as Google’s AI Overviews, which rely on retrieval-augmented generation and feed off up-to-date news content to provide accurate, relevant information. Yet researchers predict that, if current trends continue, AI developers will deplete the available stock of public, human-created text data sometime between 2026 and 2032. It will inevitably be replaced by what? By AI-generated content—in other words, it will feed off itself in a way which will degrade the quality of large language models, as they begin to rely on their own inferior data. It would become a modern-day version of the fabled Greek king Erysichthon, whose hunger—forced on him by the goddess Demeter, I am told—was so insatiable that he squandered his entire fortune and ended up eating himself. That is what could happen with AI.

It need not be like this. There is a way forward that will allow both AI and the original content creators to flourish together: simply by ensuring that the existing copyright laws we have are properly and transparently enforced, with effective mechanisms to build a dynamic licensing market. This would be in the interests not just of content creators, who are so desperate for change after years of copyright theft by the GAI firms; of the public, who overwhelmingly believe that these companies should pay to use the content that trains them; or of the media, whose quality journalism is absolutely vital for our democracy. It would, as I have said, be in the long-term interests of AI, too. If the Government really want to make the UK an AI powerhouse and protect our creative industries, which are the envy of the world and will power growth in future, they must think again. I look forward to hearing from the Minister.

14:17
Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to follow my noble friend Lord Black. I congratulate the noble Lord, Lord Foster, on securing this timely and excellent debate. In doing so, I declare my interests as set out in the register—in particular my technology interests, not least as an adviser to Socially Recruited, which is an AI business.

As the noble Lord, Lord Foster, has already set out, we had an excellent debate on Tuesday night. My question for this afternoon is: how much does it cost to develop and train a foundation model? Is it £500 billion or £5 million? Is it somewhere in between? I do not know, but here is what we do know. The cost of current foundational models is felt by our creatives: the musicians who make sounds where there would otherwise be silence; and the writers who fill a blank page with words that touch our human hearts and souls and, sometimes, change the course of human history. They are paying the cost of the current “model” that we have.

How can it be not only that they are currently footing the cost but that the potential, proposed approach to this issue will put the onus on them to assert their rights? There is that onus, the cost, pressure and stress and, ultimately, the impossibility of doing this with an opt-out model. My first question to the Minister is: can it ever be so that opting out could work? How could it ever bring the certainty, clarity and consistency that we require? As a helpful example, can the Minister say something about the recent LAION case and the light that that throws on this matter?

There is a real tedium to this TDM discussion. It is just that an obvious and irrefutable truth is wilfully ignored and pushed to one side. If you own a copyright or have IP rights, you hold and own those rights. If you do not, the truth is simple and unquestionable: those rights are not yours. That should be the guiding principle when considering any potential approach to IP and copyright in relation not just to AI but to the fact that we have hundreds of years of legal certainty which comes from this.

How would the Minister define a proper and workable model for the preservation of these rights? What would he say to individuals and small entities about the cost, pressure and impossibility of seeking to enforce their rights? How does he intend transparency to be an important thread that runs through this alongside the technical? What about post-ingestion and, if we get to the point of some potential change, what about all that protected material already ingested deep into the engine room of these models?

What attracts businesses, investors and innovators to the UK from a regulatory and legislative perspective? It is certainty, clarity and consistency. In no sense can we say that we have those right now in our country. That is why I believe, not only when it comes to IP and copyright, that given all the issues we are currently grappling with in these new technologies, not least AI, we should have overarching AI legislation and right-sized regulation, which is always good for all elements of our economy and society. Yes, look at IP and copyright, but we should have an AI authority with AI-responsible officers labelling sandboxes and, crucially, a complete transformation of public engagement.

It seems clear at this stage that when it comes to the Government’s plans for IP and copyright in relation to AI, we should all have serious reservations. I go back to that fundamental truth that there is no question, debate, difficulty or complexity. You either have the rights set out at law or you do not. That should inform all discussions and points around IP and copyright. We should have an approach that goes to the heart of this fundamental truth: it is our data. We decide, determine and choose and then, for citizens, consumers and creatives, we have a real opportunity to say positively, with a hashtag, “#OurAIFutures”.

14:23
Lord Freyberg (CB)

My Lords, it is a pleasure to follow the noble Lord, Lord Holmes. I, too, thank the noble Lord, Lord Foster, for initiating this timely debate following Tuesday’s vote on the Data (Use and Access) Bill. As someone with a background in the visual arts and as an artist member of DACS, the Design and Artists Copyright Society, I speak with direct knowledge of these challenges.

The overwhelming majority of creators, whether visual artists, writers, actors or filmmakers, are freelance or self-employed. Recent research from the University of Glasgow’s Centre for Regulation of the Creative Economy reveals a stark crisis: visual artists now earn a median income of just £12,500, a devastating 47% collapse since 2010. Most creators must juggle multiple jobs to survive, and even successful artists are earning only £17,500 annually.

The proposed rights reservation model fundamentally undermines the viability of visual artists’ careers across the country. It does this through a dangerous inversion of copyright principles, principles that creators have long relied upon to secure royalties and safeguard their work. Instead of protecting creators’ existing rights, it imposes costly new burdens requiring them to actively defend protections historically held by default.

As the EU’s AI Act demonstrates, this approach faces insurmountable technical and legal barriers, creating a labyrinth that benefits neither creators nor users. Consider the practical impossibility of an artist enforcing a comprehensive opt-out in our interconnected digital age. Picture a scenario where a museum visitor photographs an opted-out artist’s work and shares it on social media. These platforms routinely permit AI training on user content, inadvertently exposing the artist’s work to the very AI systems they sought to avoid. The artist’s intended opt-out becomes meaningless within seconds of a single smartphone click. This forces creators into an impossible choice: accept unwanted AI training or demand photography bans, unravelling decades of progress in democratising art access.

In today’s digital marketplace, an artist’s online visibility is not merely advantageous but is essential for survival. Their digital presence serves as a virtual gallery, portfolio and business card combined, connecting them with collectors, commissioners and collaborators worldwide. Yet the proposed opt-out system creates an impossible dilemma: how can artists protect their work from AI training without simultaneously vanishing from search engines and potential clients? The distinction between beneficial visibility and unwanted AI scraping becomes a technical impossibility. This challenge is compounded by the breakneck pace of technological change in AI development. Web-crawling technologies evolve almost daily, rendering today’s opt-out mechanisms obsolete tomorrow. More troubling still is the retrospective futility of such measures: countless AI models have already ingested vast archives of artists’ works.

At the heart of the visual arts sector lies not corporations but individuals: freelance artists navigating an already complex professional landscape. The opt-out system would drown these artists in administrative complexity, forcing them to master an ever-shifting maze of technical decisions while trying to create art. This crushing burden falls heaviest on those least equipped: individual creators lacking corporate infrastructure and legal expertise. The system presents a cruel paradox. Artists would need to become experts in rapidly evolving AI technologies just to protect their existing rights. They would be forced to make critical decisions about their creative futures under intense time pressure without adequate information or support. How can we expect individual creators to navigate this labyrinth while simultaneously maintaining their artistic practice and earning a living? The answer is simple: we cannot. This system would create an unsustainable burden that disproportionately impacts the most vulnerable members of our creative community.

Consider a professional photographer capturing thousands of images daily. Each photograph represents a separate copyrighted work, yet these images reside in cloud storage vulnerable to AI scraping. Under a rights reservation system, protecting each image becomes a Sisyphean task, turning a day’s creative output into weeks of administrative burden, as the noble Lords, Lord Black and Lord Foster, rightly highlighted. This inversion of creative priorities is fundamentally flawed. Instead of forcing artists to become full-time guardians of their intellectual property, our systems should empower creation and ensure fair compensation. Responsibility for respecting copyright should rest squarely with AI companies that seek commercially to exploit artists’ work, not with the creators themselves. We must reject any framework that transforms artists from creators into perpetual copyright administrators defending their rights against technological encroachment.

The amendments to the data Bill proposed by the noble Baroness, Lady Kidron, chart the only viable path forward, one that brings fair value and legal certainty to creative industries and tech sectors alike. This Government must not succumb to pressure from US-based tech companies peddling the false promise that gutting copyright protection will somehow enrich Britain.

14:29
Lord Clement-Jones (LD)

My Lords, I congratulate my noble friend Lord Foster of Bath on securing the debate today and on his penetrating introduction, which included a number of extremely important questions for the Minister.

AI clearly has many creative uses, as Sir Paul McCartney himself emphasised last Sunday. But it is one thing to use the tech and another to be at the mercy of it, as so many noble Lords emphasised in their thoughtful but passionate speeches, both on Tuesday and today. So many outside organisations—I thank them for their briefings—have also made that very clear in what they have said.

The use of IP-protected content for training is a key issue, which has also arisen in relation to generative AI models outside the UK. It is rather a delicious irony that OpenAI is now complaining of its own IP being used to train DeepSeek, as my noble friend said. Here in the UK, the Government’s intentions are clear. The new consultation on AI and copyright, reinforced by the AI opportunities plan, has set out a preferred option—this is the key thing—to change the UK’s copyright framework by creating a text and data mining exception where rights holders have not expressly reserved their rights: in other words, an opt-out system.

We all thought this had been put to bed under the last Government, but this Government seem even more intent on creating a Singapore-on-Thames. In response, we have seen the creation of a new campaign across the creative and news industries, Creative Rights In AI Coalition, and Ed Newton-Rex has raised over 37,000 signatures from creators and creative organisations.

Frankly, the creative and news industries are in uproar. As my noble friend Lord Foster says, the proposals were not underpinned by a robust economic case, but the consultation also starts from the false premise of legal uncertainty. As we heard in the debate on the amendment in the name of the noble Baroness, Lady Kidron, on Tuesday, there is no lack of clarity over how AI developers can legally access training data. UK law is clear that commercial organisations, including gen AI developers, must license the data they use to train their large language models. AI developers have already reached agreement with news publishers in a number of cases. OpenAI has signed deals with publishers internationally, such as News Corp, Axel Springer, the Atlantic and Reuters. There can be no excuse of market failure. There are well-established licensing solutions administered by a variety of well-established mechanisms and collecting societies.

The consultation says:

“The government believes that the best way to achieve these objectives is through a package of interventions that can balance the needs of the two sectors”.

But what kind of balance is this when it is all take and no give on the part of creatives? The Government have stated that they will move ahead with their preferred “rights reservation” option only if the transparency and rights reservation provisions are

“effective, accessible, and widely adopted”.

However, as we have heard from across the Room today, no effective rights reservation, no system for the use of content by gen AI models, has been proposed or implemented anywhere in the world, which makes the government proposals entirely speculative. The technology does not exist.

The laws around transparency of these activities have not caught up. At present, developers can scrape content from the internet without declaring their identity, or they may use content scraped for one purpose for the completely different commercial purpose of training AI models. How can rights owners opt out of something they do not know about? Once used to train these models, the commercial value has already been extracted from IP scraped without permission, with no way to delete data from these models.

We need transparency and a clear statement about copyright. We absolutely should not expect artists to have to opt out. AI developers must be transparent about the identity and purposes of their crawlers, and have separate crawlers for distinct purposes. Unless news publishers and the broader creative industries can retain control over their data, this will not only reduce investment in creative output but will ultimately harm innovation in the AI sector and, as we have heard, tech developers will lack the high-quality data that is the essential fuel in generative AI.

Retaining the Kidron amendments to address the challenges posed by AI development, particularly in relation to copyright and transparency, is in my view, and that of those on these Benches, essential. This should apply regardless of in which country the scraping of copyright material takes place if developers market their product in the UK. It is clear that AI developers have used their lobbying clout to persuade the Government that a new exemption from copyright in their favour is required. As a result, the Government seem to have gone soft on big tech. In response, my party, creators, the creative industries and many other supporters will be vigorously opposing Government plans for a new text and data-mining exemption.

The Minister has been posed a number of key questions by my noble friend Lord Foster and many others, including the noble Lord, Lord Black of Brentwood. I put another question to him: will he now agree to withdraw the TDM with an opt-out as the preferred solution? That is one of the key requests of the creative industries; they would be dancing in the streets if the Minister said that today.

14:35
Viscount Camrose (Con)

I thank all noble Lords for their uniformly brilliant contributions to this important debate. I particularly thank the noble Lord, Lord Foster, for securing this debate and introducing it so powerfully. To start with a statement of the obvious: artificial intelligence can do us great good and great harm. I know we are here mainly to avert the latter, but I open with a few thoughts on the former.

I should like to make two points in particular. First, the UK is often said to have a productivity problem and AI, even at its current level of capability, offers a great chance to fix this by automating routine tasks, improving decision-making and streamlining workflows. Secondly, it was often said, since the early days of e-commerce, that innovative use of technology was the preserve of the private sector, whereas the public sector was less nimble and consequently less productive. Those days must soon be over. Some of the best datasets, especially in this country, are public: health, education and geospatial in particular. Safely exploiting them will require close public-private collaboration, but if we are able to do so—and, I stress, do so safely—the productivity rewards will be extraordinary. This is why we, on these Benches, greatly welcome the AI action plan.

AI’s potential to revolutionise how we work and create is undeniable. In the creative industries, we have already seen its impact, with more than 38% of businesses incorporating AI technologies into their operations as of late last year. Whether in music, publishing, design or film, AI offers tools that enhance productivity, enable innovation, and open new markets. However, the key to all these prizes is public acceptance, the key to public acceptance is trustworthiness, and the key to trustworthiness is not permitting the theft of any kind of property, physical or intellectual.

This brings us to copyright and the rights of creators whose works underpin many of these advances. Copyright-protected materials are often used to train AI systems, too often without the permission, or even knowledge, of creators. Many persuasive and powerful voices push for laws, or interpretations of laws, in this country that prevent this happening. If we are able to create such laws, or such interpretations, I am all for them. I am worried, however, about creating laws we cannot enforce, because copyright can be enforced only if we know it has been infringed.

The size and the international distribution of AI training models render it extremely challenging to answer the two most fundamental questions, as I said on Tuesday. First, was a given piece of content used in a training model? Secondly, if so, in what jurisdiction did this take place? An AI lab determined to train a model on copyrighted content can do so in any jurisdiction of its choice. It may or may not choose to advise owners of scraped content, but my guess is that for a large model of 100 billion parameters, the lab might not be as assiduous in this as we would like. So, enforcement remains a significant challenge. A regulatory framework that lacks clear, enforceable protections risks being worse than ineffective in practice: it risks creating false confidence that eventually kills trust in, and public acceptance of, AI.

So, although we welcome the Government’s decision to launch a public consultation to address these challenges, it is vital that it leads to an outcome that does three things. First, needless to say, it must protect products of the mind from unlawful exploitation. Secondly, it must continue to allow AI labs to innovate, preferably in the UK. Thirdly, it must be enforceable. We all remember vividly Tuesday’s debate on Report of the DUA Bill. I worry that there is a pitfall in seeing AI and copyright policy as a zero-sum struggle between the first two of those objectives. I urge noble Lords, especially the Minister, to give equal emphasis and priority to all three of those goals.

I shall close with a few words on standards. As the Minister has rightly recognised, the key to an enforceable regime is internationally recognised technical standards, particularly, as I have argued, on digital watermarks to identify copyrighted content. A globally recognised, machine-readable watermark can alert scraping algorithms to copyrighted materials and alert rights holders to the uses of their materials. It may even allow rights holders to reserve their rights, opt out automatically or receive royalties automatically. In Tuesday’s debate, I was pleased to hear the Minister confirm that the Government will consider such standards as part of the consultation response.

Of course, the challenge here is that any such standards are—this is the bluntest possible way I can put it—either internationally observed and accepted or pointless. In this country, we have an opportunity to take the lead on creating them, just as we took the lead on setting standards for frontier AI safety in 2023 at Bletchley Park. I urge the Minister to strain every sinew to develop international standards. I say now that I and my party are most willing to support and collaborate on the development of such standards.

14:42
Lord Vallance of Balham Portrait The Minister of State, Department for Science, Innovation and Technology (Lord Vallance of Balham) (Lab)
- Hansard - - - Excerpts

My Lords, I thank the noble Lord, Lord Foster, for introducing this debate and everyone who contributed. Clearly, several of the amendments that we discussed earlier in the week have been touched on in one form or another in today's debate. The fact that those amendments were voted through demonstrates the intensity of noble Lords' passion for and interest in this topic; of course, that is recognised. Because I was asked this question directly, I acknowledge clearly that I understand the importance of these issues and the concerns of the creative industries and, as the noble Lord, Lord Black, mentioned, the media sector.

In some ways, what we have discussed today speaks directly to the question of whether we need a consultation. On 17 December, we published a consultation that seeks to deliver a competitive copyright regime and a package of measures that support our creative industries and the AI sector. I do not want to sound like a broken record, but the proposals aim to deliver three objectives, and I agree with the way the noble Viscount, Lord Camrose, framed them. The three objectives that we have put forward are: transparency about the use of copyrighted works to train AI models and about AI-generated content; greater control for rights holders over their material, so that they can be supported in protecting it and can be remunerated where it is used—again, I say that the aim here is quite the opposite of theft: it is to give more control; and enhanced lawful access to the material used to train world-leading AI models.

I reiterate what I said on Tuesday: this is a genuine consultation, and many people from a range of sectors are engaging to share their views and evidence. The Government continue to believe that it is important that we have the benefit of that public consultation before we act. A central issue that the noble Lord, Lord Foster, set out in his Question is how to make sure that rights holders can easily reserve their rights and control the use of their material. These are the challenges that rights holders face today. Although they may have copyright on their work, they are often unable in practice to control how it is used or to gain remuneration. This is often particularly true for new or solo artists, the very people we need to protect, a point that the noble Lord, Lord Holmes, and others made.

The rights reservation model proposed in the consultation aims to enhance rights holders’ ability to withdraw their content from being used. It would support their ability to license this content for use with AI if they wish to do that. To do this, we will need the right blend of technology and regulation, and the consultation seeks views on how this should be achieved. Importantly—many noble Lords raised this point—this model would have to be simple, effective and accessible for rights holders of all sizes, something that, frankly, is not available in the current position. The Government have been clear that we will not proceed with this model unless we are confident that these criteria will be met.

On transparency, we want to consider how to achieve this broadly, ensuring that rights holders understand how and where their content is used, while also ensuring any measures are not disproportionate for small businesses and individuals.

On our third objective, access, for all the reasons that the noble Viscount, Lord Camrose, gave, we want to ensure that there is a system in place that allows AI developers to access the high-quality material they need to train world-leading models in the UK. We want that access to be without uncertainty and without legal liability slowing down investment and adoption.

These are undoubtedly complex issues, and we need to strike the right balance to ensure that we are able fully to benefit from AI and guarantee the success of our world-leading creative industries. This is why we are asking about all these elements in the consultation.

The question asked by the noble Lord, Lord Foster, raises important issues about the impacts on creators and our assessment of these impacts. This was also something mentioned in the debate earlier in the week. I reassure noble Lords that gathering further economic impact evidence is one of the main reasons for conducting a full consultation, but it is also worth pointing out that, alongside our proposals, we published a 22-page summary options assessment setting out our initial analysis of the proposals that we have put forward, so it is not correct that there has been no options impact appraisal. This options assessment received a green rating from the independent Regulatory Policy Committee. It recognises, however, that quantitative evidence is currently limited in this area and highlights areas where the Government hope to receive further data during the course of the consultation.

The options assessment sets out the expected impacts of different options and assesses them against those three objectives in the consultation: control, access and transparency. The assessment does not provide detailed data on economic impact, as publicly available evidence in this area is currently rather limited. It is important that we let the consultation run its course so that we can gather evidence of impacts on the full range of affected parties. We are particularly keen for respondents to the consultation to provide further economic evidence to inform how we achieve our objectives. To answer, at least in part and without there being singing in the streets, the question from the noble Lord, Lord Clement-Jones: depending on the evidence we receive through the consultation, we will revise, update and expand on the assessment of the options and better determine how we move forward with any potential legislative change. Acting without this would risk imposing legislation that does not have the intended effects.

Alongside our analysis, the Government of course continue to consider a broad range of external studies to assess AI’s economic impact. Modelling the potential economic impact of AI is complicated, and there are several external studies on this. We know that it is complicated, as we have seen just this week with the entry of DeepSeek and how that may change many of the things we think about, but AI adoption has the potential to drive growth across the economy, including, as many noble Lords mentioned, in the creative industries, where more than 38% of creative industry businesses have used AI technologies as of September 2024, with nearly 50% using AI to improve business operations. Earlier this week, I attended the launch of the Institute for the Future of Work’s report into the future of work and well-being, which looks at the impact of AI on work and well-being in all sectors. The Government have considered this external evidence alongside our internal analysis to inform our approach to AI and will continue to do so.

I will now move on to a few other areas. In passing, I agree with the noble Lord, Lord Black, that the question of truth in the age of AI is crucial. We are in an era where this is increasingly difficult; it is the first wave of the AI challenge. It is crucial for everybody in society and, of course, for the media. Technology will play an important part in delivering greater rights holder control. The Government are clear that any solutions need to be effective, proportionate and accessible to all parties of all sizes, and they must be easy to use. Again, I want to reassure noble Lords that we do not intend to go forward with this approach until we are confident that this is the case.

The noble Lord, Lord Foster, asked whether anything is already available. Things are available; they are not good enough yet but coming along very fast. I know from my time as chair of the Natural History Museum, where we looked after vast amounts of data of huge potential value, that we had ways to try to block people getting hold of it. Things are available now but they need to be better; they also need to be simpler and usable by the individual.

The consultation recognises that more detailed work needs to be done, and an important function of the consultation is to help us work through this detail. A number of industry initiatives are already under way to deliver effective standards. As has been mentioned, these standards—international and national—will be crucial. These efforts, combined with careful regulation, will make it possible to deliver workable rights reservation tools, and a reimbursement mechanism that, again, should be easy to operate and not available only to the largest players or by going to court.

As noble Lords have raised it during the passage of the data Bill, I reiterate the central importance of transparency in the way that creative content is used. The use of web crawlers, metadata and watermarks as different forms of technological solutions could have a number of benefits for those who wish to control or license the use of their content with AI and could provide the very basis for a rights reservation tool.

We agree that a key issue to be addressed is the role of some web crawlers that are used to obtain content for AI training. However, it is important to recognise that web crawlers are used for different purposes, the most familiar being indexing online content so that it can be found by a search engine. Standards on the use of web crawlers may also be important to improve the ability of rights holders to prevent the use of their work against their wishes.

I spoke about workability, and several noble Lords made it clear that it must mean workability for the creative sector and creatives, as well as for others. The noble Lord, Lord Foster, asked about the temporary copy issue. We have asked about that in the consultation.

To conclude, I again thank noble Lords for contributing to this debate. They can rest assured that the Government understand the strongly held and legitimate concerns that creators and rights holders have about the use of their content. We also agree that transparency is fundamental. However, it would be wrong to commit to specific legislation while the Government's consultation is ongoing. Indeed, we should and must consider stakeholders' responses fully and progress our package of objectives together.

We will consider all the points raised by noble Lords today and during the passage of the Bill. We will do this alongside the responses and evidence received as part of the consultation, before bringing further proposals. I end on the specific point raised by the noble Lord, Lord Holmes, on the LAION case, which is under German law. I will ask the IPO to give him a full answer on that.

14:54
Sitting suspended.