Artificial Intelligence (Regulation) Bill [HL]

2nd reading
Friday 22nd March 2024


Lords Chamber
Artificial Intelligence (Regulation) Bill [HL] 2023-24

Lord Freyberg (CB)

My Lords, I too am very grateful to the noble Lord, Lord Holmes of Richmond, for introducing this important Artificial Intelligence (Regulation) Bill. In my contribution today, I will speak directly to the challenges and threats posed to visual artists by generative AI and to the need for regulatory clarity to enable artists to explore the creative potential of AI. I declare my interest as having a background in the visual arts.

Visual artists have expressed worries, as have their counterparts in other industries and disciplines, about their intellectual property being used to train AI models without their consent, credit or payment. In January 2024, lists containing the names of more than 16,000 non-consenting artists whose works were allegedly used to train the Midjourney generative AI platform were accidentally leaked online, intensifying the debate on copyright and consent in AI image creation even further.

The legality of using human artists’ work to train generative AI programmes remains unclear, but disputes over documents such as the Midjourney style list, as it became known, provide insight into the real procedures involved in turning copyrighted artwork into AI reference material. These popular AI image-generator models are extremely profitable for their owners, the majority of whom are situated in the United States. Midjourney was valued at around $10.5 billion in 2022. It stands to reason that, if artists’ IP is being used to train these models, it is only fair that they be compensated, credited and given the option to opt out.

DACS, the UK’s leading copyright society for artists, of which I am a member, conducted a survey that received responses from 1,000 artists and their representatives, 74% of whom were concerned about their own work being used to train AI models. Two-thirds of artists cited ethical and legal concerns as a barrier to using such technology in their creative practices. DACS also heard first-hand accounts of artists who found that working creatively with AI brings its own set of difficulties. One artist made a work that included generative AI and wanted to distribute it on a well-known global platform; the platform did not want the liabilities associated with an unregistered product, so it asked for the AI component to be removed. If artists are deterred from using AI or face legal consequences for doing so, creativity will suffer. There is a real danger that artists will miss out on these opportunities, which would worsen their already precarious financial situation and challenging working conditions.

In the same survey, artists expressed fear that human-made artworks will have no distinctive or unique value in the marketplace in which they operate, and that AI may thereby render them obsolete. One commercial photographer said, “What’s the point of training professionally to create works for clients if a model can be trained on your own work to replace you?” Artists rely on IP royalties to sustain a living and invest in their practice. UK artists are already low-paid and two-thirds are considering abandoning the profession. Another artist remarked in the survey, “Copyright makes it possible for artists to dedicate time and education to become a professional artist. Once copyright has no meaning any more, there will be no more possibility to make a living. This will be detrimental to society as a whole”.

It is therefore imperative that we protect artists’ copyright and provide fair compensation when their works are used to train artificial intelligence. While the Bill references IP, artists would have welcomed a specific clause on remuneration and an obligation for owners of copyright material used in AI training to be paid. To that end, it is critical to maintain a record of every work that AI applications use, particularly to validate the original artist’s permission. There is currently no legal requirement to disclose the content on which AI systems are trained. Record-keeping requirements are starting to appear in regulatory proposals related to AI worldwide, including those from China and the EU.

The UK ought to adopt a similar mandate requiring companies using material in their AI systems to keep a record of the works they have ingested. To differentiate AI-generated images from human-made works, the Government should make sure that any commercially available AI-generated works are labelled as such. As the noble Lord, Lord Holmes, has already mentioned, labelling shields consumers from false claims about what is and is not AI-generated. Furthermore, given that many creators work alone, every individual must have access to clear, appropriate redress mechanisms so that they can meaningfully challenge situations where their rights have been misused. I therefore welcome the Bill’s requirement that the use of any training data be preceded by informed consent. This measure will go some way to safeguarding artists’ copyright and providing them with the necessary agency to determine how their work is used in training, and on what terms.

In conclusion, I commend the noble Lord, Lord Holmes, for introducing this Bill, which will provide much-needed regulation. Artists themselves support these measures, with 89% of respondents to the DACS survey expressing a desire for more regulation around AI. If we want artists to use AI and be creative with new technology, we need to make it ethical and viable.