Creative Industries: Rights Reservation Model

Thursday 30th January 2025

Grand Committee
Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to follow my noble friend Lord Black. I congratulate the noble Lord, Lord Foster, on securing this timely and excellent debate. In doing so, I declare my interests as set out in the register—in particular my technology interests, not least as an adviser to Socially Recruited, which is an AI business.

As the noble Lord, Lord Foster, has already set out, we had an excellent debate on Tuesday night. My question for this afternoon is: how much does it cost to develop and train a foundation model? Is it £500 billion or £5 million? Is it somewhere in between? I do not know, but here is what we do know. The cost of current foundational models is felt by our creatives: the musicians who make sounds where there would otherwise be silence; and the writers who fill a blank page with words that touch our human hearts and souls and, sometimes, change the course of human history. They are paying the cost of the current “model” that we have.

How can it be that not only are they currently footing the cost but the proposed approach to this issue would put the onus on them to assert their rights? There is the onus, the cost, the pressure and the stress and, ultimately, the impossibility of doing this under an opt-out model. My first question to the Minister is: can opting out ever work? How could it ever bring the certainty, clarity and consistency that we require? As a helpful example, can the Minister say something about the recent LAION case and the light that it throws on this matter?

There is a real tedium to this text and data mining (TDM) discussion: an obvious and irrefutable truth is wilfully ignored and pushed to one side. If you own a copyright or have IP rights, you hold and own those rights. If you do not, the truth is simple and unquestionable: those rights are not yours. That should be the guiding principle when considering any potential approach to IP and copyright in relation to AI, not least because we have hundreds of years of legal certainty that flows from it.

How would the Minister define a proper and workable model for the preservation of these rights? What would he say to individuals and small entities about the cost, pressure and impossibility of seeking to enforce their rights? How does he intend transparency to be a thread that runs through this alongside the technical measures? And what about post-ingestion: if we get to the point of some change, what about all the protected material already ingested deep into the engine room of these models?

What attracts businesses, investors and innovators to the UK from a regulatory and legislative perspective? It is certainty, clarity and consistency. In no sense can we say that we have those right now in our country. That is why I believe, given all the issues we are currently grappling with in these new technologies, not least AI, and not only when it comes to IP and copyright, that we should have overarching AI legislation and right-sized regulation, which is always good for all elements of our economy and society. Yes, look at IP and copyright, but we should also have an AI authority, AI-responsible officers, labelling, sandboxes and, crucially, a complete transformation of public engagement.

It seems clear at this stage that when it comes to the Government’s plans for IP and copyright in relation to AI, we should all have serious reservations. I go back to that fundamental truth that there is no question, debate, difficulty or complexity. You either have the rights set out at law or you do not. That should inform all discussions and points around IP and copyright. We should have an approach that goes to the heart of this fundamental truth: it is our data. We decide, determine and choose and then, for citizens, consumers and creatives, we have a real opportunity to say positively, with a hashtag, “#OurAIFutures”.

Data (Use and Access) Bill [HL]

Moved by
38: Clause 90, page 113, line 15, at end insert “in accordance only with the Commissioner’s duties under section 108 of the Deregulation Act 2015 (exercise of regulatory functions: economic growth).”
Member’s explanatory statement
This amendment ensures that the Commissioner’s duty to have regard to the desirability of promoting innovation is referable only to the duty imposed under section 108 of the Deregulation Act 2015. This amendment seeks to ensure that the Commissioner’s status as an independent supervisory authority for data protection is preserved given that such status is an essential component of any EU adequacy decision.
Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to open the second day on Report on the Data (Use and Access) Bill. In doing so, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business. In moving Amendment 38 in my name, I will not speak to any other amendments in this group.

Amendment 38 goes to the heart of the issue du jour: regulators have seldom been so much in the press and in the public eye. As the press would have it, they were hauled into No. 11 just a few days ago, but this speaks to what we want from our regulators across our economy and society. At their best, our regulators are the envy of the world. Just consider the FCA and the fintech regulatory sandbox: as a measure of its success, it has been replicated in well over 50 jurisdictions around the world.

We know how to do right-sized regulation and how to set up our regulators to succeed at that most difficult of tasks—balancing innovation, economic growth, and consumers’ and citizens’ rights. That is what all regulators should be about. It is not straightforward; it is complex but entirely doable.

Amendment 38 simply proposes wording to assist the Information Commissioner’s Office. When it comes to the economic growth duty—“#innovation”—it simply refers back to Section 108 of the Deregulation Act 2015. I believe that bringing this clarity into the Bill will assist the regulator. It will enable all the conversations that are rightly going on right now, and all the plans being produced and reported on, such as those around AI, to be properly discussed and given proper context, with an Information Commissioner’s Office that is supported through clarity as to its responsibilities and obligations when it comes to economic growth. In simple terms, those responsibilities would be restricted to, and clearly set out in accordance with, Section 108 of the 2015 Act. That is critical if we are to have clarity around the commissioner’s independence as a supervisory authority for data protection, an absolutely essential condition for EU adequacy decisions.

I look forward to the Minister’s response. I hope that he likes my drafting. I hope that he will accept and incorporate my amendment into the Bill. I look forward to the debate. I beg to move.

Lord Clement-Jones (LD)

My Lords, I rise to support Amendment 38 in the name of the noble Lord, Lord Holmes. More than ever before, the commissioner, alongside other regulators, is being pressured to support the Government’s growth and innovation agenda. In Clause 90, the Bill places unprecedented obligations on the ICO to support innovation. The question, in respect of both the existing growth duty and Clause 90, is whether they are in any sense treated as overriding the ICO’s primary responsibilities in data protection and information rights. How does the ICO aim to balance those duties, ensuring that its regulatory actions support economic growth while maintaining necessary protections?

We need to be vigilant. As it is, there are criticisms of the way the Information Commissioner’s Office carries out its existing duties. Those criticisms can be broadly categorised as issues with enforcement, independence and the balancing of competing interests. The ICO has a poor record on enforcement; it has been reluctant to issue fines, particularly to public sector organisations, and, as I described in Committee, it has relied heavily on reprimands rather than stronger enforcement action. It has also been accused of being too slow with its investigations.

There are concerns about these new duties, which could pose threats to the ability of the Information Commissioner’s Office to effectively carry out its primary functions. For that reason, we support the amendment from the noble Lord, Lord Holmes.

--- Later in debate ---
The Minister of State, Department for Science, Innovation and Technology (Lord Vallance of Balham) (Lab)

I thank the noble Lord, Lord Holmes, for his Amendment 38 relating to the ICO’s innovation duty. I agree with his comments about the quality of our regulators.

I reiterate the statements made throughout the Bill debates that the Government are committed to the ongoing independence of the ICO as a regulator and have designed the proposals in the Bill with retaining EU adequacy in mind. The commissioner’s status as an independent supervisory authority for data protection is assured. The Information Commissioner has discretion over the application of his new duties. It will be for him to set out and justify his activities in relation to those duties to Parliament.

To answer the specific point, as well as that raised by the noble Lord, Lord Clement-Jones, considerations of innovation will not come at the expense of the commissioner’s primary objective to secure an appropriate level of protection for personal data. I hope that reassures the noble Lord.

Lord Holmes of Richmond (Con)

I thank all noble Lords who have taken part in this short debate and thank the Minister for his response. I believe my wording would assist the ICO in its mission, but I have listened to what the Minister has said and, for the time being, I beg leave to withdraw the amendment.

Amendment 38 withdrawn.
--- Later in debate ---
Baroness Cavendish of Little Venice (CB)

My Lords, I congratulate my noble friend on a barnstorming speech.

Many of the points that I wanted to make have already been made by others, so I will be brief. I declare my interest as a rights holder. I am slightly worried that this is beginning to sound like special pleading, and I hope that is not the effect it has. I am also the daughter of two writers, and I recognise that £1.76, because sometimes that was it. That £1.76, as the noble Lord has just said, is a contract. There are many artists, musicians and writers in this country who get money for their books in libraries or tiny amounts of royalties, and those royalties are keeping them alive. They enable them to create original work and earn their living.

I believe that generative AI will be transformational and largely for the good. However, it is perfectly possible to distinguish between meaningful progress that advances humanity—we heard in an earlier debate about AI tracking naval ships, and brilliant advances are being made in medicine—and plain theft of intellectual property. That theft has been going on now for several years, and the people who are being stolen from are not even aware that their work has been stolen.

For that reason, I do not actually believe it is necessary to seek a balance. This is not about balance; it is about implementing and upholding the rule of law. The proposed rights reservation from the Government would reverse the fundamental principle of UK copyright law, which, as others have said, was established in 1710—I think it was 1710, not 1709, but we may differ. My mother wrote the Handbook of Copyright in British Publishing Practice in 1974, so I have some visceral memory of all this. The Government are proposing to reverse the fundamental protections that have made us a gold standard in the world. The amendments propose to make UK copyright law enforceable in an age of generative AI—to respond to, and expand, our laws, in what is in my view an extremely proportionate way, to recognise the rights of creators.

We have all learned something in this debate that is astonishing to me: apparently the Government have not conducted an economic impact assessment of their proposals on one of our most successful industries. I find that completely shocking. It suggests a lack of seriousness on the part of this Government and those who are making these proposals, which I hope the Minister will address later.

If artists, musicians and creators cannot earn a living, there will be no original content and no more content for AI to build on. That is surely in itself an economic argument that somewhat undermines the vague idea that innovation cannot happen without the wholesale abolition of our proud tradition of copyright. Chris Bryant said last night that something must change and that we cannot do nothing. I agree, but what we must do is double down.

Lord Holmes of Richmond (Con)

My Lords, I support these amendments and the noble Baroness, Lady Kidron. Not to do so would be, to quote some of her earlier work, beyond the edge of reason.

I support the noble Baroness because I support creatives. They are the individuals who bring such sweet sound where otherwise there would be silence, who fill a blank page with words that can move our hearts, our souls and our minds, and can change the course of history. I support the amendments because I support the rule of law. IP and copyright are well established over centuries.

This is not complex or controversial. There is an extraordinary tedium to the whole question of TDM. Ultimately, I could do this in three words when addressing big tech: “It’s not yours. Take your audacious hands off other people’s work”. And that is from someone who is pro-innovation, pro-AI and pro-technology—but in a way where there is a negotiation and agreed conclusion as to how artists, rights holders and creatives want to engage with these technologies.

We have already heard many times, rightly, that there has been no economic impact assessment. I ask the Minister for his views on that. While on that subject, I ask him, out of genuine interest, what is the genesis of the £400 billion figure in the AI opportunities plan? Where does it come from, what is it based on and how does it sit against the impact that not acting will have on our creative sector?

I support these amendments, and I urge everyone in your Lordships’ House to do so. To misquote the late, great Dennis Potter, “Vote, vote, vote for Beeban Kidron”.

Baroness Butler-Sloss (CB)

My Lords, I have come specifically to the debate on this part of the Bill to support these amendments. I regret that I have not taken part in any other part of the Bill, but this subject is so important that I have come—and I shall speak briefly, because I support what everyone else has said.

I am coming from a totally different angle. As a judge, I tried these cases, and they worked perfectly well. We never had a problem in coming to a decision on copyright or intellectual property. I did not do very many, but I sat with judges who did it all the time. I am absolutely astonished that the Government are setting aside long-established law; whether it goes back to 1709 or 1710—whether it is the noble Baroness, Lady Cavendish, or the noble Earl, Lord Devon, who is right—I do not think matters. The point is that it goes back a long way, and it works. Why are the Government setting it aside instead of strengthening it, for all the reasons that have been given so far?

I wonder whether, in the absence of an impact assessment, the Government have put their mind to what is going to happen on the ground, and not just with regard to the £1.76. Is the £128 billion going to exist to go into the coffers of the Treasury? I suspect that, whatever they think they are going to make, no one on the government Benches has thought about what they are going to lose. Basically, I am asking the Government to sit back, think again and reflect with the greatest possible care on the brilliant speech of the noble Baroness, Lady Kidron, and the unanimity across this House. Having been in this place for many years, I cannot remember another occasion on which I have not heard a single voice supporting the Government. Are the Government going to listen to that?

--- Later in debate ---
Moved by
47: After Clause 107, insert the following new Clause—
“Data use: defences to charges under the Computer Misuse Act 1990
(1) The Computer Misuse Act 1990 is amended as follows.
(2) In section 1, after subsection (3) insert—
“(4) It is a defence to a charge under subsection (1) to prove that—
(a) the person’s actions were necessary for the detection or prevention of crime, or
(b) the person’s actions were justified as being in the public interest.”
(3) In section 3, after subsection (6) insert—
“(7) It is a defence to a charge under subsection (1) in relation to an act carried out for the intention in subsection (2)(b) or (c) to prove that—
(a) the person’s actions were necessary for the detection or prevention of crime, or
(b) the person’s actions were justified as being in the public interest.””
Member’s explanatory statement
This amendment updates the definition of “unauthorised access” in the Computer Misuse Act 1990 to provide clearer legal protections for legitimate cybersecurity activities.
Lord Holmes of Richmond (Con)

My Lords, in moving Amendment 47, I shall speak also to Amendment 48.

Here we are again: the Computer Misuse Act 1990 is another year older. It was put into statute at a time when technology looked nothing like it did 10 or 20 years ago, never mind today. I will give some brief facts. We have a fantastic cyber sector in our country, which adds so much to our economy and safety. The Computer Misuse Act stops the sector from keeping us as safe as it might, and it constrains businesses in their growth and in what they could be adding to our economy today in terms of—yes—growth.

There is no reason for us to continue with the Computer Misuse Act as it stands when we have the solution in our hands, set out, I suggest, in Amendments 47 and 48. Our cybersecurity professionals, often working well out of sight for obvious reasons, do such important work, professionally and diligently keeping us safe and keeping our country, assets and economy secure.

When the Minister responds, will he say, even sotto voce, that a Division on these amendments might help him in his discussions within the department to get some movement on this issue? We heard in previous debates how doing this would be premature and how the time was not now. Well, for a statute that came into being at the beginning of the 1990s, I suggest that it is high time that we made these amendments for individuals, for businesses, for our economy and for our society, in an extraordinarily uncertain world and at a time when I imagine that every Minister should be looking to every potential source of economic growth. I look forward to the debate and to the Minister’s response. I beg to move.

Lord Arbuthnot of Edrom (Con)

My Lords, in Committee, the noble Baroness the Minister said there was no consensus on the best way forward to amend the law to provide protection for ethical hackers trying to work against cybercrime. All I ask is that noble Lords should read the amendment, which says:

“It is a defence to a charge … to prove that … the person’s actions were necessary for the detection or prevention of crime or … the person’s actions were justified as being in the public interest”.


What on earth could be wrong with that? I support my noble friend Lord Holmes of Richmond.

--- Later in debate ---
Lord Vallance of Balham (Lab)

I am grateful to the noble Lord, Lord Holmes, for raising this topic through Amendments 47 and 48. I am very aware of this issue and understand the strength of feeling about reforming the Computer Misuse Act, as we have heard from the noble Lord, Lord Arbuthnot, and the noble Earl, Lord Erroll.

As the noble Lord, Lord Clement-Jones, rightly pointed out, when I was the Government Chief Scientific Adviser I conducted a review of pro-innovation regulation of technologies and made recommendations on the issues these amendments raise. Those recommendations were accepted by the previous Government.

The Government are actively taking forward these recommendations as part of the Act’s ongoing review. These issues are, of course, complex and require careful consideration. The introduction of these specific amendments could unintentionally pose more risk to the UK’s cybersecurity, not least by inadvertently creating a loophole for cybercriminals to exploit to defend themselves against a prosecution.

Our engagement with stakeholders has revealed differing views, even among industry. While some industry partners highlight the noble Lord’s view that the Computer Misuse Act may prevent legitimate public interest activity, others have concerns about the unintended consequences. Law enforcement has considerable concerns that allowing unauthorised access to systems under the pretext of identifying vulnerabilities could be exploited by cybercriminals. Without robust safeguards and oversight, this amendment could significantly hinder investigations and place a burden on law enforcement partners to establish whether a person’s actions were in the public interest.

Further work is required to consider the safeguards that would need to accompany any introduction of statutory defences. The Government will continue to work with the cybersecurity industry, the National Cyber Security Centre and law enforcement agencies on this issue. The Home Office will provide an update in due course, once the proposals have been finalised—or, in the words of the noble Lord, Lord Clement-Jones, they will pop out of the bowels of the Home Office in due course. With these reassurances in mind, I hope the noble Lord will feel able to withdraw his amendments.

Lord Holmes of Richmond (Con)

My Lords, I thank everybody who has taken part in this short debate. I was really hoping that we would not hear the phrase “the bowels of the Home Office” twice, but we did—now we have heard it three times. Perhaps it could be the title of somebody’s autobiography. I do not know whose, but I claim the IP rights even though the noble Lord, Lord Clement-Jones, said it first.

I am grateful for the Minister’s response. It would probably have been better to have some sense of timeline; much of what he said was very much what we heard in Committee. We are all amenable to having a course of action, but it needs clearer objectives attached to it as to when we are likely to see some consequences, action and change. As every day goes by, as the Minister is well aware, risks go unchecked that could be checked, people are less safe who could be made safe and economic growth, the Government’s priority, is prevented where it could be enabled.

For now, I will withdraw my amendment, but I am minded to see what is possible between now and Third Reading, because the time is now; otherwise, “in due course” will be even longer than the official statement “later in the summer”. I beg leave to withdraw.

Amendment 47 withdrawn.

Data (Use and Access) Bill [HL]

Moved by
59: After Clause 132, insert the following new Clause—
“Data use: review of large language models
(1) On the day on which this Act is passed, the Secretary of State must launch a review to consider the introduction of standards for the input and output of data of large language models which operate and generate revenue in the United Kingdom.
(2) The review must consider—
(a) the applicability of similar standards, such as those that already exist in industries such as pharmaceuticals, food and drinks,
(b) whether there is a need for legislative clarity under section 27 of the Copyright, Designs and Patents Act 1988 about whether the input and output of large language models constitute an “article”, and
(c) whether a minimum standard should be a condition for market access.”
Lord Holmes of Richmond (Con)

My Lords, in moving Amendment 59, I shall also speak to Amendments 60 and 66 in my name.

AI has been a recurrent theme running through most, if not all, our discussions on the Bill, because it is utterly absent from the Bill. It seems extraordinary: what is AI without data, and what is a data Bill without AI being considered? It is difficult to see how we will have the clarity, consistency and coherence of approach to address the opportunities and challenges of all these new technologies, not least artificial intelligence, when it has remained absent from the Bill by government design.

Amendment 59 asks about the categorisation and classification of large language models in the UK in terms of the data input to, and output from, those models. Will the Minister specifically address his comments on this amendment to the issues around Section 27 of the Copyright, Designs and Patents Act 1988 and how that interacts with the needs of LLMs, whether there should be conditions around market access for these large tech companies and whether LLMs in themselves constitute an “article” under the 1988 Act?

If AI is absent from the Bill by government design, perhaps even more curiously, data centres are largely absent. If AI is nothing without data, what is data without data centres? They are the factories and foundries fuelling this new fourth industrial revolution. Data has often been described as the new oil; I suggest that it is nothing of the sort, but we need so much actual new oil—that is, renewables and small modular reactors (SMRs)—if we are to power this fourth industrial revolution, not least the data centres therein.

Amendment 60 looks at the current supply of data centres. Is the Minister satisfied not just with how quickly the Government plan to have data centres coming onstream but with how possible it is for them to be in places where they can be hooked up to the grid—not just to existing fuels but, crucially, to renewables and potentially SMR technologies—which will absolutely be required if this fourth industrial revolution is to be not only efficient and effective but sustainable?

--- Later in debate ---
Lord Vallance of Balham (Lab)

My Lords, I thank the noble Lord, Lord Holmes, for his amendments on reviews of and consultations on large language models and data centres. First, on Amendment 59, as we have discussed in some detail, the Government are conducting their consultation on copyright and AI. This will consider issues relating to transparency of creative content in both input and output of AI. This would apply not just to large language models but to other forms of AI. Questions on the wider copyright framework are also included in the consultation, including the issue of models trained in other jurisdictions, importation and enforcement provisions.

A review of large language models, as required by this amendment, as well as the consideration of the specific provisions of copyright law, would prejudge the outcome of that consultation. I might even go so far as to say to noble Lords that the consultation and the process around it are, in a sense, the very review that this amendment seeks—or at least a range of ways to address these important issues may be suggested through that consultation, which might be more effective than a further review. I also remind noble Lords about the AI Safety Institute, which, of course, has a duty to look at some of the safety issues around these models.

I reassure noble Lords that we welcome those suggestions and will carefully consider which parts of the copyright framework would benefit from amendment. I reiterate that the proposals the Government have put forward on copyright and AI training will not affect the wider application of copyright law. If a model were to output a creator’s work without their permission, rights holders would be able to take action, as they are at present.

On Amendment 60, as the Prime Minister laid out as part of the AI opportunities action plan, this Government intend to secure more data centre capacity and ensure that it is delivered as sustainably as possible. Noble Lords will also have noted the investment targeted towards data centres that followed the investment summit. The Government are committed to ensuring that any negative impact of data centres is, where possible, minimised and that sustainability is considered. The noble Lord may well be aware of the creation of the AI energy council, which will be led by the Secretaries of State for DSIT and DESNZ. That will consider current energy requirements and, of course, future energy requirements, including things such as SMRs. The Government recognise the aim of this amendment, but we do not feel this Bill is the place to address the issue. The accompanying notes to the Bill will detail its environmental impacts.

Amendment 66 calls for a consultation on data centre power usage. The UK has committed to decarbonising the electricity system by 2030, subject to security of supply, and data centres will increasingly be powered by renewable energy resources. The first data centre site has been identified as Culham. Why is it there? It is because the UK Atomic Energy Authority has a very large power supply, with some 100 megawatts of electricity supply available. That will need to increase to something closer to 500 megawatts. How we will select other data centre sites will depend on where there is power and an appropriate ability to put those sites. Noble Lords can expect them to be distributed around the UK. The sector operates under a climate change agreement, to encourage greater uptake of energy-efficiency measures among operators.

Data centres themselves, of course, play a major part in powering the high-tech solutions to environmental challenges, whether that is new tech that increases the efficiency of energy use across towns and cities or development and application of innovative materials and new technologies that take carbon out of the atmosphere. The energy efficiency of data centres themselves is improving with new technologies and will continue to do so. Perhaps that was one of the features of the announcement of DeepSeek—exactly how that might advance rather rapidly. Closed-loop cooling, energy-efficient hardware, heat reuse and hot/cold aisle containment are already having an effect on the energy consumption and output of data centres.

The Government continue to monitor the data centre industry and are aware of the environmental impacts of data centres. I hope that, in the light of the points I raised, the noble Lord will be content not to press his amendments.

Lord Holmes of Richmond (Con)

I thank everyone who took part in this short debate, in particular the Minister for that full, clear and helpful answer. In a spirit of throwing roses at this stage of the evening, I congratulate him and the Government on the quick identification and implementation of Culham as the first site for one of these centres. It makes complete sense—as he says, the power already exists there. I urge the Government to move with such speed for the remaining five of the first six sites. It makes complete sense to move at speed to identify these resources and the wider benefits they can bring to the communities where they will be located. For now, I am content to withdraw the amendment.

Amendment 59 withdrawn.

Artificial Intelligence Opportunities Action Plan

Thursday 16th January 2025

Lords Chamber
Lord Vallance of Balham (Lab)

My noble friend is quite right. The energy issue is crucial for any plan for AI, and that is why the energy council is being set up. It is precisely why Culham is the first place identified; it has a significant energy supply already. We anticipate that the centres will be based around the country in places where there is renewable energy or where other sources of energy can be accessed easily in order to provide the power the centres require. It is also important that the council looks at the overall environmental impact, which will be part of this.

On energy consumption, it is known what is required for a single data centre and, as we need multiple data centres, the type and amount we will require is known. It is crucial that this is done on top of everything else that the energy is required for. This is a big and difficult problem, but we can already see an answer to it with the first identification of a site for the AI growth zone.

Lord Holmes of Richmond (Con)

My Lords, I declare my technology interests as set out in the register. I welcome the plan; it has 50 excellent recommendations, but does the Minister not agree that to bring these to life we need an arrowhead focus from government on broad AI legislation—much broader than what is currently planned—that includes an AI authority that is agile, nimbly focused and horizontally applicable; AI-responsible officers; the protection of creatives; and right-sized regulation that is good for citizens, innovators and consumers, in order to deliver according to the fundamental truth that these are our data, our decisions and our AI futures?

Lord Vallance of Balham (Lab)

I certainly agree that it is a significant challenge, and I add one other thing. The challenge is not only one of regulation, of procurement and of making sure that we have the data systems correct; it is one of making sure that we actually deliver, rather than talking about it. Delivery will be key, and we need a proper mechanism to deliver this in the form of a mission with real delivery outcomes. That is why I was pleased to see that we have very tight timelines on all the recommendations in the report. We must make sure that that happens and, as we do so, that we bring in the other necessary controls and actions to propel every part of this, from funding start-ups right the way through to procurement, and, as the noble Lord said, ensuring that we look after the privacy and autonomy of the data.

Artificial Intelligence: Regulation

Thursday 17th October 2024

Lords Chamber
Asked by
Lord Holmes of Richmond

To ask His Majesty’s Government whether they plan to regulate artificial intelligence and, if so, which uses they intend to regulate.

Lord Holmes of Richmond (Con)

My Lords, I beg leave to ask the Question standing in my name on the Order Paper and declare my technology interests as set out in the register.

The Minister of State, Department for Science, Innovation and Technology (Lord Vallance of Balham) (Lab)

My Lords, as set out in the King’s Speech, we will establish legislation to ensure the safe development of AI models by introducing targeted requirements on companies developing the most powerful AI systems, and we will consult on the proposals in due course. This will build on our ongoing commitment to make sure that the UK’s regulators have the expertise and resources to effectively regulate AI in their respective domains.

Lord Holmes of Richmond (Con)

My Lords, with individuals having loan applications rejected off the back of AI decisions and creatives having their works ingested by GenAI with no consent or remuneration, would not the Minister agree that we need economy-wide and society-wide AI legislation and regulation for the benefit of citizens, consumers, creatives, innovators and investors—for all our AI futures?

Lord Vallance of Balham (Lab)

Thank you. It is an important area, and one where we have huge opportunities for growth. There is definitely the need for regulators to become upskilled in the ability to look at AI and understand how it impacts their areas. That is the reason we created the Regulatory Innovation Office, announced last week, to make sure that there are the capabilities and expertise in sector-dependent regulators. We also believe that there is a need for regulation for the most advanced models, which are general purpose, and of course cross many different areas as well.

King’s Speech (4th Day)

Monday 22nd July 2024

Lords Chamber
Lord Holmes of Richmond (Con)

My Lords, I declare my interest as set out in the register as an adviser to LEMI Ltd. I congratulate the noble Lords, Lord Vallance and Lord Livermore, on their new ministerial positions; I look forward to working with them over the coming months. I particularly congratulate the noble Lord, Lord Vallance, on his excellent maiden speech, and likewise my noble friend Lord Petitgas on his tremendous contribution. I look forward to more from both.

I will concentrate on three areas, all of which touch on productivity, possibilities, potential and growth—that is economic, social and psychological growth. The first, as rightly identified by my noble friend Lord Shinkwin, is the issue of disabled people and employment. We currently have an employment pay gap for disabled people of over 13%. I welcome the forthcoming Bill from the Government and look forward to seeing the detail, but could the Minister say what plans the Government have to close that disability employment pay gap?

More than that is the employment gap for disabled people; only just over half of disabled people of working age are in employment, compared to more than eight in 10 non-disabled people. What is the Government’s plan to address this? The previous Government made some progress, but nowhere near enough. Governments of all persuasions cannot continue to waste this talent, decade after decade. It is clear that, when we have a tight labour market, we must look to the talent pools. Disabled people are a bright, deep and broad talent pool, from which the country needs to benefit.

The second area is the question of international trade. Last year’s Electronic Trade Documents Act was described variously by me as the most important law that no one has ever heard of and the blockchain Bill that does not mention blockchain. However, it is extraordinarily important, because it is probably the first time that the UK has legislated for the possibilities of new technologies, if you will. Why is it significant? It can unlock billions in liquidity and address the trade finance gap. Could the Minister say what the Government’s plan is to enable all enterprises, particularly small and medium-sized enterprises—many of which do not export, or do not believe they ever could—to be aware of the possibilities of this new legislative opportunity? What work is happening from the Foreign Office to ensure that other nations—our friends around the world—are aware of the opportunities they could benefit from if they passed similar electronic trade documents legislation?

The third area, as has already understandably been touched on—not least by the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones —is the question of artificial intelligence. It is one of the greatest challenges and opportunities in our human hands. Human-led or human-in-the-loop technologies must be the way that we look at artificial intelligence.

I note that the report of the Chief Scientific Adviser, in March 2023, highlighted the opportunities from artificial intelligence and rightly identified that urgent action was required within the next 12 to 24 months. Is that still the Government’s position? We have such an opportunity to change so many of the difficulties that have dogged our society and economies for decades, if we understand how to fully deploy AI and do that with the right-sized regulation and legislative framework. To my mind, it is not time to wait and see, as the previous Government did; it is not time to look just at high-risk models, important though they are, as the current Government are doing. We need broad, cross-cutting, horizontally focused legislation and right-sized regulation to ensure that we benefit from the opportunities and put the citizen, the consumer and creatives at the heart of everything that we do in AI.

If that is not the Government’s plan, what would they say to creatives whose IP and copyrighted works are being taken with no consent and no remuneration, not least by large language models? What do the Government say to those who find themselves on the wrong end of an AI decision, often without even knowing that AI has been involved in the mix? Even if they found out that AI was there, they would have no right of recourse and no regulator to go to. What is the position if the Government and society do not have a true, invigorated public debate around artificial intelligence, to answer the question, “What’s in this for me?”, being asked by people up and down the country? If there is no trust, people are unlikely to avail themselves of the opportunities of AI and will certainly find themselves on the wrong end of its burdens.

This must be principles-based and outcomes-focused, in which inputs are understood and, where necessary, remunerated. Look at the Government’s regulatory innovation office; why not make it an AI authority and the centre of excellence, and of experts, which is the custodian of the principles of trust, transparency, innovation and interoperability, with an international perspective, accountability and accessibility? We have such an opportunity in the UK, with our great tech sector, universities and English common law, to play a critical role with AI. Does the Minister agree that it is time to look broad, to legislate and to lead? This is our data, our decisions and our AI futures.

Artificial Intelligence (Regulation) Bill [HL]

Moved by
Lord Holmes of Richmond

That the Bill be now read a third time.

A privilege amendment was made.
--- Later in debate ---
Moved by
Lord Holmes of Richmond

That the Bill do now pass.

Lord Holmes of Richmond (Con)

My Lords, I declare my technology interests as adviser to Boston Ltd. I thank all the organisations and individuals that took the trouble to meet with me ahead of the Second Reading of my Bill, shared their expertise and insight, and added to the positive, almost unified, voice that we had at Second Reading in March. I thank colleagues around the House and in the other place for their support, and particularly thank the Labour and Liberal Democrat Front Benches for their support on all the principles set out in the Bill. I also thank my noble friend the Minister for the time he took to meet with me at all stages of the Bill.

It is clear that, when it comes to artificial intelligence, it is time to legislate—it is time to lead. We know what we need to do, and we know what we need to know, to legislate. We know the impact that AI is already having on our creatives, on our IP, on our copyright, across all that important part of our economy. We know the impact that having no labelling on IP products is having. Crucially, we know the areas where there is no competent legislation or regulator when it comes to AI decisions. Thus, there is no right of redress for consumers, individuals and citizens.

Similarly, it is also time to legislate to end the illogicality that grew out of the Bletchley summit—successful of itself, but strange to put only a voluntary code, rather than something statutory, in place as a result of that summit. It was strange also to have stood up such a successful summit and then not sought to legislate for all the other areas of artificial intelligence already impacting people’s lives—oftentimes without them even knowing that AI is involved.

It is time to bring forward good legislation and the positive powers of right-size regulation. What this always brings is clarity, certainty, consistency, security and safety. When it comes to artificial intelligence, we do not currently have that in the United Kingdom. Clarity and certainty, craved by consumers and businesses, is a driver of innovation, inward investment, pro-consumer protection and pro-citizen rights. If we do not legislate, the most likely, and certainly unintended, consequence is that businesses and organisations looking for a life raft will understandably, but unfortunately, align to the EU AI Act. That is not the optimal outcome that we can secure.

It is clear that when it comes to AI legislation and regulation things are moving internationally, across our Parliament and—dare I say—in No. 10. With sincere thanks again to all those who have helped so much to get the Bill to this stage, I say again that it is time to legislate—it is time to lead #OurAIFutures.

Lord Leong (Lab)

My Lords, I regret that I was unable to speak at Second Reading of the Bill. I am grateful to the government Benches for allowing my noble friend Lady Twycross to speak on my behalf on that occasion. However, I am pleased to be able to return to your Lordships’ House with a clean bill of health, to speak at Third Reading of this important Bill. I congratulate the noble Lord, Lord Holmes of Richmond, on the progress of his Private Member’s Bill.

Having read the whole debate in Hansard, I think it is clear that there is consensus about the need for some kind of AI regulation. The purpose, form and extent of this regulation will, of course, require further debate. AI has the potential to transform the world and deliver life-changing benefits for working people: whether delivering relief through earlier cancer diagnosis or relieving traffic congestion for more efficient deliveries, AI can be a force for good. However, the most powerful AI models could, if left unchecked, spread misinformation, undermine elections and help terrorists to build weapons.

A Labour Government would urgently introduce binding regulation and establish a new regulatory innovation office for AI. This would make Britain the best place in the world to innovate, by speeding up decisions and providing clear direction based on our modern industrial strategy. We believe this will enable us to harness the enormous power of AI, while limiting potential damage and malicious use, so that it can contribute to our plans to get the economy growing and give Britain its future back.

The Bill sends an important message about the Government’s responsibility to acknowledge and address how AI affects people’s jobs, lives, data and privacy, in the rapidly changing technological environment in which we live. Once again, I thank the noble Lord, Lord Holmes of Richmond, for bringing it forward, and I urge His Majesty’s Government to give proper consideration to the issues raised. As ever, I am grateful to noble Lords across the House for their contributions. We support and welcome the principles behind the Bill, and we wish it well as it goes to the other place.

AI: Intellectual Property Rights

Thursday 9th May 2024

Lords Chamber
Asked by
Lord Holmes of Richmond

To ask His Majesty’s Government what steps they are taking to protect intellectual property rights in relation to artificial intelligence, in particular large language models, since discontinuing plans to develop a code of practice.

Lord Holmes of Richmond (Con)

My Lords, I beg leave to ask the Question standing in my name on the Order Paper and declare my technology interests, as set out in the register, as an adviser to Boston Ltd.

The Parliamentary Under-Secretary of State, Department for Science, Innovation and Technology (Viscount Camrose) (Con)

My Lords, this is a complex and challenging area. We strongly support AI innovation in the UK, but this cannot be at the expense of our world-leading creative industries. Our goal is that both sectors should be able to grow together in partnership. We are currently working with DCMS to develop a way forward on copyright and AI. We will engage closely with interested stakeholders as we develop our approach.

Lord Holmes of Richmond (Con)

My Lords, our great UK creatives—musicians who make such sweet sounds where otherwise there may be silence; writers who fill the blank page with words of meaning that move us; our photographers; our visual artists—are a creative community contributing billions to the UK economy, growing at twice the rate of the UK economy. In the light of this, why are the Government content for their work, their IP and their copyright to be so disrespected and unprotected in the face of artificial intelligence?

Viscount Camrose (Con)

I thank my noble friend for the important point he raises, particularly stressing the importance to the United Kingdom of the creative industry, which contributes 6% of our GVA every year. The Government are far from content with this position and share the frustrations and concerns of many across the sector in trying to find a way forward on the AI and copyright issue. As I say, it is challenging and deeply complex. No jurisdiction anywhere has identified a truly satisfactory solution to this issue, but we continue to work internationally and nationally to seek one.

Data Protection and Digital Information Bill

Moved by
197A: After Clause 103, insert the following new Clause—
“Oversight of biometric technology use by the Information Commission
(1) The Information Commission must establish a Biometrics Office.
(2) The Biometrics Office is to be constituted by a committee of three appointed commissioners with relevant expertise.
(3) It is the function of the Biometrics Office to—
(a) establish and maintain a public register of relevant entities engaged in processing the biometric data of members of the public;
(b) oversee and review the biometrics use of relevant entities;
(c) produce a Code of Practice for the use of biometric technology by registered parties, which must include—
(i) compulsory standards of accuracy and reliability for biometric technologies;
(ii) a requirement for the proportionality of biometrics use to be assessed prior to use and annually thereafter, and a procedure for such assessment;
(iii) a procedure for individual complaints about the use of biometrics by registered parties;
(d) receive and publish annual reports from all relevant entities, which includes the relevant entity’s proportionality assessment of their biometrics use;
(e) enforce registration and reporting by the issuing of enforcement notices and, where necessary, the imposition of fines for non-compliance with the registration and reporting requirements;
(f) ensure lawfulness of biometrics use by relevant entities, including by issuing compliance and abatement notices where necessary.
(4) The Secretary of State may by regulations add to the responsibilities of the Biometrics Office.
(5) Regulations made under subsection (4) are subject to the affirmative resolution procedure.
(6) For the purposes of this Part, “relevant entity” means any organisation or body corporate (whether public or private) which processes biometric data as defined in Article 9 GDPR, other than where the biometric processing undertaken by the organisation or body corporate is otherwise overseen by the Investigatory Powers Commissioner, because it is—
(a) for the purposes of making or renewing a national security determination as defined by section 20(2) of the Protection of Freedoms Act 2012, or
(b) for the purposes set out in section 20(6) of the Protection of Freedoms Act 2012.”
Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to take part in today’s Committee proceedings. I declare my technology interests as an adviser to Boston Limited. It is self-evident that we have been talking about data but there could barely be a more significant piece of data than biometrics. In moving the amendment, I shall speak also to Amendments 197B and 197C, and give more than a nod to the other amendments in this group.

When we talk about data, it is always critical that we remember that it is largely our data. There could be no greater example of that than biometrics. More than data, they are parts and fragments of our very being. This is an opportune moment in the debate on the Bill to strengthen the approach to the treatment and the use of biometrics, not least because they are being increasingly used by private entities. That is what Amendments 197A to 197C are all about—the establishment of a biometrics office, a code of practice and oversight, and sanctions and fines to boot. This is of that level of significance. The Bill should have that strength when we are looking at such a significant part of our very human being and data protection.

Amendment 197B looks at reporting and regulatory requirements, and Amendment 197C at the case for entities that have already acted in the biometrics space prior to the passage of the Bill. In short, it is very simple. The amendments take principles that run through many elements of data protection and ensure that we have a clear statement on the use and deployment of biometrics in the Bill. There could be no more significant pieces of data. I look forward to the Minister’s response. I thank the Ada Lovelace Institute for its help in drafting the amendments, and I look forward to the debate on this group. I beg to move.

Lord Vaux of Harrowden (CB)

My Lords, I have added my name in support of the stand part notices of the noble Lord, Lord Clement-Jones, to Clauses 147, 148 and 149. These clauses would abolish the office of the Biometrics and Surveillance Camera Commissioner, along with the surveillance camera code of practice. I am going to speak mainly to the surveillance camera aspect, although I was taken by the speech of the noble Lord, Lord Holmes, who made some strong points.

The UK has become one of the most surveilled countries in the democratic world. There are estimated to be over 7 million CCTV cameras in operation. I give one example: the automated number plate recognition, ANPR, system records between 70 million and 80 million readings every day. Every car is recorded on average about three times a day. The data is held for two years. The previous Surveillance Camera Commissioner, Tony Porter, said about ANPR that it,

“must surely be one of the largest data gatherers of its citizens in the world. Mining of meta-data—overlaying against other databases can be far more intrusive than communication intercept”.

Professor Sampson, the previous commissioner, said about ANPR:

“There is no ANPR legislation or act, if you like. And similarly, there is no governance body to whom you can go to ask proper questions about the extent and its proliferation, about whether it should ever be expanded to include capture of other information such as telephone data being emitted by a vehicle or how it's going to deal with the arrival of automated autonomous vehicles”.


And when it came to independent oversight and accountability, he said:

“I’m the closest thing it’s got—and that’s nothing like enough”.


I am not against the use of surveillance cameras per se—it is unarguable that they are a valuable tool in the prevention and detection of crime—but there is clearly a balance to be found. If we chose to watch everything every person does all of the time, we could eliminate crime completely, but nobody would argue that that is desirable. We can clearly see how surveillance and biometrics can be misused by states that wish to control their populations—just look at China. So there is a balance to be found between the protection of the public and intrusion into privacy.

Technology is moving incredibly rapidly, particularly with the ever-increasing capabilities of AI. As technology changes, so that balance between protection and privacy may also need to change. Yet Clause 148 will abolish the only real safeguards we have, and the only governance body that keeps an eye on that balance. This debate is not about where that balance ought to be; it is about making sure that there is some process to ensure that the balance is kept under independent review at a time when surveillance technologies and usage are developing incredibly rapidly.

I am sure that the Minister is going to argue that, as he said at Second Reading:

“Abolishing the Surveillance Camera Commissioner will not reduce data protection”.—[Official Report, 19/12/23; col. 2216.]


He is no doubt going to tell us that the roles of the commissioner will be adequately covered by the ICO. To be honest, that completely misses the point. Surveillance is not just a question of data protection; it is a much wider question of privacy. Yes, the ICO may be able to manage the pure data protection matters, but it cannot possibly be the right body to keep the whole question of surveillance and privacy intrusion, and the related technologies, under independent review.

It is also not true that all the roles of the commissioner are being transferred to other bodies. The report by the Centre for Research into Surveillance and Privacy, or CRISP, commissioned by the outgoing commissioner, is very clear that a number of important areas will be lost, particularly reviewing the police handling of DNA samples, DNA profiles and fingerprints; maintaining an up-to-date surveillance camera code of practice with standards and guidance for practitioners and encouraging compliance with that code; setting out technical and governance matters for most public body surveillance systems, including how to approach evolving technology, such as AI-driven systems including facial recognition technology; and providing guidance on technical and procurement matters to ensure that future surveillance systems are of the right standard and purchased from reliable suppliers. It is worth noting that it was the Surveillance Camera Commissioner who raised the issues around the use of Hikvision cameras, for example—not something that the ICO is likely to be able to do. Finally, we will also lose the commissioner providing reports to the Home Secretary and Parliament about public surveillance and biometrics matters.

Professor Sampson said, before he ended his time in office as commissioner:

“The lack of attention being paid to these important matters at such a crucial time is shocking, and the destruction of the surveillance camera code that we’ve all been using successfully for over a decade is tantamount to vandalism”.


He went on to say:

“It is the only legal instrument we have in this country that specifically governs public space surveillance. It is widely respected by the police, local authorities and the surveillance industry in general … It seems absolutely senseless to destroy it now”.


The security industry does not want to see these changes either, as it sees the benefits of having a clear code. The Security Systems and Alarms Inspection Board said:

“Without the Surveillance Camera Commissioner you will go back to the old days when it was like the ‘wild west’, which means you can do anything with surveillance cameras so long as you don’t annoy the Information Commissioner … so, there will not be anyone looking at new emerging technologies, looking at their technical requirements or impacts, no one thinking about ethical implications for emerging technologies like face-recognition, it will be a free-for-all”.


The British Security Industry Association said:

“We are both disappointed and concerned about the proposed abolition of the B&SCC. Given the prolific emergence of biometric technologies associated with video surveillance, now is a crucial time for government, industry, and the independent commissioner(s) to work close together to ensure video surveillance is used appropriately, proportionately, and most important, ethically”.


I do not think I can put it better than that.

While there may be better ways to achieve the appropriate safeguards than the current commissioner arrangement, this Bill simply abolishes everything that we have now and replaces the safeguards only partially, and only from a data protection perspective. I am open to discussion about how we might fill the gaps, but the abolition currently proposed by the Bill is a massively retrograde and even dangerous step, removing the only safeguards we have against the uncontrolled creep towards ever more intrusive surveillance of innocent people. As technology increases the scope for surveillance, this must be the time for greater safeguards and more independent oversight, not less. The abolition of the commissioner and code should not happen unless clear, better safeguards are established to replace them, and this Bill simply does not do that.

--- Later in debate ---
Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - -

My Lords, I thank all noble Lords who participated in the excellent debate on this set of amendments. I also thank my noble friend the Minister for part of his response; he furiously agreed with at least a substantial part of my amendments, even though he may not have appreciated it at the time. I look forward to some fruitful and positive discussions on some of those elements between Committee and Report.

When a Bill passes into statute, a Minister and the Government may wish for a number of things in terms of how it is seen and described. One thing that I do not imagine is on the list is for it to be said that this statute generates significant gaps—those words were put perfectly by the noble Viscount, Lord Stansgate. That it generates significant gaps is certainly the current position. I hope that we have conversations between Committee and Report to address at least some of those gaps and restate some of the positions that exist, before the Bill passes. That would be positive for individuals, citizens and the whole of the country. For the moment, I beg leave to withdraw my amendment and look forward to those subsequent conversations.

Amendment 197A withdrawn.

Artificial Intelligence (Regulation) Bill [HL]

Lord Holmes of Richmond Excerpts
Moved by
Lord Holmes of Richmond Portrait Lord Holmes of Richmond
- View Speech - Hansard - -

That the Bill be now read a second time.

Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - -

My Lords, I declare my technology interests as adviser to Boston Ltd. I thank all noble Lords who have signed up to speak; I eagerly anticipate all their contributions and, indeed, hearing from my noble friend the Minister. I also thank all the organisations that got in contact with me and other noble Lords for their briefings, as well as those that took time to meet me ahead of this Second Reading debate. Noble Lords and others who would like to follow this on social media can use #AIBill #AIFutures.

If we are to secure the opportunities and control the challenges of artificial intelligence, it is time to legislate and to lead. We need something that is principles-based and outcomes-focused, with inputs that are transparent, permissioned, understood and, wherever applicable, paid for.

There are at least three reasons why we should legislate on this: social, democratic and economic. On reason one, the social reason, some of the greatest benefits we could secure from AI come in this space, including truly personalised education for all, and healthcare. We saw only yesterday the exciting early results from the NHS Grampian breast-screening AI programme. Then there is mobility and net zero sustainability.

Reason two is about our democracy and jurisdiction. With 40% of the world’s democracies going to the polls this year, with deepfakes, cheap fakes, misinformation and disinformation, we are in a high-threat environment for our democracy. As our 2020 Democracy and Digital Technologies Select Committee report put it, with a proliferation of misinformation and disinformation, trust will evaporate and, without trust, democracy as we know it will simply disappear.

On our jurisdiction and system of law, the UK has a unique opportunity at this moment in time. We do not have to fear being in the first mover spotlight—the EU has taken that with its Act, in all its 892 pages. The US has had the executive order but has yet to commit fully to this phase. The UK, with our common-law tradition, respected right around the world, has such an opportunity to legislate in a way that will be adaptive, versatile and able to develop through precedent and case law.

On reason three, our economy, PwC’s AI tracker says that by 2030, there will be a 14% increase in global GDP worth $15.7 trillion. The UK must act to ensure our share of that AI boom. To take just one technology, the chatbot global market grew tenfold in just four years. The Alan Turing Institute report on AI in the public sector, which came out just this week, says that 84% of government services could benefit from AI automation in over 200 different services. Regulated markets perform better. Right-sized regulation is good for innovation and good for inward investment.

Those are the three reasons. What about three individual impacts of AI right now? What if we find ourselves on the wrong end of an AI decision in a recruitment shortlisting, the wrong end of an AI decision in being turned down for a loan, or, even worse, the wrong end of an AI decision when awaiting a liver transplant? All these are illustrations of AI impacting individuals, often when they would not even know that AI was involved. We need to put paid to the myth, the false dichotomy, that you must have heavy, rules-based regulation or a free hand—that we have to pay tribute to the cry of the frontierists in every epoch: “Don’t fence me in”. Right-sized regulation is good socially, democratically and economically. Here is the thing: AI is to human intellect what steam was to human strength. You get the picture. Steam literally changed time. It is our time to act, and that is why I bring this Bill to your Lordships’ House today.

In constructing the Bill, I have sought to consult widely, to be very cognisant of the Government’s pro-innovation White Paper, of all the great work of BCS, technology, industry, civil society and more. I wanted the Bill to be threaded through with the principles of transparency and trustworthiness; inclusion and innovation; interoperability and international focus; accountability and assurance.

Turning to the clauses, Clause 1 sets up an AI authority. Lest any noble Lord suddenly feels that I am proposing a do-it-all, huge, cumbersome regulator, I am most certainly not. In many ways, it would not be much bigger in scope than what the DSIT unit is proposing: an agile, right-sized regulator, horizontally focused to look across all existing regulators, not least the economic regulators, to assess their competency to address the opportunities and challenges presented by AI and to highlight the gaps. And there are gaps, as rightly identified by the excellent Ada Lovelace Institute report. For example, where do you go if you are on the wrong end of that AI recruitment shortlisting decision? It must have the authority, similarly, to look across all relevant legislation—consumer protection and product safety, to name but two—to assess its competency to address the challenges and opportunities presented by AI.

The AI authority must have at its heart the principles set out in Clause 2: it must be not just the custodian of those principles, but a very lighthouse for them, and it must have an educational function and a pro-innovation purpose. Many of those principles will be very recognisable; they are taken from the Government’s White Paper but put on a statutory footing. If they are good enough to be in the White Paper, we should commit to them, believe in them and know that they will be our greatest guides for the positive path forward, when put in a statutory framework. We must have everything inclusive by design, and with a proportionality thread running through all the principles, so none of them can be deployed in a burdensome way.

Clause 3 concerns sandboxes, so brilliantly developed in the UK in 2016 with the fintech regulatory sandbox. If you want a measure of its success, it is replicated in well over 50 jurisdictions around the world. It enables innovation in a safe, regulated, supported environment: real customers, real market, real innovations, but in a splendid sandbox concept.

Clause 4 sets up the AI responsible officer, to be conceived of not as a person but as a role, to ensure the safe, ethical and unbiased deployment of AI in her or his organisation. It does not have to be burdensome, or a whole person in a start-up; but that function needs to be performed, with reporting requirements under the Companies Act that are well understood by any business. Again, crucially, it must be subject to that proportionality principle.

Clause 5 concerns labelling and IP, which is such a critical part of how we will get this right with AI. Labelling: so that if anybody is subject to a service or a good where AI is in the mix, it will be clearly labelled. AI can be part of the solution to providing this labelling approach. Where IP or third-party data is used, that has to be reported to the AI authority. Again, this can be done efficiently and effectively using the very technology itself. On the critical question of IP, I met with 25 organisations representing tens of thousands of our great creatives: the people that make us laugh, make us smile, challenge us, push us to places we never even knew existed; those who make music, such sweet music, where otherwise there may be silence. It is critical to understand that they want to be part of this AI transformation, but in a consented, negotiated, paid-for manner. As Dan Guthrie, director-general of the Alliance for Intellectual Property, put it, it is extraordinary that businesses together worth trillions take creatives’ IP without consent and without payment, while fiercely defending their own intellectual property. This Bill will change that.

Clause 6 concerns public engagement. For me, this is probably the most important clause in the Bill, because without public engagement, how can we have trustworthiness? People need to be able to ask, “What is in this for me? Why should I care? How is this impacting my life? How can I get involved?” We need to look at innovative ways to consult and engage. A good example, in Taiwan, is the Alignment Assemblies, but there are hundreds of novel approaches. Government consultations should have millions of responses, because this is both desirable and now, with the technology, analysable.

Clause 7 concerns interpretation. At this stage, I have drawn the definitions of AI deliberately broadly. We should certainly debate this, but as set out in Clause 7, much would and should be included in those definitions.

Clause 8 sets out the potential for regulating for offences and fines thereunder, to give teeth to so much of what I have already set out and, rightly, to pay the correct respect to all the devolved nations. So, such regulations would have to go through the Scottish Parliament, Senedd Cymru and the Northern Ireland Assembly.

That brings us to Clause 9, the final clause, which makes this a UK-wide Bill.

So, that is the Bill. We know how to do this. Just last year, the Electronic Trade Documents Act showed that we know how to legislate for the possibilities of these new technologies; and, my word, we know how to innovate in the UK—Turing, Lovelace, Berners-Lee, Demis at DeepMind, and so many more.

If we know how to do this, why are we not legislating? What will we know in, say, 12 months’ time that we do not know now about citizens’ rights, consumer protection, IP rights, being pro-innovation, labelling and the opportunity to transform public engagement? We need to act now, because we know what we need to know—if not now, when? The Bletchley summit last year was a success. Understandably, it focused on safety, but, having done that, it is imperative that we stand up all the other elements of AI already impacting people’s lives in so many ways, often without their knowledge.

Perhaps the greatest and finest learning from Bletchley is not so much the safety summit but what happened there two generations before, when a diverse team of talent gathered and deployed the technology of their day to defeat the greatest threat to our civilisation. Talent and technology brought forth light in one of the darkest hours of human history. As it was in Bletchley in the 1940s, so it is in the United Kingdom in the 2020s. It is time for human-led, human-in-the-loop, principle-based artificial intelligence. It is time to legislate and to lead; for transparency and trustworthiness, inclusion and innovation, interoperability and international focus, accountability and assurance; for AI developers, deployers and democracy itself; for citizens, creatives and our country—our data, our decisions, #ourAIfutures. That is what this Bill is all about. I beg to move.

--- Later in debate ---
Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- View Speech - Hansard - -

My Lords, I thank all noble Lords who have contributed to this excellent debate. It is pretty clear that the issues are very much with us today and we have what we need to act today. To respond to a question kindly asked by the noble Lord, Lord Davies of Brixton, in my drafting I am probably allowing “relevant” regulators to do some quite heavy lifting, but what I envisage within that is certainly all the economic regulators, and indeed all regulators who are in a sector where AI is being developed, deployed and in use. Everybody who has taken part in this debate and beyond may benefit from having a comprehensive list of all the regulators across government. Perhaps I could ask that of the Minister. I think it would be illuminating for all of us.

At the autumn FT conference, my noble friend the Minister said that heavy-handed regulation could stifle innovation. Certainly, it could. Heavy-handed regulation would not only stifle innovation but would represent a singular failure of the regulatory process itself. History tells us that right-size regulation is pro-citizen, pro-consumer and pro-innovation; it drives innovation and inward investment. I was taken by so much of what the Ada Lovelace Institute put in its report. The Government really have given themselves all the eyes but none of the hands to act. It reminds me very much of a Yorkshire saying: see all, hear all, do nowt. What is required is for these technologies to be human led, in our human hands, and human in the loop throughout. Right-size regulation, because it is principles-based, is necessarily agile and adaptive, and can move as the technology moves. It should be principles-based and outcomes focused, with inputs that are transparent, understood, permissioned and, wherever and whenever applicable, paid for.

My noble friend the Minister has said on many occasions that there will come a time when we will legislate on AI. Let 22 March 2024 be that time. It is time to legislate; it is time to lead.

Bill read a second time and committed to a Committee of the Whole House.