Artificial Intelligence: Regulation

Lord Holmes of Richmond Excerpts
Thursday 17th October 2024

Lords Chamber
Asked by
Lord Holmes of Richmond

To ask His Majesty’s Government whether they plan to regulate artificial intelligence and, if so, which uses they intend to regulate.

Lord Holmes of Richmond (Con)

My Lords, I beg leave to ask the Question standing in my name on the Order Paper and declare my technology interests as set out in the register.

The Minister of State, Department for Science, Innovation and Technology (Lord Vallance of Balham) (Lab)

My Lords, as set out in the King’s Speech, we will establish legislation to ensure the safe development of AI models by introducing targeted requirements on companies developing the most powerful AI systems, and we will consult on the proposals in due course. This will build on our ongoing commitment to make sure that the UK’s regulators have the expertise and resources to effectively regulate AI in their respective domains.

Lord Holmes of Richmond (Con)

My Lords, with individuals having loan applications rejected off the back of AI decisions and creatives having their works ingested by GenAI with no consent or remuneration, would not the Minister agree that we need economy-wide and society-wide AI legislation and regulation for the benefit of citizens, consumers, creatives, innovators and investors—for all our AI futures?

Lord Vallance of Balham (Lab)

Thank you. It is an important area, and one where we have huge opportunities for growth. There is definitely the need for regulators to become upskilled in the ability to look at AI and understand how it impacts their areas. That is the reason we created the Regulatory Innovation Office, announced last week, to make sure that there are the capabilities and expertise in sector-dependent regulators. We also believe that there is a need for regulation for the most advanced models, which are general purpose, and of course cross many different areas as well.

King’s Speech (4th Day)

Lord Holmes of Richmond Excerpts
Monday 22nd July 2024

Lords Chamber
Lord Holmes of Richmond (Con)

My Lords, I declare my interest as set out in the register as an adviser to LEMI Ltd. I congratulate the noble Lords, Lord Vallance and Lord Livermore, on their new ministerial positions; I look forward to working with them over the coming months. I particularly congratulate the noble Lord, Lord Vallance, on his excellent maiden speech, and likewise my noble friend Lord Petitgas on his tremendous contribution. I look forward to more from both.

I will concentrate on three areas, all of which touch on productivity, possibilities, potential and growth—that is economic, social and psychological growth. The first, as rightly identified by my noble friend Lord Shinkwin, is the issue of disabled people and employment. We currently have an employment pay gap for disabled people of over 13%. I welcome the forthcoming Bill from the Government and look forward to seeing the detail, but could the Minister say what plans the Government have to close that disability employment pay gap?

Beyond the pay gap, there is the employment gap for disabled people: only just over half of disabled people of working age are in employment, compared with more than eight in 10 non-disabled people. What is the Government’s plan to address this? The previous Government made some progress, but nowhere near enough. Governments of all persuasions cannot continue to waste this talent, decade after decade. It is clear that, when we have a tight labour market, we must look to the talent pools. Disabled people are a bright, deep and broad talent pool from which the country needs to benefit.

The second area is the question of international trade. Last year’s Electronic Trade Documents Act was described variously by me as the most important law that no one has ever heard of and the blockchain Bill that does not mention blockchain. However, it is extraordinarily important, because it is probably the first time that the UK has legislated for the possibilities of new technologies. Why is it significant? It can unlock billions in liquidity and address the trade finance gap. Could the Minister say what the Government’s plan is to enable all enterprises, particularly small and medium-sized enterprises—many of which do not export, or do not believe they ever could—to be aware of the possibilities of this new legislative opportunity? What work is the Foreign Office doing to ensure that other nations—our friends around the world—are aware of the opportunities they could benefit from if they passed similar electronic trade documents legislation?

The third area, as has already understandably been touched on—not least by the noble Baroness, Lady Kidron, and the noble Lord, Lord Clement-Jones—is the question of artificial intelligence. It is one of the greatest challenges and opportunities in our human hands. Human-led or human-in-the-loop technologies must be the way that we look at artificial intelligence.

I note that the report of the Chief Scientific Adviser, in March 2023, highlighted the opportunities from artificial intelligence and rightly identified that urgent action was required within the next 12 to 24 months. Is that still the Government’s position? We have such an opportunity to address so many of the difficulties that have dogged our society and economies for decades, if we understand how to deploy AI fully and do so with the right-sized regulation and legislative framework. To my mind, it is not time to wait and see, as the previous Government did; nor is it time to look just at high-risk models, important though they are, as the current Government are doing. We need broad, cross-cutting, horizontally focused legislation and right-sized regulation to ensure that we benefit from the opportunities and put the citizen, the consumer and creatives at the heart of everything that we do in AI.

If that is not the Government’s plan, what would they say to creatives whose IP and copyrighted works are being taken with no consent and no remuneration, not least by large language models? What do the Government say to those who find themselves on the wrong end of an AI decision, often without even knowing that AI has been involved in the mix? Even if they do find out that AI was involved, they have no right of recourse and no regulator to go to. What is the position if the Government and society do not have a true, invigorated public debate around artificial intelligence, to answer the question, “What’s in this for me?”, being asked by people up and down the country? If there is no trust, people are unlikely to avail themselves of the opportunities of AI and will certainly find themselves on the wrong end of its burdens.

This must be principles-based and outcomes-focused, in which inputs are understood and, where necessary, remunerated. Look at the Government’s regulatory innovation office: why not make it an AI authority, a centre of excellence and of experts, and the custodian of the principles of trust, transparency, innovation and interoperability, with an international perspective, accountability and accessibility? We have such an opportunity in the UK, with our great tech sector, universities and English common law, to play a critical role with AI. Does the Minister agree that it is time to look broad, to legislate and to lead? This is our data, our decisions and our AI futures.

Artificial Intelligence (Regulation) Bill [HL]

Lord Holmes of Richmond Excerpts
Moved by
Lord Holmes of Richmond

That the Bill be now read a third time.

A privilege amendment was made.
--- Later in debate ---
Moved by
Lord Holmes of Richmond

That the Bill do now pass.

Lord Holmes of Richmond (Con)

My Lords, I declare my technology interests as adviser to Boston Ltd. I thank all the organisations and individuals that took the trouble to meet with me ahead of the Second Reading of my Bill, shared their expertise and insight, and added to the positive, almost unified, voice that we had at Second Reading in March. I thank colleagues around the House and in the other place for their support, and particularly thank the Labour and Liberal Democrat Front Benches for their support on all the principles set out in the Bill. I also thank my noble friend the Minister for the time he took to meet with me at all stages of the Bill.

It is clear that, when it comes to artificial intelligence, it is time to legislate—it is time to lead. We know what we need to do, and we know what we need to know, to legislate. We know the impact that AI is already having on our creatives, on our IP and on our copyright, across that whole important part of our economy. We know the impact of there being no labelling on IP products. Crucially, we know the areas where there is no competent legislation or regulator when it comes to AI decisions. Thus, there is no right of redress for consumers, individuals and citizens.

Similarly, it is also time to legislate to end the illogicality that grew out of the Bletchley summit—successful in itself, but it was strange to put only a voluntary code, rather than something statutory, in place as a result of that summit. It was strange also to have stood up such a successful summit and then not to have sought to legislate for all the other areas of artificial intelligence already impacting people’s lives—oftentimes without them even knowing that AI is involved.

It is time to bring forward good legislation and the positive powers of right-size regulation. What this always brings is clarity, certainty, consistency, security and safety. When it comes to artificial intelligence, we do not currently have that in the United Kingdom. Clarity and certainty, craved by consumers and businesses, are drivers of innovation, inward investment, consumer protection and citizens’ rights. If we do not legislate, the most likely, and certainly unintended, consequence is that businesses and organisations looking for a life raft will understandably, but unfortunately, align to the EU AI Act. That is not the optimal outcome that we can secure.

It is clear that when it comes to AI legislation and regulation things are moving internationally, across our Parliament and—dare I say—in No. 10. With sincere thanks again to all those who have helped so much to get the Bill to this stage, I say again that it is time to legislate—it is time to lead #OurAIFutures.

Lord Leong (Lab)

My Lords, I regret that I was unable to speak at Second Reading of the Bill. I am grateful to the government Benches for allowing my noble friend Lady Twycross to speak on my behalf on that occasion. However, I am pleased to be able to return to your Lordships’ House with a clean bill of health, to speak at Third Reading of this important Bill. I congratulate the noble Lord, Lord Holmes of Richmond, on the progress of his Private Member’s Bill.

Having read the whole debate in Hansard, I think it is clear that there is consensus about the need for some kind of AI regulation. The purpose, form and extent of this regulation will, of course, require further debate. AI has the potential to transform the world and deliver life-changing benefits for working people: whether delivering relief through earlier cancer diagnosis or relieving traffic congestion for more efficient deliveries, AI can be a force for good. However, the most powerful AI models could, if left unchecked, spread misinformation, undermine elections and help terrorists to build weapons.

A Labour Government would urgently introduce binding regulation and establish a new regulatory innovation office for AI. This would make Britain the best place in the world to innovate, by speeding up decisions and providing clear direction based on our modern industrial strategy. We believe this will enable us to harness the enormous power of AI, while limiting potential damage and malicious use, so that it can contribute to our plans to get the economy growing and give Britain its future back.

The Bill sends an important message about the Government’s responsibility to acknowledge and address how AI affects people’s jobs, lives, data and privacy, in the rapidly changing technological environment in which we live. Once again, I thank the noble Lord, Lord Holmes of Richmond, for bringing it forward, and I urge His Majesty’s Government to give proper consideration to the issues raised. As ever, I am grateful to noble Lords across the House for their contributions. We support and welcome the principles behind the Bill, and we wish it well as it goes to the other place.

AI: Intellectual Property Rights

Lord Holmes of Richmond Excerpts
Thursday 9th May 2024

Lords Chamber
Asked by
Lord Holmes of Richmond

To ask His Majesty’s Government what steps they are taking to protect intellectual property rights in relation to artificial intelligence, in particular large language models, since discontinuing plans to develop a code of practice.

Lord Holmes of Richmond (Con)

My Lords, I beg leave to ask the Question standing in my name on the Order Paper and declare my technology interests, as set out in the register, as an adviser to Boston Ltd.

The Parliamentary Under-Secretary of State, Department for Science, Innovation and Technology (Viscount Camrose) (Con)

My Lords, this is a complex and challenging area. We strongly support AI innovation in the UK, but this cannot be at the expense of our world-leading creative industries. Our goal is that both sectors should be able to grow together in partnership. We are currently working with DCMS to develop a way forward on copyright and AI. We will engage closely with interested stakeholders as we develop our approach.

Lord Holmes of Richmond (Con)

My Lords, our great UK creatives—musicians who make such sweet sounds where otherwise there may be silence; writers who fill the blank page with words of meaning that move us; our photographers; our visual artists—are a creative community contributing billions to the UK economy and growing at twice the rate of the economy as a whole. In the light of this, why are the Government content for their work, their IP and their copyright to be so disrespected and unprotected in the face of artificial intelligence?

Viscount Camrose (Con)

I thank my noble friend for the important point he raises, particularly stressing the importance to the United Kingdom of the creative industry, which contributes 6% of our GVA every year. The Government are far from content with this position and share the frustrations and concerns of many across the sector in trying to find a way forward on the AI and copyright issue. As I say, it is challenging and deeply complex. No jurisdiction anywhere has identified a truly satisfactory solution to this issue, but we continue to work internationally and nationally to seek one.

Data Protection and Digital Information Bill

Lord Holmes of Richmond Excerpts
Moved by
197A: After Clause 103, insert the following new Clause—
“Oversight of biometric technology use by the Information Commission
(1) The Information Commission must establish a Biometrics Office.
(2) The Biometrics Office is to be constituted by a committee of three appointed commissioners with relevant expertise.
(3) It is the function of the Biometrics Office to—
(a) establish and maintain a public register of relevant entities engaged in processing the biometric data of members of the public;
(b) oversee and review the biometrics use of relevant entities;
(c) produce a Code of Practice for the use of biometric technology by registered parties, which must include—
(i) compulsory standards of accuracy and reliability for biometric technologies;
(ii) a requirement for the proportionality of biometrics use to be assessed prior to use and annually thereafter, and a procedure for such assessment;
(iii) a procedure for individual complaints about the use of biometrics by registered parties;
(d) receive and publish annual reports from all relevant entities, which includes the relevant entity’s proportionality assessment of their biometrics use;
(e) enforce registration and reporting by the issuing of enforcement notices and, where necessary, the imposition of fines for non-compliance with the registration and reporting requirements;
(f) ensure lawfulness of biometrics use by relevant entities, including by issuing compliance and abatement notices where necessary.
(4) The Secretary of State may by regulations add to the responsibilities of the Biometrics Office.
(5) Regulations made under subsection (4) are subject to the affirmative resolution procedure.
(6) For the purposes of this Part, “relevant entity” means any organisation or body corporate (whether public or private) which processes biometric data as defined in Article 9 GDPR, other than where the biometric processing undertaken by the organisation or body corporate is otherwise overseen by the Investigatory Powers Commissioner, because it is—
(a) for the purposes of making or renewing a national security determination as defined by section 20(2) of the Protection of Freedoms Act 2012, or
(b) for the purposes set out in section 20(6) of the Protection of Freedoms Act 2012.”
Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to take part in today’s Committee proceedings. I declare my technology interests as an adviser to Boston Limited. It is self-evident that we have been talking about data but there could barely be a more significant piece of data than biometrics. In moving the amendment, I shall speak also to Amendments 197B and 197C, and give more than a nod to the other amendments in this group.

When we talk about data, it is always critical that we remember that it is largely our data. There could be no greater example of that than biometrics. More than data, they are parts and fragments of our very being. This is an opportune moment in the debate on the Bill to strengthen the approach to the treatment and use of biometrics, not least because they are being increasingly used by private entities. That is what Amendments 197A to 197C are all about: the establishment of a biometrics office, a code of practice and oversight, with sanctions and fines to boot. That is the level of significance involved. The Bill should have that strength when we are looking at something so significant to our very being and to data protection.

Amendment 197B looks at reporting and regulatory requirements, and Amendment 197C at the case of entities that have already acted in the biometrics space prior to the passage of the Bill. In short, it is very simple. The amendments take principles that run through many elements of data protection and ensure that we have a clear statement on the use and deployment of biometrics in the Bill. There could be no more significant pieces of data. I look forward to the Minister’s response. I thank the Ada Lovelace Institute for its help in drafting the amendments, and I look forward to the debate on this group. I beg to move.

Lord Vaux of Harrowden (CB)

My Lords, I have added my name in support of the stand part notices of the noble Lord, Lord Clement-Jones, to Clauses 147, 148 and 149. These clauses would abolish the office of the Biometrics and Surveillance Camera Commissioner, along with the surveillance camera code of practice. I am going to speak mainly to the surveillance camera aspect, although I was taken by the speech of the noble Lord, Lord Holmes, who made some strong points.

The UK has become one of the most surveilled countries in the democratic world. There are estimated to be over 7 million CCTV cameras in operation. I give one example: the automated number plate recognition, ANPR, system records between 70 million and 80 million readings every day. Every car is recorded on average about three times a day. The data is held for two years. The previous Surveillance Camera Commissioner, Tony Porter, said about ANPR that it,

“must surely be one of the largest data gatherers of its citizens in the world. Mining of meta-data—overlaying against other databases can be far more intrusive than communication intercept”.

Professor Sampson, the previous commissioner, said about ANPR:

“There is no ANPR legislation or act, if you like. And similarly, there is no governance body to whom you can go to ask proper questions about the extent and its proliferation, about whether it should ever be expanded to include capture of other information such as telephone data being emitted by a vehicle or how it's going to deal with the arrival of automated autonomous vehicles”.


And when it came to independent oversight and accountability, he said:

“I’m the closest thing it’s got—and that’s nothing like enough”.


I am not against the use of surveillance cameras per se—it is unarguable that they are a valuable tool in the prevention and detection of crime—but there is clearly a balance to be found. If we chose to watch everything every person does all of the time, we could eliminate crime completely, but nobody is going to argue that to be desirable. We can clearly see how surveillance and biometrics can be misused by states that wish to control their populations—just look at China. So there is a balance to find between the protection of the public and intrusion into privacy.

Technology is moving incredibly rapidly, particularly with the ever-increasing capabilities of AI. As technology changes, so that balance between protection and privacy may also need to change. Yet Clause 148 will abolish the only real safeguards we have, and the only governance body that keeps an eye on that balance. This debate is not about where that balance ought to be; it is about making sure that there is some process to ensure that the balance is kept under independent review at a time when surveillance technologies and usage are developing incredibly rapidly.

I am sure that the Minister is going to argue that, as he said at Second Reading:

“Abolishing the Surveillance Camera Commissioner will not reduce data protection”.—[Official Report, 19/12/23; col. 2216.]


He is no doubt going to tell us that the roles of the commissioner will be adequately covered by the ICO. To be honest, that completely misses the point. Surveillance is not just a question of data protection; it is a much wider question of privacy. Yes, the ICO may be able to manage the pure data protection matters, but it cannot possibly be the right body to keep the whole question of surveillance and privacy intrusion, and the related technologies, under independent review.

It is also not true that all the roles of the commissioner are being transferred to other bodies. The report by the Centre for Research into Surveillance and Privacy, or CRISP, commissioned by the outgoing commissioner, is very clear that a number of important areas will be lost, particularly reviewing the police handling of DNA samples, DNA profiles and fingerprints; maintaining an up-to-date surveillance camera code of practice with standards and guidance for practitioners and encouraging compliance with that code; setting out technical and governance matters for most public body surveillance systems, including how to approach evolving technology, such as AI-driven systems including facial recognition technology; and providing guidance on technical and procurement matters to ensure that future surveillance systems are of the right standard and purchased from reliable suppliers. It is worth noting that it was the Surveillance Camera Commissioner who raised the issues around the use of Hikvision cameras, for example—not something that the ICO is likely to be able to do. Finally, we will also lose the commissioner providing reports to the Home Secretary and Parliament about public surveillance and biometrics matters.

Professor Sampson said, before he ended his time in office as commissioner:

“The lack of attention being paid to these important matters at such a crucial time is shocking, and the destruction of the surveillance camera code that we’ve all been using successfully for over a decade is tantamount to vandalism”.


He went on to say:

“It is the only legal instrument we have in this country that specifically governs public space surveillance. It is widely respected by the police, local authorities and the surveillance industry in general … It seems absolutely senseless to destroy it now”.


The security industry does not want to see these changes either, as it sees the benefits of having a clear code. The Security Systems and Alarms Inspection Board said:

“Without the Surveillance Camera Commissioner you will go back to the old days when it was like the ‘wild west’, which means you can do anything with surveillance cameras so long as you don’t annoy the Information Commissioner … so, there will not be anyone looking at new emerging technologies, looking at their technical requirements or impacts, no one thinking about ethical implications for emerging technologies like face-recognition, it will be a free-for-all”.


The British Security Industry Association said:

“We are both disappointed and concerned about the proposed abolition of the B&SCC. Given the prolific emergence of biometric technologies associated with video surveillance, now is a crucial time for government, industry, and the independent commissioner(s) to work close together to ensure video surveillance is used appropriately, proportionately, and most important, ethically”.


I do not think I can put it better than that.

While there may be better ways to achieve the appropriate safeguards than the current commissioner arrangement, this Bill simply abolishes everything that we have now and replaces the safeguards only partially, and only from a data protection perspective. I am open to discussion about how we might fill the gaps, but the abolition currently proposed by the Bill is a massively retrograde and even dangerous step, removing the only safeguards we have against the uncontrolled creep towards ever more intrusive surveillance of innocent people. As technology increases the scope for surveillance, this must be the time for greater safeguards and more independent oversight, not less. The abolition of the commissioner and code should not happen unless there are clear, better, safeguards established to replace it, and this Bill simply does not do that.

--- Later in debate ---
Lord Holmes of Richmond (Con)

My Lords, I thank all noble Lords who participated in the excellent debate on this set of amendments. I also thank my noble friend the Minister for part of his response; he furiously agreed with at least a substantial part of my amendments, even though he may not have appreciated it at the time. I look forward to some fruitful and positive discussions on some of those elements between Committee and Report.

When a Bill passes into statute, a Minister and the Government may wish for a number of things in terms of how it is seen and described. One thing that I do not imagine is on the list is for it to be said that this statute generates significant gaps—those words were put perfectly by the noble Viscount, Lord Stansgate. That it generates significant gaps is certainly the current position. I hope that we have conversations between Committee and Report to address at least some of those gaps and restate some of the positions that exist, before the Bill passes. That would be positive for individuals, citizens and the whole of the country. For the moment, I beg leave to withdraw my amendment and look forward to those subsequent conversations.

Amendment 197A withdrawn.

Artificial Intelligence (Regulation) Bill [HL]

Lord Holmes of Richmond Excerpts
Moved by
Lord Holmes of Richmond

That the Bill be now read a second time.

Lord Holmes of Richmond (Con)

My Lords, I declare my technology interests as adviser to Boston Ltd. I thank all noble Lords who have signed up to speak; I eagerly anticipate all their contributions and, indeed, hearing from my noble friend the Minister. I also thank all the organisations that got in contact with me and other noble Lords for their briefings, as well as those that took time to meet me ahead of this Second Reading debate. Noble Lords and others who would like to follow this on social media can use #AIBill #AIFutures.

If we are to secure the opportunities and control the challenges of artificial intelligence, it is time to legislate and to lead. We need something that is principles-based and outcomes-focused, with inputs that are transparent, permissioned, understood and, wherever applicable, paid for.

There are at least three reasons why we should legislate on this: social, democratic and economic. On reason one, the social reason, some of the greatest benefits we could secure from AI come in this space, including truly personalised education for all, and healthcare. We saw only yesterday the exciting early results from the NHS Grampian breast-screening AI programme. Then there are mobility and net zero sustainability.

Reason two is about our democracy and jurisdiction. With 40% of the world’s democracies going to the polls this year, with deepfakes, cheap fakes, misinformation and disinformation, we are in a high-threat environment for our democracy. As our 2020 Democracy and Digital Technologies Select Committee report put it, with a proliferation of misinformation and disinformation, trust will evaporate and, without trust, democracy as we know it will simply disappear.

On our jurisdiction and system of law, the UK has a unique opportunity at this moment. We do not have to fear being in the first-mover spotlight—the EU has taken that with its Act, in all its 892 pages. The US has had the executive order but has yet to commit fully to this phase. The UK, with our common-law tradition, respected right around the world, has such an opportunity to legislate in a way that will be adaptive, versatile and able to develop through precedent and case law.

On reason three, our economy, PwC’s AI tracker says that by 2030 there will be a 14% increase in global GDP, worth $15.7 trillion. The UK must act to ensure our share of that AI boom. To take just one technology, the chatbot global market grew tenfold in just four years. The Alan Turing Institute report on AI in the public sector, which came out just this week, says that 84% of government services, across more than 200 different services, could benefit from AI automation. Regulated markets perform better. Right-sized regulation is good for innovation and good for inward investment.

Those are the three reasons. What about three individual impacts of AI right now? What if we find ourselves on the wrong end of an AI decision in a recruitment shortlisting, the wrong end of an AI decision in being turned down for a loan, or, even worse, the wrong end of an AI decision when awaiting a liver transplant? All these are illustrations of AI impacting individuals, often when they would not even know that AI was involved. We need to put paid to the myth, the false dichotomy, that you must have either heavy, rules-based regulation or a free hand—that we have to pay tribute to the cry of the frontierists in every epoch: “Don’t fence me in”. Right-sized regulation is good socially, democratically and economically. Here is the thing: AI is to human intellect what steam was to human strength. You get the picture. Steam literally changed time. It is our time to act, and that is why I bring this Bill to your Lordships’ House today.

In constructing the Bill, I have sought to consult widely, to be very cognisant of the Government’s pro-innovation White Paper, of all the great work of BCS, technology, industry, civil society and more. I wanted the Bill to be threaded through with the principles of transparency and trustworthiness; inclusion and innovation; interoperability and international focus; accountability and assurance.

Turning to the clauses, Clause 1 sets up an AI authority. Lest any noble Lord suddenly feels that I am proposing a do-it-all, huge, cumbersome regulator, I am most certainly not. In many ways, it would not be much bigger in scope than what the DSIT unit is proposing: an agile, right-sized regulator, horizontally focused to look across all existing regulators, not least the economic regulators, to assess their competency to address the opportunities and challenges presented by AI and to highlight the gaps. And there are gaps, as rightly identified by the excellent Ada Lovelace Institute report. For example, where do you go if you are on the wrong end of that AI recruitment shortlisting decision? It must have the authority, similarly, to look across all relevant legislation—consumer protection and product safety, to name but two—to assess its competency to address the challenges and opportunities presented by AI.

The AI authority must have at its heart the principles set out in Clause 2: it must be not just the custodian of those principles, but a very lighthouse for them, and it must have an educational function and a pro-innovation purpose. Many of those principles will be very recognisable; they are taken from the Government’s White Paper but put on a statutory footing. If they are good enough to be in the White Paper, we should commit to them, believe in them and know that they will be our greatest guides for the positive path forward, when put in a statutory framework. We must have everything inclusive by design, and with a proportionality thread running through all the principles, so none of them can be deployed in a burdensome way.

Clause 3 concerns sandboxes, so brilliantly developed in the UK in 2016 with the fintech regulatory sandbox. If you want a measure of its success, it is replicated in well over 50 jurisdictions around the world. It enables innovation in a safe, regulated, supported environment: real customers, real market, real innovations, but in a splendid sandbox concept.

Clause 4 sets up the AI responsible officer, to be conceived of not as a person but as a role, to ensure the safe, ethical and unbiased deployment of AI in her or his organisation. It does not have to be burdensome, or a whole person in a start-up; but that function needs to be performed, with reporting requirements under the Companies Act that are well understood by any business. Again, crucially, it must be subject to that proportionality principle.

Clause 5 concerns labelling and IP, which is such a critical part of how we will get this right with AI. Labelling: so that if anybody is subject to a service or a good where AI is in the mix, it will be clearly labelled. AI can be part of the solution to providing this labelling approach. Where IP or third-party data is used, that has to be reported to the AI authority. Again, this can be done efficiently and effectively using the very technology itself. On the critical question of IP, I met with 25 organisations representing tens of thousands of our great creatives: the people that make us laugh, make us smile, challenge us, push us to places we never even knew existed; those who make music, such sweet music, where otherwise there may be silence. It is critical to understand that they want to be part of this AI transformation, but in a consented, negotiated, paid-for manner. As Dan Guthrie, director-general of the Alliance for Intellectual Property, put it, it is extraordinary that businesses together worth trillions take creatives’ IP without consent and without payment, while fiercely defending their own intellectual property. This Bill will change that.

Clause 6 concerns public engagement. For me, this is probably the most important clause in the Bill, because without public engagement, how can we have trustworthiness? People need to be able to ask, “What is in this for me? Why should I care? How is this impacting my life? How can I get involved?” We need to look at innovative ways to consult and engage. A good example, in Taiwan, is the Alignment Assemblies, but there are hundreds of novel approaches. Government consultations should have millions of responses, because this is both desirable and now, with the technology, analysable.

Clause 7 concerns interpretation. At this stage, I have drawn the definitions of AI deliberately broadly. We should certainly debate this, but as set out in Clause 7, much would and should be included in those definitions.

Clause 8 sets out the potential for regulating for offences and fines thereunder, to give teeth to so much of what I have already set out and, rightly, to pay the correct respect to all the devolved nations. So, such regulations would have to go through the Scottish Parliament, Senedd Cymru and the Northern Ireland Assembly.

That brings us to Clause 9, the final clause, which makes this a UK-wide Bill.

So, that is the Bill. We know how to do this. Just last year, the Electronic Trade Documents Act showed that we know how to legislate for the possibilities of these new technologies; and, my word, we know how to innovate in the UK—Turing, Lovelace, Berners-Lee, Demis at DeepMind, and so many more.

If we know how to do this, why are we not legislating? What will we know in, say, 12 months’ time that we do not know now about citizens’ rights, consumer protection, IP rights, being pro-innovation, labelling and the opportunity to transform public engagement? We need to act now, because we know what we need to know—if not now, when? The Bletchley summit last year was a success. Understandably, it focused on safety, but having done that it is imperative that we stand up all the other elements of AI already impacting people’s lives in so many ways, often without their knowledge.

Perhaps the greatest and finest learning from Bletchley is not so much the safety summit but what happened there two generations before, when a diverse team of talent gathered and deployed the technology of their day to defeat the greatest threat to our civilisation. Talent and technology brought forth light in one of the darkest hours of human history. As it was in Bletchley in the 1940s, so it is in the United Kingdom in the 2020s. It is time for human-led, human-in-the-loop, principle-based artificial intelligence. It is time to legislate and to lead; for transparency and trustworthiness, inclusion and innovation, interoperability and international focus, accountability and assurance; for AI developers, deployers and democracy itself; for citizens, creatives and our country—our data, our decisions, #ourAIfutures. That is what this Bill is all about. I beg to move.

--- Later in debate ---
Lord Holmes of Richmond (Con)

My Lords, I thank all noble Lords who have contributed to this excellent debate. It is pretty clear that the issues are very much with us today and we have what we need to act today. To respond to a question kindly asked by the noble Lord, Lord Davies of Brixton, in my drafting I am probably allowing “relevant” regulators to do some quite heavy lifting, but what I envisage within that is certainly all the economic regulators, and indeed all regulators who are in a sector where AI is being developed, deployed and in use. Everybody who has taken part in this debate and beyond may benefit from having a comprehensive list of all the regulators across government. Perhaps I could ask that of the Minister. I think it would be illuminating for all of us.

At the autumn FT conference, my noble friend the Minister said that heavy-handed regulation could stifle innovation. Certainly, it could. Heavy-handed regulation would not only stifle innovation but be a singular failure of the regulatory process itself. History tells us that right-size regulation is pro-citizen, pro-consumer and pro-innovation; it drives innovation and inward investment. I was taken by so much of what the Ada Lovelace Institute put in its report. The Government really have given themselves all the eyes and not the hands to act. It reminds me very much of a Yorkshire saying: see all, hear all, do nowt. What is required is for these technologies to be human led, in our human hands, and human in the loop throughout. Right-size regulation, because it is principles-based, is necessarily agile and adaptive, and can move as the technology moves. It should be principles-based and outcomes-focused, with inputs that are transparent, understood, permissioned and, wherever and whenever applicable, paid for.

My noble friend the Minister has said on many occasions that there will come a time when we will legislate on AI. Let 22 March 2024 be that time. It is time to legislate; it is time to lead.

Bill read a second time and committed to a Committee of the Whole House.

Digital Exclusion (Communications and Digital Committee Report)

Lord Holmes of Richmond Excerpts
Thursday 8th February 2024

Lords Chamber
Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to take part in this debate, and I declare my technology interests as set out in the register as adviser to Boston Ltd. I congratulate my noble friend Lady Stowell, not just on this report but on what, as she rightly set out, has been quite a week for the committee. It shows how all these issues touch so many elements of our society and our economy.

I will pick out the most significant recommendation from the committee’s report, and that is the need for a strategy. Some 10 years ago, I sat on my first House of Lords Select Committee on Digital Skills. We began our work in 2014 and we finished in 2015 by talking of a strategy, and yet the strategy that was published before we began our deliberations is still the strategy we have today. There are clearly good examples of operational delivery happening across the country and across Whitehall, but without a strategy it is de facto non-strategic. Does the Minister not agree and see the urgent and pressing need to have a strategy so that we can be strategic about our approach?

A lifetime ago, E. M. Forster entreated us to “Only connect!” These words can be helpful on our journey through digital inclusion, because we have never been more connected. Yet look at the rise of populism, nationalism and retreatism—horrific geopolitical outlooks. We have never been more connected, and yet we have an epidemic of isolation and a mental well-being crisis.

Why? There are many factors, but a significant one is that this is connection without inclusion, enablement and empowerment. Those who find themselves at the sharpest end of digital exclusion are often those who would have the most to gain—older people, disabled people and those in lower socioeconomic groups. Even when they manage to get online, for want of information, only 5% of those who could take advantage of social tariffs do so.

Take the example of a grocery shopping app. Imagine how you place your order and it comes to your door. Many people are familiar with this and the convenient offers within it, and it is a great way to shop. But imagine that you are not online, and so you cannot use that app; imagine that you have the app but do not have the digital skills to transact, and so it is not working for you; imagine that you have the skills but have no or low internet connectivity, and so the app is not working for you. For want of digital inclusion, more than a few folks are not getting their food.

Digital inclusion is about the stuff, the kit, the skills and the learning. It is about enabling people to have the comfort and confidence that come through digital inclusion in all aspects of their lives. It needs leadership at a national and local level. The hubs are an excellent suggestion, where they exist, but we need more of them on a strategic basis.

Everything starts with education. Does my noble friend the Minister agree that, if we are really to enable digital inclusion, and enable and empower our young people to take all the advantages of the fourth industrial revolution, we need a complete overhaul of our school curriculum? Great stuff happens, if you are lucky, but it is not strategic and it is in pockets.

The PISA results on numeracy and literacy are impressive. We know how to take this performance approach, but alongside numeracy and literacy we need data literacy, data competency, digital literacy and digital competency. They are equally important. Then young people will be enabled and empowered, with all their talent unleashed through digital inclusion and education.

Taking this broader, we have nothing short of an opportunity to completely reimagine our economy and society and the very social contract, transformed into a digital social contract between citizen and state, for the benefit of both.

I finish on the essence of all this—inclusion and innovation. More than that, it is inclusion for innovation. That must be the mission of us all.

Lord Vaizey of Didcot (Con)

I wonder if I can just slip in before Members on the Front Bench speak, particularly those who have signed the amendment. I refer again to my register of interests.

I support the principle that lies behind these amendments and want to reinforce the point that I made at Second Reading and that I sort of made on the first day in Committee. Any stray word in the Bill when enacted will be used by those with the deepest pockets—that is, the platforms—to hold up action against them by the regulator. I read this morning that the CMA has resumed its inquiry into the UK cloud market after an eight-month hiatus based on a legal argument put by Apple about the nature of the investigation.

It seems to me that Clause 19(5) is there to show the parameters on which the CMA can impose an obligation to do with fair dealing and open choices, and so on. It therefore seems that “proportionate”—or indeed perhaps even “appropriate”—is unnecessary because the CMA will be subject to judicial review on common-law principles if it makes an irrational or excessive decision and it may be subject to a legal appeal if people can argue that it has not applied the remedy within the parameters set by paragraphs (a), (b) and (c) of Clause 19(5). I am particularly concerned about whether there is anything in the Bill once enacted that allows either some uncertainty, which can be latched on to, or appeals—people refer to “judicial review plus” or appeals on the full merits, which are far more time-consuming and expensive and which will tie the regulator up in knots.

Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to take part in day two of Committee on the DMCC Bill. Again, I declare my interest as an adviser to Boston Limited.

It is a pleasure to follow the introduction from my noble friend Lord Faulks. I think it is highly appropriate that we discuss proportionality. I have a number of amendments in my name in this group: Amendments 33, 52 and 220, and then the rather beautifully double-Nelsonian Amendment 222. Essentially, a considerable amount of work needs to be done before we can have proportionality going through the Bill in its current form. My amendments suggest not only addressing that but looking at countervailing benefits exemptions and financial penalties.

Agreeing with pretty much everything that has been said, and with the tone and spirit of all the amendments that have been introduced thus far, I will limit my remarks to Amendment 222. It suggests that regulations bringing into force Clauses 19, 21, 46 and 86

“may not be made until the Secretary of State has published guidance”

going into the detail of how all this will operate in reality.

Proportionality is obviously a key element, as has already been discussed, but this is just as important, as we will come on to in the next group. My Amendment 222 straddles the groups a bit, under the vagaries of grouping amendments, but it is nevertheless all the better for it.

I look forward to hearing my noble friend the Minister’s response on proportionality, countervailing benefits exemptions and financial penalties, and on the need for clear, detailed guidance to come from the Secretary of State before any moves are made in any and all of these areas.

Lord Black of Brentwood (Con)

My Lords, I am afraid I am going to play the role of Little Sir Echo here. I hope that the unanimity expressed so far will send a strong message to my noble friend the Minister. I support Amendment 16 in the name of the noble Lord, Lord Faulks, to which I have added my name, and Amendments 17, 53 and 54. I note my interests as declared at the start of Committee.

As I made clear in my remarks on Second Reading, we must, throughout the consideration of the Bill, steadfastly avoid importing anything into the CMA and DMU procedures that would allow the platforms to deploy delaying tactics and tie up the regulators in endless legal knots. Long legal wrangling will destroy the very essence of the Bill, and it is not mere speculation to suggest that this might happen. As we have seen elsewhere in the world, and indeed in publishers’ own existing dealings with the platforms, we do not need to gaze into a crystal ball; we can, as the noble Lord, Lord Tyrie, put it the other day, read the book.

In that light, as we have heard consistently this afternoon, I fear that the government amendments made in the other place, requiring the conduct requirements and PCIs to be proportionate rather than appropriate, do just that. They impose significant restrictions on the work of the CMA and, as an extremely helpful briefing—which I think all Members have had—from Which? put it, produce “a legal quagmire” that would allow the unaccountable platforms

“with their vast legal budgets … to push back against each and every decision the regulator takes”.

It is simply counterintuitive to the design of the flexible and participatory framework the legislation portends. As my noble friend Lady Stowell said, it certainly makes me very nervous.

The key point is that introducing the concept of proportionality is, frankly, totally otiose, as the noble Lord, Lord Faulks, put it so well, as proportionality is already tested by judicial review—something the CMA itself has already reiterated. The courts, in this novel area of legislation, will rely on Parliament clearly to state its intentions. Introducing the concept of proportionality not only is unnecessary but in fact muddies the waters and creates confusion that will be mercilessly used by the platforms. It certainly does not produce clarity. The Government really must think again.

--- Later in debate ---
Lord Vaizey of Didcot (Con)

I want to get all the heckles out of the way; they have to be recorded in Hansard. I listened to the Minister’s explanation very carefully. He said that there is no single accepted definition of “proportionate”—that there are different definitions depending on case law and the common law. Is that not exactly what the problem is? The minute you put that word in the clause, you have, effectively, said that there are eight, seven or six definitions of proportionate. Guess what the platforms will do with that.

Lord Holmes of Richmond (Con)

May I build on that before my noble friend the Minister responds? What precisely was inappropriate about “appropriate”?

Lord Black of Brentwood (Con)

My Lords, this is not just to prevent the Minister getting up again; it is relevant to both points that have just been made. A number of noble Lords asked whether this huge volte-face by the Government between the publication of the Bill and the amendments made very late in the other place came about as a result of pressure from the platforms. Could he tell us whether the platforms lobbied for this change and whether he discussed it with them?

--- Later in debate ---
Lord Holmes of Richmond (Con)

My Lords, I shall speak briefly to this group of amendments and particularly commend those in the name of the noble Baroness, Lady Jones.

There are key themes that inevitably run through deliberations across groups in Committee, and it seems that, this afternoon, a recurrent theme has understandably been that the Bill is certainly better as was than currently as is. A number of amendments make that point very firmly.

If the Bill does not address at every point necessary the whole question of asymmetry in the nature of the relationship between the parties in all these complex arrangements, there is precious little point in proceeding beyond this point. The whole nature of the relationship and the negotiations therein is framed by the asymmetry of power, of resources and of what can be brought to bear by each party to proceedings. Hence, in this set of amendments, while different approaches are taken, similar ends are sought.

I look forward to hearing the Minister’s response and, as the noble Lord, Lord Clement-Jones, is “not” here, I also look forward very much to him “not” intervening on the Minister.

Digital Markets, Competition and Consumers Bill

Lord Holmes of Richmond Excerpts
Moved by
15: Clause 19, page 11, line 1, leave out subsections (5) to (8)
Lord Holmes of Richmond (Con)

My Lords, it is a pleasure to take part in this first day of Committee on the Bill. As it is my first time speaking in Committee, I declare my technology interests as set out in the register, not least as an adviser to Boston Limited. In moving Amendment 15, I will also speak to Amendment 24, and I am very interested in the other amendments in this group.

Much of the discussion so far rests on the most important point of all when it comes to legislating. It reminds me of many of the discussions that we had in this very Room last year on the Financial Services and Markets Bill, as it was then, about accountability, the role of the Secretary of State and the role of the regulators. Much of this Bill as drafted, if not a pendulum, simultaneously swings significant powers to the regulator, and indeed to the Secretary of State. But the question that needs continually to come up in our deliberations in Committee and beyond is where Parliament is in this process. We hear every day how the physical building itself is crumbling, in desperate need of repair and of a decant, but, when it comes to this Bill, Parliament has already disappeared.

There is a massive need for accountability in many of the Bill’s clauses. Clause 19 is just one example, which is why my Amendment 15 seeks to take out a chunk of it to help in this process. Later in Committee, we will hear other amendments on parliamentary accountability. It is not only essential but, as has already been mentioned, goes to the heart of a trend that is happening across legislation, in different spheres, where huge powers are being given to our economic regulators without the right level of accountability.

What we saw as one of the major outputs of FSMA 2023, as it now is, was a new parliamentary committee: the financial services and markets committee. In many ways, you can see this as a process that may be repeated, positively so, across a number of areas—including competition—if this approach to legislation is perpetuated. I look forward to my noble friend the Minister’s response to my Amendment 15 on that issue.

I move on to Amendment 24, which concerns a very different but critical area. It seeks to amend Clause 20, which makes brief mention of the accessibility of the information pertaining to these digital activities but is silent on the accessibility of the digital activities themselves. Does my noble friend the Minister agree that we need more on the face of the Bill when it comes to accessibility? With more services—critical parts of our lives—moving on to these digital platforms, it is essential that they are accessible to all users.

I use the term “user” deliberately because, as we have heard in previous debates, there is a great need for clarity around this legislation. “User” is used—indeed, peppered—throughout the legislation. This is right in that “user” is a term that would be understood across the country; however, it does not appear in the title of the Bill, which is at least interesting. We must ensure that all users or consumers are able to access all these digital platforms and services fully. Let us take banking as an example. It is far more difficult to get face-to-face banking services and access to cash, so much more has moved online. However, if those services are not accessible, what use are they to people who, having been physically excluded, are now financially and digitally excluded as well?

When it comes to sporting events, mention has been made of sport in our debates on earlier amendments. I think everyone in the Committee would agree that VAR has not demonstrated technology at its brightest and best in the sporting context. I wonder whether, if we turned referees entirely into bots, there would be similar questions about the bot’s visual acuity whenever its decisions went against our team. If we are to have so many ticketing services for sporting, musical and cultural events available largely, if not exclusively, online—and if, at the front end of that process, there is the all-too-familiar CAPTCHA, which we must go through to prove that we are not yet a bot—what will happen if that is not accessible? We will not get tickets.

I put it to my noble friend the Minister that there needs to be more in Clause 20 and other parts of the Bill around the accessibility of those digital services, activities and platforms. If we could fully embrace the concept of “inclusive by design”, this would evaporate as an issue. I beg to move.

Lord Clement-Jones Portrait Lord Clement-Jones (LD)
- Hansard - - - Excerpts

My Lords, this is quite a group of amendments. Clearly, it will take a bit of time to work our way through all of them. It is a pleasure to follow the noble Lord, Lord Holmes, who is so knowledgeable about digital matters—I thought that he would slip something about the digital aspects of sport into his introduction.

I am in curate’s egg country, as far as the two amendments in the name of the noble Lord are concerned. I am not quite sure about Amendment 15, but I look forward to the Minister’s response. I think Amendment 24 is absolutely spot on and really important. I hope that the noble Lord succeeds in putting it into the Bill, eventually.

I will start by speaking to Amendments 21, 28 and 55 on interoperability, Amendment 30 on copyright and Amendment 20 in the name of the noble Lord, Lord Lansley. I will refer to Amendment 32 in the name of the noble Viscount, Lord Colville, but I will not speak on it for too long, because I do not want to steal his thunder. If possible, I will also speak to the amendments in the names of the noble Baroness, Lady Jones, and the noble Lord, Lord Vaizey, on leveraging. They are crucial if the Bill is to be truly effective.

Interoperability is the means by which websites work with one another as part of the fundamental web architecture. Current problems arise when SMS players, those with strategic market status, make browser changes and interfere with open web data, such as header bidding, which is used for interoperability among websites. Quality of service and experience can be misused for the benefit of the platforms; they can degrade the interoperability of different systems, or make video or audio quality higher or lower for the benefit of their own apps and products.

At Second Reading, my noble friend Lord Fox reminded us that Professor Furman, in evidence in Committee in the Commons, said that intervention on interoperability is a vital remedy. My noble friend went on to say that interfering with interoperability in all its forms should be policed by the CMA, which should be

“proactive with respect to promoting international standards and aiming to create that interoperability: for a start, by focusing on open access and operational transparency, working for standards that allow unrestricted participation and favouring the technologies and protocols that prevent a single person or group amending or reversing transactions executed and recorded”.—[Official Report, 5/12/23; col. 1396.]

At my noble friend’s request, the Minister, the noble Viscount, Lord Camrose, followed up with a letter on the subject on 7 December. He said:

“Standards are crucial to building the UK’s economic prosperity, safeguarding the UK’s national security, and protecting the UK’s norms and values. The Government strongly supports a multi-stakeholder approach to the development of technical standards, and it will be important that the CMA engages with this process where appropriate. The UK’s Plan for Digital Regulation, published in 2021, confirms the importance of considering standards as a complement or alternative to traditional regulation”.


It is good to see the Minister’s approach, but it is clear that there should be a stronger and more explicit reference to the promotion of interoperability in digital markets. The Bill introduces an interoperability requirement under Clause 20(3)(e) but, as it stands, this is very vague. Interoperability should be defined and the purpose of the requirement should be outlined; namely, to promote competition and innovation, so that content creators can provide their services across the world without interference and avoid platform dependency.

I move to Amendment 30. Breach of copyright online is a widespread problem. The noble Baroness, Lady Kidron, referred to the whole IP issue, which is increasing in the digital world, but the current conduct requirements are not wide enough. There should be a simple obligation on those using others’ copyright to request the use of that material. As the NMA says, the opacity of large language models is a major stumbling block when it comes to enforcing rights and ensuring consumer safety. AI developers should be compelled to make information about systems more readily available and accessible. Generative outputs should include clear and prominent attributions, which flag the original sources of the output. This is notable in the EU’s proposed AI Act. This would allow citizens to understand whether the outputs are based on reliable information, apart from anything else.

If publishers are not fairly compensated for the use of the content by generative AI systems in particular—I look towards the noble Lord, Lord Black, at this point—and lose audiences to them, it will harm publisher sustainability and see less money invested in quality journalism. In turn, less trusted content will be available to train and update AI systems, harming innovation and increasing the chance that these systems produce unreliable results.

--- Later in debate ---
Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- Hansard - -

I thought I would wait, just in case the noble Lord, Lord Clement-Jones, wanted to come in before the Minister sat down.

It has been an excellent debate, covering a wide range of connected issues, and I thank all noble Lords who have spoken and the Minister for his response. All the issues are connected by so many of the fundamentals that underpin not just this Bill but the entirety of this digital project that we are all on: accessibility, interoperability, inclusion and intellectual property. I do not think we should ever stop mentioning copyright and intellectual property in these discussions; they are absolutely critical and are being decimated in so many ways this very day.

Data, as was so eloquently set out by my noble friend Lord Lansley, is part of the critical underpinning. What is any of this without data? I certainly think that what we do not want to do with the Bill, as the Minister set out, is to come up with a definition of interoperability that is not interoperable—that would be an unfortunate slip of the pen. All these issues need to be at the forefront of all our deliberations; they unite all the amendments in this group and should unite all our thoughts. They are the key threads that will make a success not only of this Bill but of everything that we are trying to achieve with this digital project.

I know we are going to return to a number of these issues as we progress through Committee and into Report, but at this point—beating the Division Bell, still—I beg to withdraw my amendment.

Amendment 15 withdrawn.
Lord Holmes of Richmond Portrait Lord Holmes of Richmond (Con)
- View Speech - Hansard - -

My Lords, it is a pleasure to take part on Second Reading; I declare my interests in financial services and technology, in Ecospend Ltd and Boston Ltd. There is a fundamental truth at the heart of our deliberations, both on Second Reading and as we progress to Committee: that it is our data. There are no large language models without our data; perhaps it would be more appropriate to call them large data models—maybe then they would be more easily and quickly understood by more people. Ultimately, our data is going into AI for potentially positive and transformational purposes, but only if there is consent, understanding, trustworthiness and a real connection between the purpose to which the AI is being put and those of us whose data is being put into it.

I am going to focus on four areas: data adequacy, which has already, understandably, been heavily mentioned; AI; smart data; and digital ID. I can probably compress everything I was going to say on the first subject by simply asking my noble friend the Minister: how will the Bill assure adequacy between the UK and the EU? It is quite a large Bill—as other noble Lords have commented—yet it still has a number of gaps that I am sure we will all be keen to fill in fully when we return in 2024. As already mentioned, AI is nothing without data, so what checks are being put in place for many of the suggestions throughout the Bill where AI is used to interrogate individuals’ data? Would it not be absolutely appropriate for there to be effective, clear and transparent labelling of all AI uses, not least in the public sector but across the private sector too? Saying this almost feels like going off track from the Bill into AI considerations, but it seems impossible to consider the Bill without seeing how it is inextricably linked to AI and the pro-innovation AI White Paper published earlier this year. Does the Minister not agree? How much line-by-line analysis has been done of the Bill to ensure that there is coherence between the Government’s ambitions for AI and what is currently set out in this Bill?

On smart data, there are clearly extraordinary opportunities, but they are not inevitabilities. To take just one sector, energy: being able to deploy customers’ data in real time—through their smart meters, for example—with the potential to auto-shift to the cheapest tariff could be extraordinarily positive. But, again, that is only if there is an understanding of how the consent mechanisms will work and how each citizen is enabled to understand that it is their data. There are potentially huge opportunities, not least to do something significant about the poverty premium, where all too often those who have the least are forced to pay the most, often for essential services such as energy. What are the Government doing to look at additional sectors for smart data deployment? What areas of state activity, current or previous, are being considered for the deployment of smart data? What stage is that analysis at?

On digital ID, about which I have spoken a lot over previous years, again there are huge opportunities and possibilities. I welcome what is in the Bill around the potential use of digital ID in property transactions. This could be an extraordinarily positive development. What other areas are being looked at for potential digital ID usage? What stage is that analysis at? Also, is what is set out in the Bill coherent with other government work in other departments on digital ID? It seems that a lot has been done and there have been a number of efforts from various Administrations on digital ID, but we are yet to realise the prize it could bring.

I will ask my noble friend some questions in conclusion. First, how will the introduction of the senior responsible individual, the SRI, improve things compared with the data protection officer? Again, how will that impact on issues such as, but not limited to, adequacy? Similarly, linking back to artificial intelligence, a key principle—though not foolproof by any measure and certainly not a silver bullet, but important none the less—is the human in the loop. The Bill is currently some way short of a clear, effective definition and exposition of how meaningful human intervention, involvement and oversight will work where autonomous systems are at play. What are the Government’s plans to address that significant gap in the Bill as currently drafted?

I end where I began, with the simple truth that it is our data. Data has been described in various terms, not least as the new oil, but that definition gets us nowhere; it is so much more profound than that. Ultimately, data is part of us and, when put together in combination, it comes so close to giving a detailed, personal and almost complete picture of us—the digital twin, if you will. Are the Government content that the Bill does everything necessary for it to be seen as trustworthy, for it to be understood that this is our data and our decision, and for us to decide what data to deploy, for what purpose, to whom and for what period of time? It is our data.