AI Systems: Risks

Debate between Lord Leong and Lord Clement-Jones
Thursday 8th January 2026

Grand Committee

Lord in Waiting/Government Whip (Lord Leong) (Lab)

My Lords, I acknowledge the interest of the noble Lord, Lord Fairfax, in this area of artificial intelligence and congratulate him on securing this important, wide-ranging and timely debate. I thank all noble Lords who have contributed to it. I will use the time available to set out the Government’s position and respond to noble Lords’ questions. If I am unable to answer all of them because of time constraints, I will go through Hansard, write to noble Lords and place a copy in the Library.

Recent remarks by the director-general of MI5 highlight that advanced AI is now more than just a technical matter: it has become relevant to national security, the economy and public safety. The warning was not about panic or science fiction scenarios; it was about responsibility. As AI systems grow more capable and more autonomous, we must ensure that the scale and speed at which they function remain within human control.

The future prosperity of this country will be shaped by science, technology and AI. The noble Lord, Lord Goldsmith, is absolutely right that we have to look at the advantages that AI brings to society and to us all. That is why we have announced a new package of reforms and investment to use AI as a driver of national renewal. But we must be, and are, clear-eyed about this. As the noble Baroness, Lady Browning, mentioned, we cannot unlock the opportunities unless AI is safe for the public and businesses, and unless the UK retains real agency over how the technology is developed and deployed.

That is exactly the approach that this Government are taking. We are acting decisively to make sure that advanced AI remains safe and controllable. I give credit to the previous Government for establishing the world-leading AI Security Institute to deepen our scientific understanding of the risks posed by frontier AI systems. We are already taking action on emerging risks, including those linked to AI chatbots. The institute works closely with AI labs to improve the safety of their systems, and has now tested more than 30 frontier models.

That work is not academic. Findings from those tests are being used to strengthen real-world safeguards, including protections against cyber risks. Through close collaboration with industry, the national security community and our international partners, the Government have built a much deeper and more practical understanding of AI risks. We are also backing the institute’s alignment project, which will distribute up to £15 million to fund research to ensure that advanced AI systems remain controllable and reliable and follow human instructions, even as they become more powerful.

Several noble Lords mentioned the potential of artificial general intelligence and artificial superintelligence, as well as the risks that they could pose. There is considerable debate around the timelines for achieving both, and some experts believe that AGI could be reached by 2030. We cannot be sure how AI will develop and impact society over the next five, 10 or 20 years, or perhaps even sooner than that. Navigating this future will require evidence-based foresight to inform action, technical solutions and global co-ordination. Without a shared scientific understanding of these systems, we risk underreacting to threats or overcorrecting against innovation.

Through close collaboration with companies, the national security community and our international partners, the Government have deepened their understanding of such risks, and AI model security has improved as a result. The Government will continue to take a long-term, science-led approach to understand and prepare for risks emerging from AI. This includes preparing for the possibility of rapid AI progress, which could have transformative impacts on society and national security.

We are not complacent. Just this month, the Technology Secretary confirmed in Parliament that the Government will look at what more can be done to manage the risks posed by AI chatbots. She also urged Ofcom to use its existing powers to ensure that any chatbots in scope of the Online Safety Act are safe for children. Some noble Lords may be aware that today Ofcom has the power to impose sanctions on companies of up to 10% of their revenue or £18 million, whichever is greater.

Several noble Lords have touched on regulation, and I will just cover it now. We are clear that AI is a general-purpose technology, with uses across every sector of the economy. That is why we believe most AI should be regulated at the point of use. Existing frameworks already matter. Data protection and equality law protect people’s rights and prevent discrimination when AI systems are used to make decisions about jobs, credit or access to services.

We also know that regulators need to be equipped for what is coming. That is why we are working with them to strengthen their capabilities and ensure they are ready to deal with the challenges that AI presents.

Security does not stop at our borders. The UK is leading internationally and driving collaboration with allies to raise standards, share scientific insight and shape responsible global norms for frontier AI. We lead discussions on AI at the G7, the OECD and the United Nations, and we are strengthening bilateral partnerships, including our ongoing collaboration with India as we prepare for the AI Impact Summit in Delhi next month. I hope this provides assurance to the noble Viscount, Lord Camrose. The AI Security Institute will continue to play a central role globally, including leading the International Network for Advanced AI Measurement, Evaluation and Science, helping to set best practice for model testing and safety worldwide.

In an AI-enabled world, it matters who owns the infrastructure, builds the models and controls the data. That increasingly shapes our lives. That is why we have established a Sovereign AI Unit, backed by around £500 million, to support UK start-ups across the AI ecosystem and ensure that Britain has a stake at every layer of the AI stack.

Several noble Lords asked about our dependence on US companies. Our sovereignty goals should indeed be resilience and strategic advantage, not total self-reliance. We have to face the fact that US companies are the main providers of today’s frontier model capabilities. Our approach is to ensure that the UK can use the best models in the world while protecting UK interests. To achieve this, we have established strategic partnerships with leading frontier model developers, such as the memoranda of understanding with Anthropic, OpenAI and Cohere, to ensure resilient access and influence the development of such capabilities.

We are also investing in advanced, AI-based compute so that researchers can work on national priorities. We are creating AI growth zones across the country to unlock gigawatts of capacity by 2030. Through our advanced market commitment, we are helping promising UK firms to scale, win global business and deploy British chips alongside established providers.

We are pursuing AI safety with such determination, because it is what unlocks opportunity. The UK should not be an AI taker. Businesses and consumers need confidence that AI systems are safe and reliable and do what they are supposed to do. That is why our road map to trusted third-party AI assurance is so important. Trust is what turns safety into growth. It is what allows innovation to flourish.

In January we published the AI Opportunities Action Plan, setting out how we will seize the benefits of this technology for the country. We will train 7.5 million workers in essential AI skills by 2030, equip 1 million students with AI and digital skills, and support talented undergraduates and postgraduates through scholarships at leading STEM universities. I hope this will be welcomed by the noble Lord, Lord Clement-Jones.

Lord Clement-Jones (LD)

My Lords, at that point, I will just interrupt the Minister before the witching hour. The Minister has reiterated the approach to focus governance on the user—that is, the sectoral approach—but is that not giving a free pass to general AI developers?

Lord Leong (Lab)

My Lords, I will quickly respond. We have to be very careful about the level at which we regulate. AI is multifaceted: the stack has different layers, including the infrastructure, the data and the models. At which level are we going to regulate? We are trying to work with the community to find out what is best before we come up with a solution as far as regulation is concerned. AI is currently regulated at different user levels.

Let me continue. We are appointing AI champions to work with industry and government to drive adoption in high-growth sectors. Our new AI for Science Strategy will help accelerate breakthroughs that matter to people’s lives.

In summary, safe and controllable AI does not hinder progress; rather, it underpins it. It is integral to our goal of leveraging AI’s transformative power and securing the UK’s role as an AI innovator, not merely a user. Safety is not about stopping progress. It is about maintaining human control over advances. The true danger is not overthinking this now; it is failing to think enough.

Subscription Contracts: Right to Cancel

Debate between Lord Leong and Lord Clement-Jones
Tuesday 2nd December 2025

Lords Chamber

Lord Leong (Lab)

The noble Lord will know that, under the Consumer Contracts (Information, Cancellation and Additional Charges) Regulations 2013, consumers already have a cooling-off period for distance contracts, so this is not new for the sector. The digital content waiver is long established, and most charitable memberships are service contracts, not digital content. We consulted on extending the waiver but decided against it, as that would reduce consumer rights. Having said that, gambling is excluded due to the existing specialist regulations. We recognise the concerns raised by charities and heritage organisations about potential misuse and will continue to work closely with charities as we finalise the secondary legislation.

Lord Clement-Jones (LD)

My Lords, Section 267 gives the Government the clear ability to use the regulation-making powers to recognise the specific circumstances of particular services, such as streaming, charitable memberships and, of course, the news media. Do the Government intend to make distinctions between those sectors? If so, will the Government make sure that streaming services, the charitable sector and, indeed, the news media are protected from early termination?

Lord Leong (Lab)

The noble Lord makes an interesting point. Let us look at the policy. We are talking about unwanted subscriptions, which account for some £1.6 billion a year. This Act will save consumers some £14 a month, which is about £147 million a year. As it stands, charities have to comply with consumer law irrespective of charitable status. Companies, especially digital service organisations, have the legislation that is currently in place, so that will stay as it is. The cooling-off period under the new Act simply extends what already applies to distance contracts to contracts made in person.

Space Debris

Debate between Lord Leong and Lord Clement-Jones
Tuesday 18th November 2025

Lords Chamber

Lord Clement-Jones (LD)

My Lords, I am glad that the Minister and the noble Viscount mentioned the most recent report of the Select Committee. Given what the Minister has said about active debris removal and the European Space Agency, what are the Government doing to ensure that the cost of end-of-life compliance is met by commercial satellite operators and not from the public purse?

Lord Leong (Lab)

The noble Lord makes a very interesting point. The Government are currently funding innovation in debris mitigation and removal. We support research and development through UKRI and Innovate UK and are funding private companies such as Astroscale and ClearSpace to carry out in-orbit servicing trials. As far as cleaning up debris in both inner and outer orbits is concerned, space is global and we have to work with our global partners to address this issue. Conversations are ongoing as to who will pay for it.

AI: Workforce Training

Debate between Lord Leong and Lord Clement-Jones
Monday 3rd November 2025

Lords Chamber

Lord Leong (Lab)

I thank the noble Viscount for that. At the end of the day, the fact is that AI is now central to the UK’s growth strategy. The results are very clear: UK AI companies deliver some £11.8 billion in gross value added, revenues are up 68% and over 86,000 people now work in the sector.

As for the question itself, the point here is that we need to address the skills gap. AI is already changing the way we work, and we need to support everyone in this country in adopting AI skills. We also need a plan to tackle market challenges and ensure that people right across the UK are ready for the future.

Lord Clement-Jones (LD)

My Lords, I declare an interest as a consultant at DLA Piper on AI policy and regulation. This year the Government have chosen to devolve responsibility for digital boot camps, which in previous years have helped thousands of participants develop new digital skills. There is a new technical funding guide, but what guarantee of funding for future years do providers and local authorities have, and what consistency of procurement is there? For instance, what core requirement is there for the essential AI training content to be carried? At the very minimum, it should include AI literacy and understanding and critical thinking skills.

Lord Leong (Lab)

The noble Lord made several points there; I will address the point about AI gaps in the workforce. The Government are actively assessing AI skills gaps and taking action to close them. My department regularly reviews the AI labour market and has commissioned new research, due to be released later this year. We are working with the Department for Education and Skills England to map pathways into AI roles. We recently announced a joint commitment with industry to upskill some 7.5 million workers.

Amazon Web Services

Debate between Lord Leong and Lord Clement-Jones
Tuesday 21st October 2025

Lords Chamber

Lord Leong (Lab)

My Lords, in respect of the noble Viscount’s point about cost, this happened just yesterday so, of course, we are still working it through; it will take us some time to evaluate how much it will cost the economy. I am sure that economists will be kept very busy for some time working out the costs and the impact on productivity.

We are already taking steps to strengthen the resilience of the UK’s digital infrastructure. Through the national cyber strategy and the national resilience framework, we are working with the National Cyber Security Centre to treat major cloud service providers as part of our critical national infrastructure. This includes measures to ensure that they have robust redundancy, back-up and incident response capabilities in place. At the same time, we are consulting with industry on enhanced incident reporting and transparency requirements so that the Government can be alerted immediately to any service disruption that could have national impact.

Lord Clement-Jones (LD)

My Lords, at the very least, this should be a wake-up call for the Government. It is clear that the Government have been overdependent on two US cloud service providers, which, as the Competition and Markets Authority says, have 70% to 90% of the market, and restrictive practices impede competition. Of course, there is now a sovereign AI unit within DSIT. Will government procurement policy now change to encourage UK cloud service providers, which would then help to deliver sovereign AI? Will the Government also encourage the CMA to act rapidly, given this lack of competition?

Lord Leong (Lab)

I thank the noble Lord for those points. The Government are aware and are taking cybersecurity seriously. That is why we have published a number of strategies and are working with the National Cyber Security Centre, as I mentioned earlier. The noble Lord also mentioned procurement and the service providers. The three providers I just mentioned—Amazon Web Services, Microsoft Azure and Google Cloud—probably have something like 60% of the market share. Yes, we have other small, independent providers as well but, at the same time, procurement is dependent on government departments: on how they want to procure their services and from where. The basic point is that, going forward, we have to ensure that it is safe and resilient.

Online Safety Act 2023: Virtual Private Networks

Debate between Lord Leong and Lord Clement-Jones
Monday 15th September 2025

Lords Chamber

Lord Clement-Jones

To ask His Majesty’s Government what assessment they have made of the increased use of virtual private networks since the implementation of age verification requirements for access to primary priority content under the Online Safety Act 2023.

Lord in Waiting/Government Whip (Lord Leong) (Lab)

My Lords, the Government and Ofcom are monitoring the potential impact of circumvention techniques on the effectiveness of the Online Safety Act, especially since the child safety duties came into effect in July 2025. Services promoting VPN use to bypass age checks could face enforcement action. These duties represent a major milestone in protecting children online, making it harder for children to access harmful content. We must allow sufficient time for these measures to embed before considering further action.

Lord Clement-Jones (LD)

My Lords, there are concerns and some misinformation circulating about VPNs and other aspects of the Act. In this light, is the Minister confident that the Act is still fit for purpose, and that platforms have a clear existing responsibility to prevent children bypassing safety protections? Does all this not mean that Parliament needs an early chance at post-legislative scrutiny of the implementation and operation of the Act to ensure, in particular, that it fulfils its aims of keeping users, particularly children, safe online while preserving free speech for adults?

Lord Leong (Lab)

My Lords, the Online Safety Act places very clear duties on platforms to protect children, including tackling methods of circumvention. The use of VPNs to bypass safeguards is a known risk, and platforms must act decisively. They are already required to assess such risks and implement proportionate measures. Ofcom will hold platforms to account. The Act requires Ofcom to produce and publish a report assessing how effective the use of age assurance has been and whether any factors prevented or hindered its effective use. This report will be published by June 2026.

Jaguar Land Rover Cyberattack

Debate between Lord Leong and Lord Clement-Jones
Wednesday 10th September 2025

Lords Chamber

Lord Leong (Lab)

I thank the right reverend Prelate for that question. In 2024, the National Cyber Security Centre managed hundreds of incidents, 89 of which were nationally significant attacks. In 2025, the cybersecurity breaches survey shows that just less than half of businesses, about 43%, and around one-third of charities, about 30%, reported having experienced a cybersecurity breach or attack in the past 12 months. Cyberattacks do not happen just to big companies; they hit companies of every size and type, and we have to be vigilant about that. The Government see the UK cybersecurity sector as a driving force in widening opportunities for our citizens. We have to ensure that this is protected. The Government have a plan and are working across departments to put a Bill together, and we hope that parliamentary time will allow us to bring it forward.

Lord Clement-Jones (LD)

My Lords, I express my appreciation of the work of the noble Baroness, Lady Jones, which the Minister mentioned, and I wish her well in her non-ministerial capacity. Given reports that the attack has been claimed by hacker groups linked to Scattered Spider, which I believe is also responsible for recent attacks on UK retailers, including Marks & Spencer, what enhanced intelligence-sharing mechanisms are the Government establishing between business sectors to prevent co-ordinated attacks by the same threat actors?

Lord Leong (Lab)

My Lords, I am sure that the noble Lord will appreciate that there is only so much I can say about what the Government are doing, but I assure him that the Government are speaking to businesses of all types through various business organisations. The National Cyber Security Centre is working with businesses. It has previously worked with M&S and the Co-op and is now working with JLR to provide support in relation to whatever incidents have happened, including the current incident. As I said, we cannot comment further on specifics at this stage, including with regard to potential perpetrators. The National Crime Agency has warned of a rise in teenage boys being drawn into online criminal communities and is co-ordinating responses to online harm networks across the United Kingdom.

Mathematical Sciences

Debate between Lord Leong and Lord Clement-Jones
Wednesday 2nd April 2025

Lords Chamber

Lord Clement-Jones (LD)

My Lords, in addition to the cuts mentioned by the noble Lord, Lord Waldegrave, the Government have withdrawn funding from the planned national academy for the mathematical sciences, but polling among employers for the Maths Horizons project found that maths skills are increasingly in demand. Do we not badly need a national strategy for maths, as the Campaign for Mathematical Sciences is calling for?

Lord Leong (Lab)

I thank the noble Lord for the question. There is nothing to cancel. The national academy was devised by the previous Government, who allocated £6 million towards it without the funding being properly in place. The money was not there in the first place, and £6 million was a meagre figure, whereas we are spending more and more money on other learned societies. It is as if I want to buy a £5 million penthouse around the corner, and I go to the estate agent and say, “I would like to buy it but I don’t have money allocated for it”. There is nothing to cancel.

Product Security and Telecommunications Infrastructure (Security Requirements for Relevant Connectable Products) (Amendment) Regulations 2024

Debate between Lord Leong and Lord Clement-Jones
Monday 10th February 2025

Grand Committee

Lord Leong (Lab)

My Lords, I thank the noble Lords, Lord Clement-Jones and Lord Sharpe, for their contributions.

I will first address the question asked by the noble Lord, Lord Clement-Jones: why the delay? As the noble Lord, Lord Sharpe, mentioned, it was a result of the general election. At the same time, we were waiting for the Department for Transport to progress UN regulation No. 155, until such time as we knew that we must take this exception out of the current regulations. That is the reason for the delay, basically; it was also about finding parliamentary time to table these regulations. That is that on the delay.

Lord Clement-Jones (LD)

I am sorry to interrupt the Minister but, frankly, this is the same instrument as the one that was debated last May. Nothing has changed apart from the lack of parliamentary time. We could have done this in September, October or whenever. I forget quite when we had the King’s Speech—in July? We could have done this at any time in the past few months.

--- Later in debate ---
Lord Leong (Lab)

This is beyond my pay grade, I am afraid. I will need to ask my leader, the Chief Whip, why we could not allocate any parliamentary time for this legislation.

As far as personal data is concerned, the GDPR is still the lead legislation. I respectfully say to the noble Lord that, for the purposes of today’s regulations, the whole issue of such data is outside the scope of this instrument for now. However, I am sure that we will be talking about personal data in the months and, probably, years to come in other forms of legislation, or even about it being regulated itself.

Lord Clement-Jones (LD)

Out of scope? On the basis that we are being asked to exempt automated vehicles, is it not proper that we ask for reassurance about automated vehicles and the implications for safety, data or whatever else? We are exempting them from these connected product regulations, so we need to be reassured that there are other ways of regulating them other than through these regulations. So this is not out of scope; the debate is about whether we should be exempting them.

Lord Leong (Lab)

I take the point, but the instrument is about the two amendments to the regulations. I take the noble Lord’s point about data. Yes, it is important, and we must preserve the data, but this instrument is not within that scope.

Moving on to cybersecurity within autonomous vehicles, cybersecurity is at the heart of the Government’s priorities for the rollout of all self-driving vehicles. The Automated Vehicles Act 2024 enables an obligation to be placed on those responsible for self-driving vehicles to maintain a vehicle’s software and ensure that appropriate cybersecurity measures are in place throughout its service life.

In response to the point made by the noble Lord, Lord Sharpe, about innovation, the Government are committed to supporting the development and deployment of self-driving vehicles in the UK. Our permissive trialling regime means that self-driving cars, buses and freight vehicles are already on UK roads with safety drivers. The Automated Vehicles Act will pave the way to scale deployments beyond trials. The Act delivers one of the most comprehensive legal frameworks of its kind anywhere in the world for self-driving vehicles, with safety at its core. It sets out clear legal responsibilities, establishes a safety framework and creates the necessary powers to regulate this new industry.

On the point about cybersecurity from the noble Lord, Lord Clement-Jones, the Government take national security extremely seriously and are actively monitoring threats to the UK. The Department for Transport works closely with the transport sector, the National Cyber Security Centre and other government departments to understand and respond to cybersecurity issues associated with connected vehicles. UN regulation No. 155 more comprehensively addresses the cybersecurity risks associated with automotive vehicles and has adequate provisions to deal with the prospect of self-driving vehicles. The PSTI regime is designed for consumer connectable devices or products and is not fully equipped to address the specific needs and complexities of vehicle cybersecurity. UN regulation No. 155, which was developed through international collaboration, provides a more suitable and rigorous framework for ensuring the security of vehicles.

More everyday products than ever are now connected to the internet. The Government have taken action to ensure that UK consumers and businesses purchasing consumer connectable products are better protected from the risks of cyberattack, fraud, or even, in the most serious cases, physical danger. The PSTI product security regulatory regime builds on the ETSI international standard and is the first of its kind in the world to come into force.

The cybersecurity regulatory landscape will continue to evolve. The Government need to be agile to ensure that there is synergy between existing and new laws. Through this draft instrument, the Government are delivering on the commitment in 2021 to except certain categories of automotive vehicles from the scope of the PSTI products security regulatory regime. This is because the Government, via the Department for Transport, are in the process of introducing sector-specific regulations that have been developed at an international level to address the cybersecurity of these products. These requirements, which are specifically tailored to these vehicles and their functionality, will create a more precise regime for the sector. This draft instrument therefore ensures that the automotive industry, which contributed £13.3 billion to the economy in 2022, will not be placed under undue burdens from dual regulations.

Lord Clement-Jones (LD)

My Lords, the Minister has not mentioned the point raised in the Explanatory Memorandum, which was designed, I think, to give us comfort about cybersecurity and data: the Government’s Connected and Automated Vehicles: Process for Assuring Safety and Security—CAVPASS—which I mentioned. I did not hear him give us an assurance that that will be developed during 2025 to ensure the safety and cybersecurity of self-driving vehicles. As well as reiterating that the GDPR is an absolutely splendid way of regulating these automated vehicles, I hope that he will reiterate that this will be produced, because I have had a look at what CAVPASS currently says in the area of data, and it is not very much. After all, these connected regulations from which we are exempting automated vehicles are about safety, data and everything else.

Lord Leong (Lab)

My Lords, the noble Lord makes a very important point. Rather than waiting for my officials to give me a briefing note, I will ensure that I write to him on all the points that he has just mentioned.

Public Authority Algorithmic and Automated Decision-Making Systems Bill [HL]

Debate between Lord Leong and Lord Clement-Jones
Lord in Waiting/Government Whip (Lord Leong) (Lab)

My Lords, I thank the noble Lord, Lord Clement-Jones, for bringing forward the important issue of public sector algorithmic transparency for debate, both today and through the Data (Use and Access) Bill, and I thank the noble Earl, Lord Effingham, for his contribution.

The algorithmic transparency recording standard, or ATRS, is now mandatory for government departments. It is focused, first, on the 16 largest departments, including HMRC; some 85 ALBs; and local authorities. It has also now been endorsed by the Welsh Government. While visible progress on enforcing this mandate was slow for some time, new records are now being added to the online repository at pace. The first batch of 14 was added in December and a second batch of 10 was added just last week. I am assured that many more will follow shortly.

The blueprint for modern digital government, as mentioned by the noble Lord, Lord Clement-Jones, was published on 21 January, promising explicitly to commit to transparency and accountability by building on the ATRS. The blueprint also makes it clear that part of the new Government Digital Service role will be to offer specialist assurance support, including a service to rigorously test models and products before release.

The Government share the desire of the noble Lord, Lord Clement-Jones, to see algorithmic tools used in the public sector safely and transparently, and they are taking active steps to ensure that that happens. I hope that reassures the noble Lord, and I look forward to continuing to engage with him on this important issue.

Lord Clement-Jones (LD)

My Lords, I thank the noble Earl for taking the trouble to read my Bill quite carefully. I shall obviously dispute various aspects of it with him in due course; however, I welcome the fact that he has taken the trouble to look at its provisions. I thank the Minister for his careful reply. I do not think that the Government are going far enough, but time will tell.

Data (Use and Access) Bill [HL]

Debate between Lord Leong and Lord Clement-Jones
Lord Leong (Lab)

My understanding is that “customer” reflects an individual, but I am sure that the Minister will give a better explanation at the meeting with officials next week.

Lord Clement-Jones (LD)

Again before the Minister sits down—I am sure he will not be able to sit down for long—would he open that invitation to a slightly wider group?

Lord Leong (Lab)

I thank the noble Lord for that request, and I am sure my officials would be willing to do that.

Data (Use and Access) Bill [HL]

Debate between Lord Leong and Lord Clement-Jones
Lord Clement-Jones (LD)

My Lords, the problem is that I have a 10-minute speech and there are five minutes left before Hansard leaves us, so is it sensible to draw stumps at this point? I have not counted how many amendments I have, but I also wish to speak to the amendment by the noble and learned Lord, Lord Thomas. I would have thought it sensible to break at this point.

Lord Leong (Lab)

That is a sensible suggestion.

Data Protection and Digital Information Bill

Debate between Lord Leong and Lord Clement-Jones
Lord Clement-Jones (LD)

My Lords, having been involved in and seen the campaigning of the bereaved families and the noble Baroness, Lady Kidron, in particular in the Joint Committee on the Draft Online Safety Bill onwards, I associate myself entirely with the noble Baroness’s statement and with my noble friend Lord Allan’s remarks.

Lord Leong (Lab)

My Lords, I thank the Minister for setting out the amendment and all noble Lords who spoke. I am sure the Minister will be pleased to hear that we support his Amendment 236 and his Amendment 237, to which the noble Baroness, Lady Kidron, has added her name.

Amendment 236 is a technical amendment. It seeks the straightforward deletion of words from a clause, accounting for the fact that investigations by a coroner, or procurator fiscal in Scotland, must start upon them being notified of the death of a child. The words

“or are due to conduct an investigation”

are indeed superfluous.

We also support Amendment 237. The deletion of this part of the clause would bring into effect a material change. It would empower Ofcom to issue a notice to an internet service provider to retain information in all cases of a child’s death, not just cases of suspected suicide. Sadly, as many of us have discovered in the course of our work on this Bill, there is an increasing number of ways in which communication online can be directly or indirectly linked to a child’s death. These include areas of material that is appropriate for adults only; the inability to filter harmful information, which may adversely affect mental health and decision-making; and, of course, the deliberate targeting of children by adults and, in some cases, by other children.

There are adults who use the internet with the intention of doing harm to children through coercion, grooming or abuse. What initially starts online can lead to contact in person. Often, this will lead to a criminal investigation, but, even if it does not, the changes proposed by this amendment could help prevent additional tragic deaths of children, not just those caused by suspected child suicides. If the investigating authorities have access to online communications that may have been a contributing factor in a child’s death, additional areas of concern can be identified by organisations and individuals with responsibility for children’s welfare and action taken to save many other young lives.

Before I sit down, I want to take this opportunity to say a big thank you to the noble Baroness, Lady Kidron, the noble Lord, Lord Kennedy, and all those who have campaigned on this issue relentlessly and brought it to our attention.

--- Later in debate ---
Lord Clement-Jones (LD)

My Lords, I will speak to all the amendments in this group, other than Amendment 295 from the noble Baroness, Lady Jones. Without stealing her thunder, I very much support it, especially in an election year and in the light of the deepfakes we have already seen in the political arena—those of Sadiq Khan, those used in the Slovakian election and the audio deepfakes of the President of the US and Sir Keir Starmer. This is a real issue and I am delighted that she has put down this amendment, which I have signed.

In another part of the forest, the recent spread of deepfake photos purporting to show Taylor Swift engaged in explicit acts has brought new attention to the use, which has been growing in recent years, of deepfake images, video and audio to harass women and commit fraud. Women constitute 99% of the victims and the most visited deepfake site had 111 million users in October 2023. More recently, children have been found using “declothing” apps, which I think the noble Baroness mentioned, to create explicit deepfakes of other children.

Deepfakes also present a growing threat to elections and democracy, as I have mentioned, and the problems are increasingly rampant. Deepfake fraud rates rose by 3,000% globally in 2023, and it is hardly surprising that, in recent polling, 86% of the UK population supported a ban on deepfakes. I believe that the public are demanding an urgent solution to this problem. The only effective way to stop deepfakes, which is analogous to what the noble Baroness, Lady Kidron, has been so passionately advocating, is for the Government to ban them at every stage, from production to distribution. Legal liability must hold to account those who produce deepfake technology, create and enable deepfake content, and facilitate its spread.

Existing legislation seeks to limit the spread of images on social media, but this is not enough. The recent images of Taylor Swift were removed from X and Telegram, but not before one picture had been viewed more than 47 million times. Digital watermarks are not a solution, as shown by a paper by world-leading AI researchers released in 2023, which concluded that

“strong and robust watermarking is impossible to achieve”.

Without measures across the supply chain to prevent the creation of deepfakes, the law will forever be playing catch-up.

The Government now intend to ban the creation of sexual imagery deepfakes; I welcome this and have their announcement in my hand:

“Government cracks down on ‘deepfakes’ creation”.


This will send a clear message that the creation of these intimate images is not acceptable. However, this appears to cover only sexual image deepfakes. These are the most prevalent form of deepfakes, but other forms of deepfakes are also causing noticeable and rapidly growing harms, most obviously political deepfakes—as the noble Baroness, Lady Jones, will illustrate—and deepfakes used for fraud. This also appears to cover only the endpoint of the creation of deepfakes, not the supply chain leading up to that point. There are whole apps and companies dedicated to the creation of deepfakes, and they should not exist. There are industries which provide legitimate services—generative AI and cloud computing—which fail to take adequate measures and end up enabling the creation of deepfakes. They should take measures or face legal accountability.

The Government’s new measures are intended to be introduced through an amendment to the Criminal Justice Bill, which is, I believe, currently between Committee and Report in the House of Commons. As I understand it, however, there is no date scheduled yet for Report, as the Bill seems to be caught in a battle over amendments.

The law will, however, be extremely difficult to enforce. Perpetrators are able to hide behind anonymity and are often difficult to identify, even when victims or authorities are aware that deepfakes have been created. The only reliable and effective countermeasure is to hold the whole supply chain responsible for deepfake creation and proliferation. All parties involved in the AI supply chain, from AI model developers and providers to cloud compute providers, must demonstrate that they have taken steps to preclude the creation of deepfakes. This approach is similar to how society combats—or, rather, analogous to the way that I hope the Minister will concede to the noble Baroness, Lady Kidron, society will combat—child abuse material and malware.

Lord Leong (Lab)

My Lords, I speak to Amendments 293 and 294 from the noble Lord, Lord Clement-Jones, Amendment 295 proposed by my noble friend Lady Jones and Amendments 295A to 295F, also in the name of the noble Lord, Lord Clement-Jones.

Those noble Lords who are avid followers of my social media feeds will know that I am an advocate of technology. Advanced computing power and artificial intelligence offer enormous opportunities, which are not all that bad. However, the intentions of those who use them can be malign or criminal, and the speed of technological developments is outpacing legislators around the world. We are constantly in danger of creating laws that close the stable door long after the virtual horse has bolted.

The remarkable progress of visual and audio technology has its roots in the entertainment industry. It has been used to complete or reshoot scenes in films in the event of actors being unavailable, or in some cases, when actors died before filming was completed. It has also enabled filmmakers to introduce characters, or younger versions of iconic heroes for sequels or prequels in movie franchises. This enabled us to see a resurrected Sir Alec Guinness and a younger version of Luke Skywalker, or a de-aged Indiana Jones, on our screens.

The technology that can do this is only around 15 years old, and until about five years ago it required extremely powerful computers, expensive resources and advanced technical expertise. The first malicious use of deepfakes occurred when famous actors and celebrities, usually women, had their faces superimposed on to the bodies of participants in pornographic videos. These were then marketed online as Hollywood stars’ sex tapes or similar, making money for the producers while causing enormous distress to the women targeted. More powerful computer processors inevitably mean that what was once very expensive becomes much cheaper very quickly. An additional factor has turbo-boosted this issue: generative AI. Computers can now learn to create images, sound and video movement almost independently of software specialists. It is no longer just famous women who are the targets of sexually explicit deepfakes; it could be anyone.

Amendment 293 directly addresses this horrendous practice, and I hope that there will be widespread support for it. In an increasingly digital world, we spend more time in front of our screens, getting information and entertainment on our phones, laptops, iPads and smart TVs. What was once an expensive technology, used to titillate, entertain or for comedic purposes, has developed an altogether darker presence, well beyond the reach of most legislation.

In addition to explicit sexual images, deepfakes are known to have been used to embarrass individuals, misrepresent public figures, enable fraud, manipulate public opinion and influence democratic political elections and referendums. This damages people individually: those whose images or voices are faked, and those who are taken in by the deepfakes. Trusted public figures, celebrities or spokespeople face reputational and financial damage when their voices or images are used to endorse fake products or for harvesting data. Those who are encouraged to click through are at risk of losing money to fraudsters, being targeted for scams, or having their personal and financial data leaked or sold on. There is growing evidence that information gathered under false pretences can be used for profiling in co-ordinated misinformation campaigns, for darker financial purposes or for political exploitation.

In passing, it is worth remembering that deepfakes are not always images of people. Last year, crudely generated fake images of an explosion, purportedly at the Pentagon, caused the Dow Jones industrial average to drop 85 points within four minutes of the image being published, and triggered emergency response procedures from local law enforcement before it was debunked 20 minutes later. The power of a single image, carefully placed and virally spreading, shows the enormous and rapid economic damage that deepfakes can create.

Amendment 294 would make it an offence for a person to generate a deepfake for the purpose of committing fraud, and Amendment 295 would make it an offence to create deepfakes of political figures, particularly when they risk undermining electoral integrity. We support all the additional provisions in this group of amendments; Amendments 295A to 295F outline the requirements, duties and definitions necessary to ensure that those creating deepfakes can be prosecuted.

I bring to your Lordships’ attention the wording of Amendment 295, which, as well as making it an offence to create a deepfake, goes a little further. It also makes it an offence to send a communication which has been created by artificial intelligence and which is intended to create the impression that a political figure has said or done something that is not based in fact. This touches on what I believe to be a much more alarming aspect of deepfakes: the manner in which false information is distributed.

Digital Markets, Competition and Consumers Bill

Debate between Lord Leong and Lord Clement-Jones
Lord Clement-Jones (LD)

My Lords, it is a pleasure to follow the noble Lord, Lord Lucas, with all his experience as a fund manager, and particularly to hear what he forecast for the future: the ability of AI to deliver information in a new format that is of much greater interest and use to a consumer. I must admit, I had not really thought about that.

It is also a pleasure to follow my noble friend Lady Sheehan, and in particular to support the noble Baroness, Lady Wheatcroft, on her amendment. We are obviously saving the best for last in contributing to our final group in Committee. As a former company secretary, I well remember the noble Baroness as a financial journalist and an absolute champion of corporate governance. This appears to be an absolutely crucial part of it. In a sense, it is the other side of the coin from what you expect of the corporate; it is what you expect of those who invest in the corporate, in terms of exercising their voting rights. The noble Baroness illustrated the sorry history of the voluntary approach put forward by the FCA. I could loosely describe her amendment as trying to put some lead in the FCA’s pencil, which seems wholly needed.

The noble Baroness asked a number of further questions. A really interesting and important question is: how on earth can the US, with its relatively unregulated systems compared to ours and its culture of not regulating on a federal basis, do it on a compulsory basis when we have not? Particularly from what the noble Lord, Lord Lucas, said, it sounds as though it will be eminently possible to do this, as the technology improves, without overly imposing costs on investment managers. Indeed, it is already being done for those operating in the states. There seems absolutely no reason why the Government should not move forward in the way that the noble Baroness suggests.

Lord Leong (Lab)

My Lords, I thank the noble Baroness, Lady Wheatcroft, for tabling Amendment 212, and I thank all noble Lords who have spoken. I will be brief.

In 2019, the European Union introduced the second shareholder rights directive, which sets out stipulations regarding the utilisation of specific shareholder privileges linked to voting shares during general meetings of companies that are headquartered in a member state and have their shares traded on a regulated market located or functioning within a member state. It was brought into UK law by secondary legislation, amending the occupational pension schemes regulations of 2005, and it has now been assimilated into UK law. As per the Explanatory Notes to the regulations, they encourage investors to be transparent about how they invest and approach their engagement as shareholders. It was a negative statutory instrument, so no debates were tabled.

The amendment of the noble Baroness, Lady Wheatcroft, carries greater weight than the shareholder rights directive. It would mandate the FCA to establish regulations necessitating investment managers and life insurers to furnish standardised reports concerning company voting activities upon request. Furthermore, it would instruct the FCA to offer guidance to firms on the specific format for such reporting.

We agree in principle with the amendment: it is right for shareholders to be more transparent. The noble Baroness, Lady Sheehan, mentioned being transparent about where investments are made, which we need to know if we are to achieve net zero. This was fully supported by the noble Lord, Lord Lucas. Fund managers need to be more transparent in informing us where their funds are invested.

I ask the Minister: what impact has there been on investor transparency in the four and a half years that the SRD has been in UK law? I look forward to his response.

Digital Markets, Competition and Consumers Bill

Debate between Lord Leong and Lord Clement-Jones
Lord Clement-Jones (LD)

My Lords, once again, with the indulgence of the Committee, I will speak on behalf of my noble friend Lady Bakewell to Amendments 125, 126 and 127.

Before doing so, I say that I support the amendments of the noble Lord, Lord Lucas, which strike me as extremely practical. It must be extremely frustrating when faced with some of the restrictions. This point about vehicles seems to me a particular irritant for trading standards officers—a vehicle being defined as premises. What era are we living in?

We need to bring the powers of trading standards officers up to the 21st century, which is very much the spirit in which Amendments 125, 126 and 127 have been tabled by the noble Earl, Lord Lindsay, my noble friend Lady Bakewell and the noble Baroness, Lady Crawley. Amendment 125 would delete paragraph 17 of Schedule 5 to the Consumer Rights Act, which at present requires trading standards officers to exercise physical powers of entry to premises—this is in the digital age—before accessing information and seizing documents that may be needed in criminal proceedings. Accepting this amendment would be an opportunity to finally update the powers of trading standards in this respect. It would have the effect of changing their information-gathering powers to enable documents requested in writing, without the need for physical entry, to be used in criminal proceedings. It would also relieve the undue burdens placed on businesses and trading standards officers.

For legitimate businesses there is presently the burden of having to interrupt their normal business to provide the requested documents there and then, whereas, under what is proposed in this amendment, if the request is made in writing rather than physically, they will have more time to source the required documents and even seek legal advice should they wish to. For the small band of trading standards officers, the requirement to exercise physical powers of entry across the country to seize documents they may need to use in criminal proceedings is not cost-effective for their cash-strapped local authorities. If a local authority in, say, my noble friend’s Somerset had to deal with a case in Cumbria, it would simply not be viable for this to happen. The criminal activity could go unpunished and the public and consumer would still be at risk from rogue-trader activity.

In the impact assessment for the Bill, it is accepted that:

“Consumer rights must keep pace with market innovations, so that consumers remain confident engaging with businesses offering new products and services”.


That is a good statement, but for this sort of consumer confidence to become more robust, the enforcement powers of trading standards need to be seriously updated and not inhibited by the present inflexibility.

Amendments 126 and 127 propose to substitute the words “England or Wales” and “Scotland” for the words “United Kingdom” in paragraph 44(3) and 44(2) of Schedule 5 to the Consumer Rights Act. The effect of these amendments would be to add a new paragraph to Schedule 16 to the Bill, which would give new powers to trading standards officers to operate across UK national borders where necessary. Cross-border activities should be included in the Bill; current legislation does not make it clear that trading standards officers in England and Wales can exercise their powers across the border with Scotland, or vice versa, even though consumer protection is a reserved power. In fact, the current legislation implies that this cross-border enforcement activity is not permitted, and we are told that, currently, trading standards officers err on the side of caution. Who can blame them in the circumstances? For the success of these new powers and the Bill to take root, trading standards officers should be able to pursue and enforce across the whole of the United Kingdom.

Lord Leong (Lab)

My Lords, I thank all noble Lords who have spoken. We are grateful to the noble Lord, Lord Lucas, the noble Earl, Lord Lindsay, the noble Baroness, Lady Bakewell, and my noble friend Lady Crawley for bringing forward this group of amendments relating to Schedule 16, which is introduced by Chapter 6, Clause 207. They seek to amend Schedule 5 to the Consumer Rights Act 2015.

Amendments 124A and 124B appear to add clarity without altering the intention of the Bill as written. Having said that, we would be interested to hear from the Minister whether there is any reason these changes should not be enacted.

Amendment 124C would make a more substantial change to financial penalties. The current level 3 fine is no deterrent to obstruction. A mere £1,000 is just petty cash for most businesses, whereas level 5, which is an unlimited fine, would serve as a deterrent and perhaps encourage some co-operation with investigations. We would like to hear from the Minister whether there has been any assessment of the suitability of obstruction being a level 3 fine since the Consumer Rights Act came into law in 2015. We also seek clarification on whether this is the right place to make such a change, given that its impact would be much wider.

Amendments 125, 126 and 127, tabled by the noble Earl, Lord Lindsay, with the support of my noble friend Lady Crawley and the noble Baroness, Lady Bakewell, make a lot of sense in pursuing investigations in all parts of the United Kingdom, not just England and Wales. That was succinctly explained by the noble Lord, Lord Clement-Jones, so I shall not repeat the point. This would obviously be a matter for the Scottish Government. If the Government agree on the merits, is this something they have discussed with their Scottish counterparts?

The amendments in this group are sensible and designed to be helpful. They should be supported. We look forward to the Minister’s response.

Digital Markets, Competition and Consumers Bill

Debate between Lord Leong and Lord Clement-Jones
Lord Clement-Jones (LD)

My Lords, this is quite a set of amendments and the Minister rather rattled through his speech, but I have only one question: why are they now being included in the Bill here in Committee? Why were they not in the original version of the Bill? What is the motivation behind these new amendments? I am always a little suspicious. With the data protection Bill coming down the track, we will have hours of endless excitement. The words “data protection” and “government” are sometimes a bit of a red rag, so one always has to kick the tyres quite hard on any provision that appears to be opening a door to disclosure of data and so on. Obviously, in a competition context, it is most likely to be commercial confidential information, but the Minister needs to explain what kind of information we are talking about and why we need to have these provisions included at this stage.

Lord Leong (Lab)

My Lords, I thank the Minister for his overview and explanation of the various government amendments. I look forward to his response to the question from the noble Lord, Lord Clement-Jones: why now? These are mainly technical and tidying-up amendments and we are in broad agreement with most of them in this group.

Amendment 217 makes it clear that any imposed or conferred duties to process information do not contravene data protection legislation. That is welcome. Amendment 213 ensures the disclosure of information under Chapter 2 of Part 5 of the Bill, which allows UK regulators to provide investigative assistance to overseas regulators. This is in line with the restrictions on the disclosure of certain kinds of information found in the Enterprise Act 2002, which is fine. I ask the Minister what assessments are in place to safeguard the sharing of such details with autocratic regimes, which may not have robust governance and accountability systems in place and whose values we do not share? On Amendment 218, I ask the Minister whether the intent is similar to that of Amendment 1, as set out so eloquently by my noble friend Lady Jones of Whitchurch on the first day of Committee?

Finally, I refer to Amendment 216, which replaces the definition of data protection legislation for the whole of the Bill, so the definitions in Amendments 73 and 208 are removed. Can the Minister confirm that such a definition is consistent with Article 8 of the European Convention on Human Rights and the Enterprise Act 2002? I look forward to the Minister’s response and comments.