My Lords, I acknowledge the interest of the noble Lord, Lord Fairfax, in this area of artificial intelligence and congratulate him on securing this important, wide-ranging and timely debate. I thank all noble Lords who have contributed to it. I will use the time available to set out the Government’s position and respond to noble Lords’ questions. If I am unable to answer all of them because of time constraints, I will go through Hansard, write to noble Lords and place a copy in the Library.
Recent remarks by the director-general of MI5 highlight that advanced AI is now more than just a technical matter: it has become relevant to national security, the economy and public safety. That warning was not about panic or science-fiction scenarios; it was about responsibility. As AI systems grow more capable and more autonomous, we must ensure that they operate at a scale and speed that remain within human control.
The future prosperity of this country will be shaped by science, technology and AI. The noble Lord, Lord Goldsmith, is absolutely right that we have to look at the advantages that AI brings to society and to us all. That is why we have announced a new package of reforms and investment to use AI as a driver of national renewal. But we must be, and are, clear-eyed about this. As the noble Baroness, Lady Browning, mentioned, we cannot unlock the opportunities unless AI is safe for the public and businesses, and unless the UK retains real agency over how the technology is developed and deployed.
That is exactly the approach that this Government are taking. We are acting decisively to make sure that advanced AI remains safe and controllable. I give credit to the previous Government for establishing the world-leading AI Security Institute to deepen our scientific understanding of the risks posed by frontier AI systems. We are already taking action on emerging risks, including those linked to AI chatbots. The institute works closely with AI labs to improve the safety of their systems, and has now tested more than 30 frontier models.
That work is not academic. Findings from those tests are being used to strengthen real-world safeguards, including protections against cyber risks. Through close collaboration with industry, the national security community and our international partners, the Government have built a much deeper and more practical understanding of AI risks. We are also backing the institute’s alignment project, which will distribute up to £15 million to fund research to ensure that advanced AI systems remain controllable and reliable and follow human instructions, even as they become more powerful.
Several noble Lords mentioned the potential of artificial general intelligence and artificial superintelligence, as well as the risks that they could pose. There is considerable debate around the timelines for achieving both, and some experts believe that AGI could be reached by 2030. We cannot be sure how AI will develop and affect society over the next five, 10 or 20 years, or perhaps even sooner than that. Navigating this future will require evidence-based foresight to inform action, technical solutions and global co-ordination. Without a shared scientific understanding of these systems, we risk underreacting to threats or overcorrecting against innovation.
Through close collaboration with companies, the national security community and our international partners, the Government have deepened their understanding of such risks, and AI model security has improved as a result. The Government will continue to take a long-term, science-led approach to understand and prepare for risks emerging from AI. This includes preparing for the possibility of rapid AI progress, which could have transformative impacts on society and national security.
We are not complacent. Just this month, the Technology Secretary confirmed in Parliament that the Government will look at what more can be done to manage the risks posed by AI chatbots. She also urged Ofcom to use its existing powers to ensure that any chatbots in scope of the Online Safety Act are safe for children. Some noble Lords may be aware that today Ofcom has the power to impose sanctions on companies of up to 10% of their revenue or £18 million, whichever is greater.
Several noble Lords have touched on regulation, and I will just cover it now. We are clear that AI is a general-purpose technology, with uses across every sector of the economy. That is why we believe most AI should be regulated at the point of use. Existing frameworks already matter. Data protection and equality law protect people’s rights and prevent discrimination when AI systems are used to make decisions about jobs, credit or access to services.
We also know that regulators need to be equipped for what is coming. That is why we are working with them to strengthen their capabilities and ensure they are ready to deal with the challenges that AI presents.
Security does not stop at our borders. The UK is leading internationally and driving collaboration with allies to raise standards, share scientific insight and shape responsible global norms for frontier AI. We lead discussions on AI at the G7, the OECD and the United Nations, and we are strengthening bilateral partnerships, including our ongoing collaboration with India as we prepare for the AI Impact Summit in Delhi next month. I hope this provides assurance to the noble Viscount, Lord Camrose. The AI Security Institute will continue to play a central role globally, including leading the International Network for Advanced AI Measurement, Evaluation and Science, helping to set best practice for model testing and safety worldwide.
In an AI-enabled world, it matters who owns the infrastructure, builds the models and controls the data. That increasingly shapes our lives. That is why we have established a Sovereign AI Unit, backed by around £500 million, to support UK start-ups across the AI ecosystem and ensure that Britain has a stake at every layer of the AI stack.
Several noble Lords asked about our dependence on US companies. Our sovereignty goals should indeed be resilience and strategic advantage, not total self-reliance. We have to face the fact that US companies are the main providers of today’s frontier model capabilities. Our approach is to ensure that the UK can use the best models in the world while protecting UK interests. To achieve this, we have established strategic partnerships with leading frontier model developers, such as the memoranda of understanding with Anthropic, OpenAI and Cohere, to ensure resilient access and influence the development of such capabilities.
We are also investing in advanced AI compute so that researchers can work on national priorities. We are creating AI growth zones across the country to unlock gigawatts of capacity by 2030. Through our advance market commitment, we are helping promising UK firms to scale, win global business and deploy British chips alongside established providers.
We are pursuing AI safety with such determination because it is what unlocks opportunity. The UK should not be an AI taker. Businesses and consumers need confidence that AI systems are safe and reliable and do what they are supposed to do. That is why our road map to trusted third-party AI assurance is so important. Trust is what turns safety into growth. It is what allows innovation to flourish.
In January we published the AI Opportunities Action Plan, setting out how we will seize the benefits of this technology for the country. We will train 7.5 million workers in essential AI skills by 2030, equip 1 million students with AI and digital skills, and support talented undergraduates and postgraduates through scholarships at leading STEM universities. I hope this will be welcomed by the noble Lord, Lord Clement-Jones.
My Lords, I will just interrupt the Minister at this point, before the witching hour. The Minister has reiterated the approach of focusing governance on the user—that is, the sectoral approach—but is that not giving a free pass to general AI developers?
My Lords, I will respond quickly. We have to be very careful about the level at which we regulate. AI is multifaceted: the stack has an infrastructure level, a data level, a model level and so on. At which level are we going to regulate? We are trying to work with the community to find out what is best before we come up with a solution as far as regulation is concerned. AI is currently regulated at the point of use.
Let me continue. We are appointing AI champions to work with industry and government to drive adoption in high-growth sectors. Our new AI for Science Strategy will help accelerate breakthroughs that matter to people’s lives.
In summary, safe and controllable AI does not hinder progress; rather, it underpins it. It is integral to our goal of leveraging AI’s transformative power and securing the UK’s role as an AI innovator, not merely a user. Safety is not about stopping progress. It is about maintaining human control over advances. The true danger is not overthinking this now; it is failing to think enough.
The noble Lord will know that, under the Consumer Contracts (Information, Cancellation and Additional Charges) Regulations 2013, consumers already have a cooling-off period for distance contracts, so this is not new for the sector. The digital content waiver is long established, and most charitable memberships are service contracts, not digital content. We consulted on extending the waiver, mindful that doing so would reduce consumer rights. Gambling, however, is excluded because of the existing specialist regulations. We recognise the concerns raised by charities and heritage organisations about potential misuse and will continue to work closely with charities as we finalise the secondary legislation.
My Lords, Section 267 gives the Government the clear ability to use the regulation-making powers to recognise the specific circumstances of particular services, such as streaming, charitable memberships and, of course, the news media. Do the Government intend to make distinctions between those sectors? If so, will the Government make sure that streaming services, the charitable sector and, indeed, the news media are protected from early termination?
The noble Lord makes an interesting point. Let us look at the policy. We are talking about unwanted subscriptions, which cost consumers some £1.6 billion a year. This Act will save consumers some £14 a month each, which is about £147 million a year in total. As it stands, charities have to comply with consumer law irrespective of their charitable status. Companies, especially digital service organisations, must comply with the legislation currently in place, and that will stay as it is. The cooling-off period under the new Act simply extends the protection from distance contracts to contracts agreed in person.
My Lords, I am glad that the Minister and the noble Viscount mentioned the most recent report of the Select Committee. Given what the Minister has said about active debris removal and the European Space Agency, what are the Government doing to ensure that the cost of end-of-life compliance is met by commercial satellite operators and not from the public purse?
The noble Lord makes a very interesting point. The Government are currently funding innovation in debris mitigation and removal. We support research and development through UKRI and Innovate UK and are funding private companies such as Astroscale and ClearSpace to carry out in-orbit servicing trials. As far as cleaning up in-orbit debris is concerned, space is global and we have to work with our international partners in addressing this issue. Conversations are ongoing as to who will pay for it.
I thank the noble Viscount for that. At the end of the day, the fact is that AI is now central to the UK’s growth strategy. The results are very clear: UK AI companies deliver some £11.8 billion in gross value added, revenues are up 68% and over 86,000 people now work in the sector.
As for the question itself, the point here is that we need to address the skills gap. AI is already changing the way we work, and we need to support everyone in this country in adopting AI skills. We also need a plan to tackle market challenges and ensure that people right across the UK are ready for the future.
My Lords, I declare an interest as a consultant at DLA Piper on AI policy and regulation. This year the Government have chosen to devolve responsibility for digital boot camps, which in previous years have helped thousands of participants develop new digital skills. There is a new technical funding guide, but what guarantee of funding for future years do providers and local authorities have, and what consistency of procurement is there? For instance, what core requirement is there for essential AI training content to be included? At the very minimum, it should cover AI literacy, understanding and critical-thinking skills.
The noble Lord made several points there; I will address the point about AI gaps in the workforce. The Government are actively assessing AI skills gaps and taking action to close them. My department regularly reviews the AI labour market and has commissioned new research, due to be released later this year. We are working with the Department for Education and Skills England to map pathways into AI roles. We recently announced a joint commitment with industry to upskill some 7.5 million workers.
My Lords, in respect of the noble Viscount’s point about cost, this happened just yesterday so, of course, we are still working it through; it will take us some time to evaluate how much it will cost the economy. I am sure that economists will be kept very busy for some time working out the costs and the impact on productivity.
We are already taking steps to strengthen the resilience of the UK’s digital infrastructure. Through the national cyber strategy and the national resilience framework, we are working with the National Cyber Security Centre to treat major cloud service providers as part of our critical national infrastructure. This includes measures to ensure that they have robust redundancy, back-up and incident response capabilities in place. At the same time, we are consulting industry on enhanced incident reporting and transparency requirements so that the Government can be alerted immediately to any service disruption that could have national impact.
My Lords, at the very least, this should be a wake-up call for the Government. It is clear that the Government have been overdependent on two US cloud service providers, which, as the Competition and Markets Authority says, hold 70% to 90% of the market and whose restrictive practices impede competition. Of course, there is now a sovereign AI unit within DSIT. Will government procurement policy now change to encourage UK cloud service providers, which would then help to deliver sovereign AI? Will the Government also encourage the CMA to act rapidly, given this lack of competition?
I thank the noble Lord for those points. The Government are aware of the risks and are taking cybersecurity seriously. That is why we have published a number of strategies and are working with the National Cyber Security Centre, as I mentioned earlier. The noble Lord also mentioned procurement and the service providers. The three main providers—Amazon Web Services, Microsoft Azure and Google Cloud—probably have something like 60% of the market share. Yes, we have other smaller, independent providers as well but, at the same time, procurement is a matter for individual government departments: how they want to procure their services and from where. The basic point is that, going forward, we have to ensure that our digital infrastructure is safe and resilient.
To ask His Majesty’s Government what assessment they have made of the increased use of virtual private networks since the implementation of age verification requirements for access to primary priority content under the Online Safety Act 2023.
My Lords, the Government and Ofcom are monitoring the potential impact of circumvention techniques on the effectiveness of the Online Safety Act, especially since the child safety duties came into effect in July 2025. Services promoting VPN use to bypass age checks could face enforcement action. These duties represent a major milestone in protecting children online, making it harder for children to access harmful content. We must allow sufficient time for these measures to embed before considering further action.
My Lords, there are concerns and some misinformation circulating about VPNs and other aspects of the Act. In this light, is the Minister confident that the Act is still fit for purpose, and that platforms have a clear existing responsibility to prevent children bypassing safety protections? Does all this not mean that Parliament needs an early chance at post-legislative scrutiny of the implementation and operation of the Act to ensure, in particular, that it fulfils its aims of keeping users, particularly children, safe online while preserving free speech for adults?
My Lords, the Online Safety Act places very clear duties on platforms to protect children, including tackling methods of circumvention. The use of VPNs to bypass safeguards is a known risk, and platforms must act decisively. They are already required to assess such risks and implement proportionate measures. Ofcom will hold platforms to account. The Act requires Ofcom to produce and publish a report assessing how effective the use of age assurance has been and whether any factors prevented or hindered its effective use. That report will be published by June 2026.
I thank the right reverend Prelate for that question. In 2024, the National Cyber Security Centre managed hundreds of incidents, 89 of which were nationally significant attacks. In 2025, the cybersecurity breaches survey shows that just under half of businesses, about 43%, and around one-third of charities, about 30%, reported having experienced a cybersecurity breach or attack in the past 12 months. Cyberattacks do not happen just to big companies; they hit companies of every size and type, and we have to be vigilant about that. The Government see the UK cybersecurity sector as a driving force in widening opportunities for our citizens, and we have to ensure that it is protected. The Government have a plan and are working across departments to put a Bill together; we hope that parliamentary time will allow us to bring it forward.
My Lords, I express my appreciation of the work of the noble Baroness, Lady Jones, which the Minister mentioned, and I wish her well in her non-ministerial capacity. Given reports that the attack has been claimed by hacker groups linked to Scattered Spider, which I believe is also responsible for recent attacks on UK retailers, including Marks & Spencer, what enhanced intelligence-sharing mechanisms are the Government establishing between business sectors to prevent co-ordinated attacks by the same threat actors?
My Lords, I am sure that the noble Lord will appreciate that there is only so much I can say about what the Government are doing, but I assure him that the Government are speaking to businesses of all types through various business organisations. The National Cyber Security Centre is working with businesses. It has previously worked with M&S and the Co-op and is now working with JLR to provide support in relation to whatever incidents have happened, including the current incident. As I said, we cannot comment further on specifics at this stage, including with regard to potential perpetrators. The National Crime Agency has warned of a rise in teenage boys being drawn into online criminal communities and is co-ordinating responses to online harm networks across the United Kingdom.
My Lords, in addition to the cuts mentioned by the noble Lord, Lord Waldegrave, the Government have withdrawn funding from the planned national academy for the mathematical sciences, but polling among employers for the Maths Horizons project found that maths skills are increasingly in demand. Do we not badly need a national strategy for maths, as the Campaign for Mathematical Sciences is calling for?
I thank the noble Lord for the question. There is nothing to cancel. The national academy was devised by the previous Government, who allocated £6 million towards it without that funding ever being properly in place. The money was not there in the first place, and £6 million was a meagre figure, whereas we are spending more and more on other learned societies. It is as if I wanted to buy a £5 million penthouse around the corner, went to the estate agent and said, “I would like to buy it but I don’t have the money allocated for it”. There is nothing to cancel.
My Lords, I thank the noble Lords, Lord Clement-Jones and Lord Sharpe, for their contributions.
I will first address the question asked by the noble Lord, Lord Clement-Jones: why the delay? As the noble Lord, Lord Sharpe, mentioned, it was partly a result of the general election. At the same time, we were waiting for the Department for Transport to progress UN regulation No. 155 until we knew that this exception had to be taken out of the current regulations. That is the reason for the delay, basically; it was also about finding parliamentary time to table these regulations.
I am sorry to interrupt the Minister but, frankly, this is the same instrument as the one that was debated last May. Nothing has changed apart from the lack of parliamentary time. We could have done this in September, October or whenever. I forget quite when we had the King’s Speech—in July? We could have done this at any time in the past few months.
This is beyond my pay grade, I am afraid. I will need to ask my leader, the Chief Whip, why we could not allocate any parliamentary time for this legislation.
As far as personal data is concerned, the GDPR is still the lead legislation. I respectfully say to the noble Lord that, for the purposes of today’s regulations, the whole issue of such data is outside the scope of this instrument for now. However, I am sure that we will be talking about personal data in the months and, probably, years to come in other forms of legislation, or even about it being regulated itself.
Out of scope? On the basis that we are being asked to exempt automated vehicles, is it not proper that we ask for reassurance about automated vehicles and the implications for safety, data and whatever else? We are exempting them from these connected product regulations, so we need to be reassured that there are ways of regulating them other than through these regulations. So this is not out of scope; the debate is about whether we should be exempting them.
I take the point, but the instrument concerns the two amendments to the regulations. On the noble Lord’s point about data: yes, it is important, and we must protect it, but it is not within the scope of this instrument.
Moving on to cybersecurity within autonomous vehicles, cybersecurity is at the heart of the Government’s priorities for the rollout of all self-driving vehicles. The Automated Vehicles Act 2024 enables an obligation to be placed on those responsible for self-driving vehicles to maintain a vehicle’s software and ensure that appropriate cybersecurity measures are in place throughout its service life.
In response to the point made by the noble Lord, Lord Sharpe, about innovation, the Government are committed to supporting the development and deployment of self-driving vehicles in the UK. Our permissive trialling regime means that self-driving cars, buses and freight vehicles are already on UK roads with safety drivers. The Automated Vehicles Act will pave the way to scale deployments beyond trials. The Act delivers one of the most comprehensive legal frameworks of its kind anywhere in the world for self-driving vehicles, with safety at its core. It sets out clear legal responsibilities, establishes a safety framework and creates the necessary powers to regulate this new industry.
On the point about cybersecurity from the noble Lord, Lord Clement-Jones, the Government take national security extremely seriously and are actively monitoring threats to the UK. The Department for Transport works closely with the transport sector, the National Cyber Security Centre and other government departments to understand and respond to cybersecurity issues associated with connected vehicles. UN regulation No. 155 more comprehensively addresses the cybersecurity risks associated with automotive vehicles and has adequate provisions to deal with the prospect of self-driving vehicles. The PSTI regime is designed for consumer connectable devices and products and is not fully equipped to address the specific needs and complexities of vehicle cybersecurity. UN regulation No. 155, which was developed through international collaboration, provides a more suitable and rigorous framework for ensuring the security of vehicles.
More everyday products than ever are now connected to the internet. The Government have taken action to ensure that UK consumers and businesses purchasing consumer connectable products are better protected from the risks of cyberattack, fraud, or even, in the most serious cases, physical danger. The PSTI product security regulatory regime builds on the ETSI international standard and is the first of its kind in the world to come into force.
The cybersecurity regulatory landscape will continue to evolve, and the Government need to be agile to ensure that there is synergy between existing and new laws. Through this draft instrument, the Government are delivering on the 2021 commitment to except certain categories of automotive vehicles from the scope of the PSTI product security regulatory regime. This is because the Government, via the Department for Transport, are in the process of introducing sector-specific regulations, developed at an international level, to address the cybersecurity of these products. These requirements, which are specifically tailored to these vehicles and their functionality, will create a more precise regime for the sector. This draft instrument therefore ensures that the automotive industry, which contributed £13.3 billion to the economy in 2022, will not be placed under undue burdens from dual regulation.
My Lords, the Minister has not mentioned the point raised in the Explanatory Memorandum, which was designed, I think, to give us comfort about cybersecurity and data: the Government’s Connected and Automated Vehicles: Process for Assuring Safety and Security—CAVPASS—which I mentioned. I did not hear him give us an assurance that that will be developed during 2025 to ensure the safety and cybersecurity of self-driving vehicles. As well as reiterating that the GDPR is an absolutely splendid way of regulating these automated vehicles, I hope that he will reiterate that this will be produced, because I have had a look at what CAVPASS currently says in the area of data, and it is not very much. After all, these connected regulations from which we are exempting automated vehicles are about safety, data and everything else.
My Lords, the noble Lord makes a very important point. Rather than waiting for my officials to give me a briefing note, I will ensure that I write to him on all the points that he has just mentioned.
My Lords, I thank the noble Lord, Lord Clement-Jones, for bringing the important issue of public sector algorithmic transparency forward for debate, both today and through the Data (Use and Access) Bill, and I thank the noble Earl, Lord Effingham, for his contribution.
The algorithmic transparency recording standard, or ATRS, is now mandatory for government departments. It is focused, first, on the 16 largest departments, including HMRC; some 85 arm’s-length bodies; and local authorities. It has also now been endorsed by the Welsh Government. While visible progress on enforcing this mandate was slow for some time, new records are now being added to the online repository at pace. The first batch of 14 was added in December and a second batch of 10 was added just last week. I am assured that many more will follow shortly.
The blueprint for modern digital government, as mentioned by the noble Lord, Lord Clement-Jones, was published on 21 January and commits explicitly to transparency and accountability by building on the ATRS. The blueprint also makes it clear that part of the new Government Digital Service role will be to offer specialist assurance support, including a service to rigorously test models and products before release.
The Government share the desire of the noble Lord, Lord Clement-Jones, to see algorithmic tools used in the public sector safely and transparently, and they are taking active steps to ensure that that happens. I hope that reassures the noble Lord, and I look forward to continuing to engage with him on this important issue.
My Lords, I thank the noble Earl for taking the trouble to read my Bill quite carefully. I shall obviously dispute various aspects of it with him in due course; however, I welcome the fact that he has taken the trouble to look at its provisions. I thank the Minister for his careful reply. I do not think that the Government are going far enough, but time will tell.
My understanding is that “customer” refers to an individual, but I am sure that the Minister will give a better explanation at the meeting with officials next week.
Again before the Minister sits down—I am sure he will not be able to sit down for long—would he open that invitation to a slightly wider group?
I thank the noble Lord for that request, and I am sure my officials would be willing to do that.
My Lords, the problem is that I have a 10-minute speech and there are five minutes left before Hansard leaves us, so is it sensible to draw stumps at this point? I have not counted how many amendments I have, but I also wish to speak to the amendment by the noble and learned Lord, Lord Thomas. I would have thought it sensible to break at this point.