Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record
I beg to move,
That this House has considered technology sovereignty.
It is a pleasure to serve under your chairmanship, Ms Vaz. We are four years into the Ukraine war and 10 days into the latest Iran-Israel-US conflict. At the start of this year, the US seized the President of Venezuela. A few weeks later, President Trump was demanding Greenland from Denmark. The world has never felt more insecure or unstable.
For the first time since I was elected as an MP, global insecurity is an issue on the doorstep in Newcastle. As if that were not enough, we are also undergoing two technology revolutions: one in data and the other in AI automation. Add to that the geopolitical restructuring across different dimensions—Europe and the US, the global south and Russia/China, Europe and Russia, and Iran and the Gulf states—and a green industrial revolution that is driving competition for knowledge, resources, land and people. Is it any wonder that people are feeling insecure?
In the face of those challenges, we must be honest with our constituents about what we can and cannot control, and about the implications for our industrial, civil and defence policy. Technology sovereignty is a key part of that and a placeholder for larger fears. Too often, people feel that big tech is controlling, not empowering, their lives. Techno-feudalism and techno-serfdom may not be commonly discussed in the pubs and playgrounds of Newcastle, but they are a fear that many have.
The previous Secretary of State for Science, Innovation and Technology, my right hon. Friend the Member for Hove and Portslade (Peter Kyle), said that big tech needs to be treated as a state, not as companies. If so, who are their citizens? Us? We certainly did not elect them, so are we just their serfs? What should the relationship be between those companies and states?
Technology sovereignty matters, but what is it? The current Secretary of State for Science, Innovation and Technology told the Science, Innovation and Technology Committee:
“Sovereign capability is about ensuring the UK has what it needs to become a global leader in AI.”
The Digital Minister, my right hon. Friend the Member for Edinburgh South (Ian Murray), told the Committee:
“Sovereignty is a huge issue that we always discuss. Security, safety and resilience are all parts of that, and the digital spending controls that DSIT puts in on behalf of Government, which examines individual contracts on that basis, very much examines these issues as well.”
He also said:
“It is about building those capabilities and supply chains here.”
Will Stone (Swindon North) (Lab)
My hon. Friend has advanced a very powerful vision of the global events affecting the country right now. When I talk to defence tech companies, I see that they reach the point of scaling up, but they are unable to access finance. Does my hon. Friend agree that this Government should support defence tech companies to scale up, so that we can have true sovereign capability, as opposed to letting them fly off to America?
I very much agree with my hon. Friend. He is absolutely right, and that support should take the form of access to investment, but also procurement and procurement decisions, which I will discuss in more detail.
The Digital Minister also told the Committee:
“There is no single internationally recognised definition of digital sovereignty”
and:
“DSIT is working to develop a comprehensive definition that can be used across the UK”.
We have not received an update, but yesterday, the Government launched the AI sovereignty unit with £500 million, so it is to be hoped that we know what we are spending our money on.
The hon. Lady is terribly kind and it is always a pleasure to come to a debate that she has secured. Recent studies indicate that AI-powered tools have already been used in phishing, ransomware, and social engineering attacks, making breaches faster, more targeted and harder to detect. The National Cyber Security Centre has repeatedly warned that the sophistication and scale of cyber-threats are increasing, and that AI could amplify those risks exponentially. Does the hon. Lady therefore agree that we have a critical gap in investment, expertise and the co-ordinated strategy in the United Kingdom of Great Britain and Northern Ireland to defend against AI-enabled attacks? The Government must focus on being able to combat those in future—does she agree?
I certainly agree that we need to be able to defend ourselves against AI attacks.
Martin Wrigley (Newton Abbot) (LD)
I thank the hon. Lady and Chair of my Select Committee for giving way. Does she agree that a definition of sovereign tech is something that a foreign power could not switch off, so that the systems on which we rely could not be pulled out from under our feet, much as the Microsoft ones were for the International Criminal Court?
My fellow member of the Science, Innovation and Technology Committee makes a very important point about the definition of sovereignty. I do not want to get too bogged down in the actual definition, but I agree that control matters, and I will say a little more about that.
I will raise the definition of digital sovereignty cited in the House of Commons Library briefing that accompanies this debate, which is
“the agency and capacity of any organisation to make intelligent, informed choices to shape its digital future by design.”
On that basis, choosing between Amazon Web Services and Microsoft for our data centre is technology sovereignty. I also think that if British sovereignty depends on our leaders’ ability to make intelligent choices, they spent a lot of our history not having sovereignty.
The Library definition came from a global consultancy called Public Digital. Emily Middleton, the interim director for digital transformation in DSIT, was previously a partner at Public Digital. It rules out digital independence and says that our goal should be intelligent dependence. Can the Minister say whether he is aiming for intelligent dependence?
The definition I like best, however, is that sovereignty is whatever a sovereign power says it is—that is what sovereignty means. The UK has extraordinary technological human capital resources, particularly in AI, where we are probably third in the world, but also in clean energy, quantum, synthetic biology and much more. Our human capital means that we are not just any mid-sized country; we can aim higher than intelligent dependence. Elon Musk chose to turn off Ukraine’s Starlink capacity at a critical time in Ukraine’s defence of its sovereignty against Putin’s illegal aggression. None of us wants the UK to be in such a position of dependence.
Dr Al Pinkerton (Surrey Heath) (LD)
The hon. Lady mentions Britain’s extraordinary human capital. In my role as my party’s Europe spokesperson, of late I have been speaking to very large international defence firms, which thrive in the UK intellectual environment. They have great links with universities, but they say to me that they are increasingly looking to move some of the start-ups that have been created in the UK into Europe, so that they can assemble rapidly the kinds of teams that they need to take those initial ideas and scale them up. Does she agree that having a closer working relationship with our European partners and colleagues, allowing that freedom of movement to return, could be an enormous benefit—counterintuitively perhaps—to our sovereign capacity?
The level of interest shows just what an important issue this is. I will come on to discuss some aspects of collaboration as it relates to sovereignty, but I observe that the last time our sovereignty as a mid-sized power was seriously debated was during Brexit, and the slogan “Take back control” reflected the sense that too much sovereignty had been ceded to the European Union without an honest debate with the British people. As a member of the Labour party, I know that we are stronger together and that that can require some loss of autonomy to deliver results, which actually make people more secure, but that must not be done without an honest debate.
Let us look at the four specific sovereignty challenges, the first of which is critical infrastructure and cloud data dependency. The Competition and Markets Authority found that cloud services in Britain are dominated by AWS at 40% to 50%, and Microsoft at 30%. Crown Hosting is meant to be our sovereign hosting capability, but it hosts only 4% of Government legacy services. Both Amazon Web Services and Oracle claim to offer a sovereign cloud—they do say to put the difficult part in the title!
The second issue I want to look at is the hot topic of AI. There is no British large language model, but there is the ambition to transform our public services and industry through AI. The AI opportunities action plan repeatedly references sovereign AI and sovereign compute without defining them. The major AI companies Google, Anthropic, OpenAI, Microsoft and DeepSeek are all headquartered abroad. DeepMind, which forms Google’s AI capability, was founded right here in the UK before being bought. What capability does the UK now have in AI? What minimum capability does the Minister think we need? How do we respond to the EU Cloud and AI Development Act, which may exclude UK companies?
My hon. Friend is making an important point. When it comes to AI, an enormous amount of investment is needed. There are many discussions at the moment about the impact of that huge investment in AI. It is very difficult for a smaller country such as the UK to compete in that regard. Does she agree that we need to work with like-minded countries on these issues, including those in the EU? Does she agree that we need to make sure that this is one of the key topics when President Macron visits the UK later this year?
I agree with my right hon. Friend that we certainly need to work with like-minded countries.
The third area is cyber-security and data governance. Some argue that we are already at war in the cyber-sphere. Last year’s strategic defence review emphasised cyber and electromagnetic domains, and established a new UK cyber and electromagnetic command to enhance that, with £1 billion in new funding for homeland air missile defence and cyber-security initiatives. Should these be British suppliers? Should they be European? Should they be exclusively NATO suppliers?
On data governance, the foreign direct product rule allows the United States to restrict access to advanced computing chips and AI-related software. By adding UK companies to the entity list, the US can immediately cut them off from cloud services, software and AI tools, while the CLOUD Act and the Patriot Act expand data access powers to compel US companies to hand over data even if it is held overseas—that is, in the UK. Has the Minister discussed those powers with Microsoft, AWS and Palantir?
Fourthly and finally, we have the UK’s reliance on global supply chains. Critical minerals are an obvious example, but because I am a bit of a geek I want to mention the common information models that enable the things in the internet of things to talk to each other. By 2030, there will be 6 billion CIM connections globally. China controls 70% of the market, creating a huge possibility for the disruption of everything from traffic systems to energy grid operations.
That is a really quick canter through just a few of the technology sovereignty issues. I want to look at two specific examples in more detail. First, the NHS has the largest and most comprehensive longitudinal and structured patient level datasets in the world. I support the push for digital integration as we transition the NHS from analogue to digital, with interoperability and standardisation bringing faster access and better analytics, yet a growing share of NHS data flows through US companies.
The federated data platform contract places core NHS data operations on Palantir’s proprietary systems. Why? There have been numerous reports of irregularities in the way the contract was awarded. In addition—this, for me, is a key point of sovereignty—Palantir’s founder and controlling stakeholder, Peter Thiel, has a political worldview which is at odds with British values. The same is true of Elon Musk. It does our constituents’ sense of agency no good to see their Government so dependent on these companies. Nearly half of adults say that they would opt out of NHS data sharing if the platform was operated by a private foreign provider.
The second example is also to do with Palantir. Its recent defence contract also raised many questions. The strategic defence review emphasised AI as a core enabler of military capability. Reports suggest that Palantir serves primarily as a vehicle for integrating Anthropic’s AI models. The US has just declared Anthropic a supply chain risk for US companies, so will Palantir break UK workflows that are using Anthropic? I am certain that President Trump would not allow British companies to control US defence datasets, so why are we allowing American ones to control ours?
I could go on about civil nuclear, telecoms infrastructure, subsea cables, quantum, space and drones, but I will stop there, and finish by looking at possible solutions. Technology sovereignty was a big theme at the Munich Security Conference, and the US-Europe trust gap was a yawning chasm following the shock realisation that we could not always count on the US as an ally. Technology sovereignty solutions that focus on technological leadership, such as in the Secretary of State’s definition, reflect the basic idea that if the UK leads on, say, protein folding, then Google may be less inclined to switch off its AI services if we side with Denmark when the US tries to seize Greenland.
Whether I agree with that approach or not, it certainly resonates with the evidence that the Committee heard from witnesses in so many domains regarding how important it is for the science and business community to understand where the Government are seeking to lead, so that resources can be focused and skills built there. Can the Minister say whether the Government plan to decide which aspects of AI, quantum, space or bioengineering we will seek to lead in? AI is often thought of as having three layers: infrastructure, data and applications. Can the Minister tell us where in the AI stack we are aiming for control, leadership, sovereignty or whatever we want to call it? Also, does he agree that weak competition in the AI and digital sectors, caused by giant incumbents, reduces our ability to lead?
Open source is often cited as at least part of the solution to sovereignty. I am a huge advocate for open source, open interfaces, transparent code and standard protocols, which can reduce or minimise dependence. Despite the policy ambitions, three quarters of NHS trusts’ development teams do not use open source approaches. None of the AI models currently being deployed within the public sector is an open ecosystem; all are proprietary in nature. The Minister’s Department has sign-off on all significant IT procurement. Is open source a requirement of it?
Finally, can science diplomacy help us to negotiate technology sovereignty? A number of Members have raised the issue of collaboration. Can we build on our human capital strengths by collaborating and working with partners who have respect for our values, take collaborative approaches, and can share with us the financial capital needed to make our sovereign objectives a reality? Are we happy to share leadership, and perhaps sovereignty, with our allies?
Gordon McKee (Glasgow South) (Lab)
My hon. Friend is making an important speech on an important topic. She is right to talk about how the US and China dominate on technological sovereignty, and part of the reason it is very difficult for the UK to compete with them is, of course, the scale of those countries. Does she agree that the way we can compete is by co-operating with reform in Europe, and that we should view our strategy not in terms of how the UK can outcompete Europe but in terms of how Europe, with the UK at its heart, can outcompete the US and China?
It is an important question. I am not in a position to choose our allies, but I agree in principle that we should be working with the European Union. I do not think it should be a choice between the European Union and the US, though they may make that the choice. I certainly think that we should be working with our European allies in order to form a large market for secure and ethical technology, which is in the interests of everyone.
Finally, we need to monitor the future sovereignty implications of current research, so that that can influence our investment and mergers and acquisitions policy, and so that key technologies and companies are not easily allowed to go abroad.
This debate has attracted a large amount of interest, so I have tried to be as brief as possible. I have asked the Minister many questions; if he cannot answer them all, he can write to me. In summary, we need to understand what we can own, control or lead on ourselves, what we can access that is in the hands of allies we trust, and how we can manage the things we must get from those we do not trust. We must always remember that how we develop and deploy our human capital will be critical to our ability to achieve any kind of technological sovereignty. I urge the Minister to be honest about where we are. We do not want to sleepwalk into technological serfdom and/or some kind of techxit—a technology Brexit.
Several hon. Members rose—
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Kanishka Narayan)
It is such a pleasure to serve under you in the Chair, Ms Vaz. I thank my hon. Friend the Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah), the Chair of the Science, Innovation and Technology Committee, for securing this debate and bringing to it her deep expertise across engineering, policymaking and leadership in the House on the question of tech sovereignty. I also thank all hon. Members for making very thoughtful points and bringing to the debate a range of experiences—as well as swiftness of speech, given the constraints imposed by time today.
I have long felt that the central question in our politics and for our country is the future of technology in this country. It will be the major driver of prosperity and dignity for people, and the central question is whether Britain gets to shape it or is shaped by it. In Westminster, we sometimes talk about technology sovereignty as an abstract geopolitical goal, but we have to keep in mind that, ultimately, it is the basis for our NHS radiologists to have access to the best tools for detecting cancer, with data here in the UK; for British founders and builders to be able to train and deploy models, rather than depending on foreign APIs and pricing; and for people in their homes and workplaces across the country to know that their everyday AI systems are governed transparently and democratically here in the UK.
My view is that technology sovereignty is a state’s ability to have strategic leverage when it comes to a technology, such that it can ensure ongoing access to critical inputs and ongoing assurance that its wider economic and national security objectives can be met more broadly. It is to take the best tools the world has to offer today, but also to shape the rest, and ultimately to make that which is critical here in Britain.
As I think of it, that strategic leverage is obtained by three steps on a ladder. The first is just to have enough of the critical inputs. Taking AI as an example, we have to have enough chips today to be able to do anything with AI in the first instance. With that in mind, the Government have always been very keen to secure the level of capital investment that means that Britain is at least at the table with critical inputs.
Once we are at the table, the second part of sovereignty is to make sure that we have some diversification in who we procure critical inputs from so that we can bargain effectively. We are the party of labour; we understand that who has power matters as much as what the powers are. In that context, one of the first things I did in my role was to engage with a series of companies in every part of the stack so that we were able to build more diversity into the landscape.
The third rung of the ladder is, ultimately, to build British in order to make sure that we have the full-fat version of sovereign capability here in critical parts of the stack.
I thank the Minister for setting out his sovereignty stack. Just as an example, is an LLM a critical input or another level in the stack—and does it need to be British?
Kanishka Narayan
I valued my hon. Friend’s earlier point that sovereignty has to be seen in the round. We cannot make everything here; we have to look at the entire bundle that we have to offer. In the context of LLMs, there is some uncertainty as to whether all the capability will ultimately accrue in closed proprietary models, or whether open-source, open-weight models might be part of it. To me, as things stand today, it is a pretty important part of the stack. The question then is whether we have enough of it to be able to make the most of it by adopting it for economic and national security usage here, or whether there are aspects in which, at least from a distillation or small-model point of view, we need to develop some capabilities here as well. I do not think there is a binary answer to the overarching question; the answer is much more nuanced. I am happy to discuss that further if it is of interest.
As I said, the third rung of the ladder is, ultimately, to build British and focus on areas in which we can develop our strengths. I have to point out that we made sure that Nscale, one of our neocloud hyperscale providers, was an important part of the supply chain for AI growth zones. I noticed that yesterday Nscale raised the largest ever series-C funding in Europe, in part as a result of the Government’s support and convening in that context. Arm, the leading chip design company globally, is still headquartered in Cambridge, and we have fantastic companies in the AI inference chip part of the stack, Fractile and Olix being two of them. It is an area that I spend a lot of my time on.
When it comes to models, we have huge strengths, not just because a number of the Gemini teams and researchers continue to sit in King’s Cross at DeepMind, but because companies developing foundation models in AI for science and autonomous vehicles, embodied AI, and aspects of world models and computer vision reside here in the UK. Wayve raised £1.5 billion just this year, the largest funding round in Europe to date for that stage. It is a fantastic company that looks in particular at embodied AI and vision. I am proud of those companies. It is right that the Government are supporting them through the lens of tech sovereignty, as that is what both Britain’s and the companies’ best interests dictate.
The sovereign AI unit will be crucial to that. I am glad to see the level of interest in that across the House. It will concentrate efforts on priority areas. There was interest in my specifying those areas. The four areas that are of interest at the outset are novel compute, in particular focusing on the inference chip part of the stack; novel model architecture; AI for science—I point hon. Members to the AI for science strategy published by the Department three or four months ago, which set out particular areas of focus and priority—and embodied AI.
To give a concrete example of early action that the sovereign AI unit has taken, we have already invested £8 million in the OpenBind consortium to accelerate AI-driven drug discovery, and £5 million in the Encode: AI for Science fellowship to support the next generation of world-class talent. The focus of the unit will be on both capital and compute, to incrementally anchor more and more British companies here, but I know that the unit will only be part of the solution. We have a role to look at innovation and market support much more broadly across the tech landscape.
In November, we also announced a significant advance market commitment—a deeply innovative procurement shift—which meant that up to £100 million in Government funding was available to buy products from promising UK chip companies once they reach a high-performance benchmark. That presents UK start-ups with an exciting opportunity to grow and compete right here, building for the world.
AI is of course just one area of Britain’s flourishing tech ecosystem. I point out to my hon. Friends the Members for Milton Keynes Central (Emily Darlington) and for Lichfield (Dave Robertson), who made important points about quantum, that the Government have doubled the rate of investment in quantum, with about £1 billion committed over the next four years. The points on helium made by my hon. Friend the Member for Lichfield have very much been taken into account. The Government are looking at the developing situation on helium supply in the Middle East, which is of concern.
Through our national programme, we broadly want to anchor development and access to technological capabilities that are most important to economic growth and national security. That means, in the context of quantum, more companies starting, growing and staying here and, in the context of AI, not just developing capabilities in particular parts of the stack, but in part looking upstream for skills as well.
In that context, I agree totally with my hon. Friends the Members for Cambridge (Daniel Zeichner) and for Southend East and Rochford (Mr Alaba) that the quality and scale of our talent and skills in our universities and schools is the single biggest determinant of where we end up. I am happy to write to my hon. Friend the Member for Cambridge about the UKRI changes that we are making. In answer to my hon. Friend the Member for Southend East and Rochford, IP capitalisation is a deeply important part of what I focus on with the Intellectual Property Office, and I am happy to engage him on the question of Essex University in particular.
Commons Chamber
I thank the hon. Member for his point of order. The motion on the Order Paper is perfectly orderly, so Members will be invited to vote on that, not on the substance of any Bill that might come on 9 March. I think it is important that the House is clear on that.
Further to that point of order, Madam Deputy Speaker. How can I assess what is orderly for my contribution to the debate given that the substance of the motion is about process? To be frank, I do not want to speak about process; I want to speak about protections for children.
The motion is to give consideration to a Bill on the specific matter which has been outlined clearly on the Order Paper: “Protections for children from online harms”. I reassure the hon. Lady that any contribution she chooses to make on that matter would be in order.
I am grateful to the Liberal Democrats for bringing forward this debate on protecting children from online harms, although I remain uncertain as to the measures they are proposing. This debate is happening up and down the country, in homes and at school gates—indeed, wherever people gather—so it is right that we debate it here. If the Conservatives had done something during their critical 14 years of power, our children would be better protected now, but they did not, so it falls to us to take action.
I am going to speak about three things: online platforms, their history and approach; the work of my Select Committee, the Science, Innovation and Technology Committee, on algorithms; and the work of the Committee on digital childhood, all within the context of protecting children from online harms.
The key online players range in age from pre-teen—TikTok was founded in 2016—to their late 20s, as Google was founded in 1998. In human terms, these platforms are just entering or leaving adolescence, and it shows.
As hon. Members across the House may have heard me mention, I am an engineer—chartered, as it happens; thanks for asking—and my last job before entering this place was head of telecoms technology for Ofcom. I remember meeting people from a US platform, which shall remain nameless, around 2005. The company executive commented that they had come to the UK from Silicon Valley on a six-month contract to sort out Government affairs, and they could not understand why, two years later, discussions were still ongoing. Did we not realise that Government had no role in what they did?
I say that to illustrate that tech platforms have their origins in a libertarian, small/no-government tech bro bubble that has spread globally. TikTok, as a Chinese company, has a different background, but public accountability is not necessarily part of it. Unfortunately for all of us, the Conservative-Lib Dem Government of 2010 and their successors shared the view that Government should not be a part of it, which is how we arrived in 2024—20 years later—without online harms regulation, while at the same time the use of social media and life online has exploded. That is why I consider the Tory position in this debate to be a superb example of hypocrisy.
Monica Harding (Esher and Walton) (LD)
The hon. Lady is making a powerful speech about the evolution of social media platforms. I have four children; the first was born in 2004 and the last was born in 2011, so their births have spanned that evolution. Facebook began in 2004; TikTok began in 2016. If that evolution was the industrial revolution, we would be around the spinning jenny stage, with AI chatbots the next destination. Those chatbots are terribly dangerous for our children, and we need to regulate them now. That should be within the Online Safety Act.
I agree that AI chatbots are a further evolution, and I think we should learn from the lack of effective regulation under the Conservatives during that critical period in the evolution of the internet in how we approach AI. I agree with the hon. Lady that AI chatbots should be brought into the regulatory environment of the Online Safety Act.
My hon. Friend the Chair of the Select Committee is making an excellent speech. Her background in this area is really showing in the detail with which she is exploring these issues. Part of the challenge here is that we as parents are struggling to catch up with this revolution, which is gaining speed all the time. Perhaps my hon. Friend would highlight some of the challenges that parents face. For me, part of the importance of the consultation is to allow parents to think more deeply about this difficult issue; there are often different opinions from campaigners who have had the most painful experiences.
My hon. Friend makes an excellent point. It is for that exact reason that I support a consultation: this is part of a debate, and we all need to improve our understanding of the impacts of this technology. Parents are in a difficult position. I do not believe parents should have to be technology experts in order to give their children the best start in life, but unfortunately there is so much pressure in the online world that that seems to be the case right now, and that is why it is right that Government take action and consult on the action they take.
Let us think about the evolution of these technologies. I remember that when I joined Facebook in 2005, I had to use my university email address—that meant I had to be over 18. Some 20 years later, 13-year-olds and younger are having their lives and brains formed by almost uninhibited access to social media. In the UK, the number of social media users has gone from practically zero to four fifths of the population. I have worked with the Molly Rose Foundation, a charity established by the Russell family after their daughter Molly took her own life at the age of 14 following exposure to self-harm content online; I have spoken to the bereaved parents of children bullied to death online; and I have spoken to the Internet Watch Foundation about the horrendous images its staff see of child exploitation. The fact that the Conservatives did nothing in all those years in government is, in my view, a form of political negligence of the highest order.
As part of my Committee’s inquiry into social media and algorithms, Google, Meta, TikTok and X told us that they accepted their responsibility to be accountable to the British people through Parliament, which I thought was quite a step forward from previous utterances, and ongoing utterances, by some tech billionaires who shall remain nameless. Our inquiry found that our online safety regime should be based on principles that remain sound in the face of technological development. Social media has many important and positive contributions, including helping to democratise access to a public voice and to connect people far and wide, but it also has significant risks—and those risks can evolve with the technology. We spoke about AI as an evolution, and one of the main failings of the Online Safety Act is that it regulates particular services rather than establishing principles that remain true and can be part of a social consensus as technology evolves.
Bobby Dean
The hon. Lady is making an excellent speech. Should one of those principles be related not only to content but to the addictive nature of these platforms? One of the changes I have witnessed on social media over time is algorithmic addiction. The greatest minds in the world are now working out the circuitry of our brains and driving content towards us so that we look at our screens for longer so that they can sell more ads. Does she agree with that point?
I really thank the hon. Member for that intervention, because that is exactly one of the recommendations of the Committee’s inquiry. As he says, the advertisement-based business models of most social media companies mean that they promote addictive content regardless of authenticity. This spills out across the entire internet via the unclear, under-regulated digital advertising market, incentivising the creation of content that will perform well on social media, as we saw during the 2024 unrest following the horrendous Southport attacks.
This is not just a social media problem, though. It is a systemic issue that promotes harmful content and undermines public trust. The Committee identified five key principles that we believe are crucial for building public trust. The first is public safety. Public safety matters; I hope it is not necessary to debate that. The second is free and safe expression, which is also very important. The third is responsibility on the part of the platforms. Right now, they have no legal responsibility for the content they amplify; they just have to follow their own processes in certain specific cases. Our fourth principle involves control, and the fifth and final principle is transparency. We made detailed recommendations on regulating the advertising-based business model so that amplification would not be incentivised in the way that was outlined by the hon. Member for Carshalton and Wallington (Bobby Dean). We also recommended a right to reset—the right of a person to remove their data from any algorithm.
Our report came out not long before the Minister took up his position. The Government accepted all our conclusions but none of our recommendations. I urge them to look again at our recommendations and to consider implementing them, or at least to respond and tell me why they are still not to be implemented. I welcome the Government’s recent actions and interventions and their readiness to intervene. As I said, the consultation is critical. I welcome the desire to promote a consensus and to take measures to ensure swift delivery of the consultation conclusions through the Children’s Wellbeing and Schools Bill. The consideration of the inclusion of AI chatbots is important, as is addressing the risky features in certain models, as well as providing support for bereaved parents. The Committee looks forward to working with the Government to try to achieve their aims. We need evidence to drive policy and regulation based on principles that the public can have confidence in.
Natasha Irons
I wanted to intervene on the point about principles, content and responsibility. I worked for Channel 4 before I came to this place, and we were regulated by Ofcom. Channel 4 did not create its own content, but was responsible for the editorialisation of that content. It was beholden to certain standards. Does she agree that we should be holding these media companies—they are not now “new media” companies, but legacy media companies—just as responsible for the content they put out on their platforms as any broadcaster?
My hon. Friend makes an important point; the insight she brings from her career in the media is critical. For many years, while the platforms were just that—platforms on which other people placed content—there was an argument that they should not be regulated and that they did not have a responsibility for the content on them, but they are at the very least active curators of that content now. Algorithms effectively form digital twins of individuals and then drive individualised content at them. That requires a responsibility. The time is right, as our Committee recommended, to ensure that platforms have responsibility for their content.
The Science, Innovation and Technology Committee will be holding a one-off session on social media age restrictions on 11 March to feed into the Government’s consultation on measures to keep children safe online and to hear from social media companies on their progress in the last year. We will also gauge the strength of the evidence for and against an age-based ban on social media, as well as any evidence relating to proposed alternatives to a ban. In doing so, we will hear from experts and representatives of those with direct experience of harms. We want to hear from both sides of the debate in the UK and will be seeking evidence from Australia on the first few months of the ban that is already in force there. We will be hearing from major social media and technology companies in a follow-up to our algorithms and misinformation inquiry, and we will ask for their views on the proposed age limits.
Finally, the work on social media age restrictions will feed into a larger inquiry on the neuroscience of digital childhood, which we will launch in the coming weeks. We want to find out how young people spending their formative years online affects their brains and what the Government should do to protect them from any negative impact. That could cover the impact of social media and other screentime on brain development, behaviour, and physical and mental health, whether positive or negative. It could also cover the physiological impact on eye development, the impact on socialisation and what actions Governments should take. There is a consensus on the need to do something, but not on what needs to be done. That is why we are seeking to provide evidence.
I always say to the platform companies that the alternative to regulation is not no regulation, but bad regulation. More regulation is coming. Several US states, such as California, have brought in new regulation on big tech. The Spanish Prime Minister has called social media a
“failed state where laws are ignored and crimes are tolerated”.
There is also the increasingly significant issue of technology sovereignty and whether we are too dependent on foreign companies for our online environment. I call myself a tech evangelist, and I am, but I also know how much an engineer costs. The starting salary of an AI engineer—if companies can find one—is well over £100,000 a year. Tech companies are not going to put them to work on keeping our children safe unless the House puts the right incentives in place. With all due respect to the Minister and the Online Safety Act, which he inherited, those incentives are not in place now.
I appreciate the right hon. Gentleman’s intervention. [Interruption.] I am sorry to upset my hon. Friend the Member for Stoke-on-Trent Central (Gareth Snell). The Government are acting at pace, but we want to act in the right way. We must act in the right way because this is such a complex and serious issue. It is important for children to be able to seize the opportunities that being online can offer. We have heard about iPad-only schools. Parents must be confident that their children are safe—that is key. If we do not want to exclude children from age-appropriate services that benefit their wellbeing, we must act on the evidence and ensure that we strike the right balance between protecting children’s safety and wellbeing, and enabling them to use technology in positive and empowering ways.
Does my right hon. Friend share my disappointment that, in this debate on protecting children from some of the most obscene abuse, not one Reform Member is present?
(2 months ago)
Commons Chamber
I welcome the consultation. We know that technology has changed childhood, we believe that it has changed child socialisation and we think that it may have changed brain development, perhaps even motor neurone skills, but there is little concrete evidence beyond the individual terrible stories and, of course, the profits of the big tech platforms. That is why my Committee will soon be launching a digital childhood inquiry to examine these issues, hopefully in time to respond to the consultation.
May I, however, urge the Secretary of State not to assume that a ban will be the answer to the challenges that technology poses? We need to make tech work for all of us now. May I ask her to review her Department’s refusal to accept the recommendations of my Committee’s inquiry into social media and algorithms, particularly with regard to platform responsibility, user control, digital advertising and social media business models?
I thank my hon. Friend for that powerful and sensible question, and I welcome her Committee’s review, because those are hugely important matters. We should see this as being not only about social media, but about the use of phones and the issues affecting children in the digital world in which we now live. She will know, because I gave evidence to her Committee, that I am constantly reviewing our position on all the important points that she and the Committee raised in its last report, and that, in particular, the Minister for Digital Government and Data, who is also the Minister for Creative Industries, Media and Arts, is looking into the impact that advertising, social media and digital platforms can have. That is a firm commitment from the Government.
(2 months, 1 week ago)
Westminster Hall
Steve Witherden
I am in full agreement. As the hon. Member will see, one of my key asks is that we look at the funding for science and discovery centres (SDCs) moving forward.
As a teacher, nothing matters more to me than ensuring that people have access to opportunity. As a drama teacher, STEM was never my strongest suit, but the importance of fostering curiosity—and, most importantly, ensuring that everyone can access it—has always been central to why I became a teacher and an MP.
SDCs operate in all four nations of the UK, reaching more than 5.2 million schoolchildren, families and communities through science and technology in the last year alone. Over the past two years, these organisations have worked with more than 37% of all UK schools. Fifty-five per cent of all visitors identify as women and girls, and many centres provide visits completely free of charge, enabling over 450,000 people from communities traditionally under-represented in STEM to participate in science, research and innovation each year. They are among the few places where broad and inclusive community engagement, the development of essential STEM skills among future generations, and cutting-edge scientific research all come together under one roof.
I congratulate my hon. Friend on securing this really important debate, and on his excellent and inspirational speech. The Centre for Life in Newcastle celebrated its 25th birthday last year. I have been inspired by its openness and how it supports young people from all backgrounds and different areas of the north-east to engage with life sciences. As an engineer from an impoverished background, knowing that the Centre for Life in Newcastle is opening up the huge universe of science and scientific curiosity is so reassuring.
Steve Witherden
It is lovely to hear that my hon. Friend is just as passionate about the SDC in her patch as I am about the one in mine.
Steve Witherden
I have a confession to make: after the Centre for Alternative Technology (CAT), Xplore!, in my hon. Friend’s constituency, is my second favourite SDC—I have visited it many times. I am in full agreement with him, as the House would expect.
Solefield school has brought pupils to the CAT for 40 years. Its head of science, Kevin Farmery, said:
“I can teach them all this in the science lab, but here they see it come to life. That makes a real impact.”
Dr Dai Morgan, who is now at the University of Cambridge, first visited the CAT as a child. That experience inspired him to study sustainable engineering, and he brings postgraduate students from Cambridge to the centre annually to encourage global action.
Our constituency may lack a university, but we have something better in the CAT. With its unique history, it continues to offer outstanding degree and postgraduate courses in partnership with Liverpool John Moores University and the University of East London. Currently, 700 postgraduate students are enrolled in programmes covering renewable energy, sustainable food and land use, sustainable architecture, green building, ecology and behaviour change.
The CAT’s influence extends beyond education. Its legacy includes the growth of over 50 sustainable businesses and organisations via its postgraduate students, inspired volunteers or research experiments that take place directly on site. Such organisations include Dulas, Aber Instruments, Adaptavate and IndiNature. Dulas, established at the CAT in 1982, invented a solar fridge that preserves vaccines and saves lives worldwide. IndiNature, founded by the CAT graduate Scott Simpson, was named manufacturer of the year by the UK Green Business Awards in 2025. The CAT is not just a centre; it is a catalyst for change locally, nationally and globally.
However, like many SDCs across the UK, the CAT is facing significant challenges. Unlike museums, art galleries, theatres and libraries, which can access Government and national lottery funding for their infrastructure needs, SDCs have historically been excluded from public funding. Like other publicly accessible cultural spaces, SDCs have seen their costs rise significantly in recent years due to factors such as the cost of living crisis and energy prices. Unfortunately, these centres’ ability to grow revenues from their core audience to offset the increased costs is limited. They need to keep entry prices low and offer subsidised or free access to deliver their charitable mission and maintain access for underserved groups and communities.
As we have heard, most SDCs were built 25 years ago or more. Their buildings are reaching the end of their design life and need urgent repairs. Roofs are leaking, heating and cooling systems are outdated, and glazing no longer meets modern standards. At the same time, rising sustainability and health and safety requirements mean that repairs are far more expensive. These challenges are compounded by the fact that no central Government Department takes responsibility for the sector. Recent parliamentary questions have confirmed that the Department for Science, Innovation and Technology, the Department for Culture, Media and Sport and the Department for Education do not see SDCs as falling within their remits, leaving these centres at a loss.
The Association for Science and Discovery Centres has identified urgent infrastructure projects across its member organisations. Nearly £20 million is required to deliver these works, many of which must be completed within the next 12 to 18 months. Importantly, these projects would be match funded by the centres themselves, demonstrating both commitment and value for money. A December 2025 report made it clear that without that investment, many centres will be forced to close or to operate more commercially, scaling back STEM learning, outreach, and free or subsidised access for marginalised and minority groups. That would be a real loss, not only to communities but to the UK’s future skills pipeline.
The CAT faces similar pressures. Although it continues to welcome school groups, such as those from Solefield, it had to close its visitor centre to day visitors, and future Dai Morgans currently are not able to visit with their families. The visitor centre has seen no significant capital investment for over 25 years and is in desperate need of redevelopment. Unlike universities and many charities in Wales, the CAT receives no statutory core revenue funding.
An urgent example of the work that needs to be done is the “leaky roof” project. As anyone who has visited the area knows, it rains a lot in mid-Wales. The CAT requires £500,000 to keep open the Wales Institute for Sustainable Education building—an education centre that has grown graduate courses and the innovation lab, supporting councils, communities and other organisations to take action on the climate and nature emergencies. If it is forced to close, the CAT’s entire operating model would be undermined, threatening its unique hands-on climate and sustainability education programmes.
The project is not about patching roofs simply to keep buildings open; it is about preserving the science, engagement and learning that happens beneath those roofs. SDCs are powerful but undervalued. They are beacons of sustainability, education and innovation. With recognition and investment, they can flourish, supporting national climate goals, inspiring future scientists and engineers, and ensuring that science remains accessible to all.
Given that SDCs are uniquely positioned to help unlock the full potential of UK science and technology, in order to drive growth, create jobs and ensure that all citizens live healthy, secure and sustainable lives, thereby delivering on DSIT’s science and technology framework, does the Minister accept that, although the work of the centres touches on the agendas of DSIT, DCMS and the DFE, DSIT should become the lead Department responsible for this area? That is not to suggest that all funding should come from DSIT, or that cross-departmental responsibilities should be relinquished; rather, it is to suggest that his Department should take the lead in developing shared solutions.
I thank my hon. Friend for the passionate points that he is making. I want to support him by pointing out that the answer to a parliamentary question of mine in October stated that the Minister for Science, Lord Vallance, was following up
“with the Department for Culture, Media and Sport to explore a coordinated approach to supporting these centres.”
Just before Christmas, the Secretary of State for Science, Innovation and Technology wrote to me to say that
“officials from across departments with an interest in SDCs are meeting to discuss options for sustainable support.”
Does my hon. Friend agree that it is time we had an answer to the question of where sustainable support—which, as he said, DSIT should lead—would come from?
Steve Witherden
Yes, I do. I do not think that DCMS and DFE should be completely absent from the equation, but I agree that DSIT should lead.
What meaningful action does the Minister intend to take to address the funding and infrastructure challenges currently faced by science centres? Will he respond to the request from the Association for Science and Discovery Centres, supported by more than 3,100 leading scientists, academics, business leaders and educators in an open letter to the Prime Minister and the Department late last year, for £19.5 million of public funding, match funded by £19.5 million from the centres themselves, which is essential to address immediate infrastructure risks?
Does the Minister also agree that it is essential to formally recognise science centres as part of the UK’s scientific and cultural ecosystem, whether by expanding eligibility for existing funding streams or by creating a dedicated science engagement fund? Does he agree that it is unfair for SDCs to be excluded from public infrastructure funds that are available to comparable organisations, including museums and libraries?
I urge the Minister to meet the Association for Science and Discovery Centres and its members, and work with them and MPs representing science centres to find a solution to these issues. Will he collaborate with colleagues in DCMS, the DFE, English mayoral combined authorities and the devolved Governments in Wales, Scotland and Northern Ireland to ensure that SDCs and their work are adequately recognised and supported? Solutions must work across all four nations.
(2 months, 1 week ago)
Commons Chamber
I call the Chair of the Select Committee, Chi Onwurah.
Unlike her shadow, the Secretary of State was rightly passionate when calling out these sexually abusive images. The libertarian tech bro lobby has to accept that consent counts online, too. In her letter to me today, the Secretary of State said that the Online Safety Act was designed to deal with this, but she is being overly generous to the previous Government. The Act was designed, or fudged, to give adults some protection from illegal content on certain services, and to protect children from harmful content more generally, but not including generative AI, and without making platforms responsible for content that they share. Will my right hon. Friend now accept my Committee’s recommendations, and do more to explicitly plug the gaps in the Act, particularly regarding generative AI, as well as tackling the social media business models that incentivise the content that we are talking about?
I am genuinely grateful to my hon. Friend for all the work she and her Committee have done on this issue. I have read its work in detail since coming into post. She will know that I have already said on the issue of AI chatbots, for example, that some are covered by the Act—if they do live searches or share user-to-user content—but I have asked my officials to see where there are gaps. They have said that there are gaps, and I have said that I want to plug them, including by legislating, if that is necessary.
This is a fast-moving area. With the Online Safety Act, plus the additional measures we have taken in the Data (Use and Access) Act 2025 and that we will take in the Crime and Policing Bill, we have quite a comprehensive suite of powers here, but I know this is developing quickly, particularly around generative AI. I am always prepared to look to the facts and the evidence and go where that leads me, and if I need to take further action, I will.
(3 months, 1 week ago)
Westminster Hall
Iqbal Mohamed
I completely agree. We have to consider the functionality available in these tools and the way they are used—wherever regulations exist for that service in our society, the same regulations should be applied to automated tools providing that service. Clearly, controlling an automated system will be more difficult than training healthcare professionals and auditing their effectiveness.
I congratulate the hon. Member on securing this really important debate. It is certainly the case that UK law applies to AI, just as it applies online. The question is whether AI requires new regulation specifically to address the threats and concerns surrounding AI. We refrained from regulating the internet—and I should declare an interest, having worked for Ofcom at the time—in order to support innovation. Under consecutive Conservative Governments, there was a desire not to intervene in the market. The internet has largely been taken over by large consolidated companies and does not have the diversity of innovation and creativity or the safety that we might want to see.
Iqbal Mohamed
The enforcement processes that we have for existing regulations, where human beings provide the service, are auditable. We do not have equivalent enforcement mechanisms when that regulated service or information is provided by the internet or by AI tools. There is a need to extend not only the scope of regulation but also the way in which we enforce it for automated tools.
I am a fan of innovation, growth and progress in society. However, we cannot move forward with progress at any cost. AI poses such a significant risk that if we do not regulate at the right time, we will not have a chance to get it back under control—it might be too late. Now is the time to start looking at this seriously and supporting the AI industry so that it is a force for good in society, not a future force of destruction.
We are all facing a climate and nature emergency. AI is driving unprecedented growth in energy demand. According to the International Energy Agency, global data-centre electricity consumption is projected to grow to slightly more than Japan’s total electricity consumption today. A House of Commons Library research briefing found that UK data centres currently consume 2.5% of the country’s electricity, with the sector’s consumption expected to rise fourfold by 2030. The increased demand strains the grid, slows the transition to renewables and contributes to the emissions that drive climate change. AI policy must therefore go hand in hand with our climate change obligations.
Members have probably heard and read about AI’s impact on the job market. One of the clearest harms we are already seeing is the loss of jobs. That is not a future worry; it is happening now. Independent analysis shows that up to 8 million UK jobs are at risk from AI automation, with admin, customer service and junior professional roles being the most exposed. Another harm that we are already facing is the explosion of AI-driven scams. Generative AI-enabled scams have risen more than 450% in a single year, alongside a major surge in breached personal data and AI-generated phishing attempts. Deepfake-related fraud has increased by thousands of per cent, and one in every 20 identity-verification failures is now linked to AI manipulation.
I move on to the ugly: the threat to the world. The idea that AI developers may lose control of the AI systems they create is not science fiction; it is the stated concern of the scientists who build this technology—the godfathers of AI, as we call them. One of them, Yoshua Bengio, has said:
“If we build AIs that are smarter than us and are not aligned with us and compete with us, then we’re basically cooked”.
Geoffrey Hinton, another godfather of AI and a winner of the Nobel prize in physics, said:
“I actually think the risk is more than 50% of the existential threat”.
Stuart Russell, the author of the standard AI textbook, says that if we pursue our current approach
“then we will eventually lose control over the machines.”
In May 2023, hundreds of AI researchers and industry leaders signed a statement declaring:
“Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war”.
That is not scaremongering; these are professional experts who are warning us to make sure that this technology does not get out of control.
Iqbal Mohamed
The hon. Member touches on a broader point: any area with experts and specialist requirements for end users or for the use of that tool for an audience or demographic must directly involve those people and experts in the development, testing, verification and follow-up auditing of the effectiveness of those tools.
AI companies are racing to build increasingly capable AI with the explicit end goal of creating systems that equal or exceed the most capable human intellectual ability across all domains. AI companies are also pursuing AI that can be used to accelerate their own AI development, making it a self-developing, self-perpetuating technology. For that reason, many experts, some of whom I have quoted, say that this will lead to artificial superintelligence soon after. ASI is an AI system that significantly exceeds the upper limit of human intellectual ability across all domains. The concerns, risks and dangers of AI are current and will only get worse. We are already seeing systems behave in ways that no one designed—deceiving users, manipulating their environments and showing the beginnings of self-preserving strategies: exactly the behaviours that researchers predicted if AI developed without restraint.
There are documented examples of deception, in which an AI persuaded a human to complete a task for it by falsely claiming to be a person with a visual impairment. An example of manipulation can be found in Meta’s CICERO, an AI trained to play the game of “Diplomacy”, which achieved human-level performance by negotiating, forming alliances and then breaking them when it benefited. Researchers noted that language was used strategically to mislead and deceive other players. That was not a glitch; it was the system discovering manipulation as an effective strategy. It taught itself how to deceive others to achieve an outcome.
Even more concerning are cases where models behave in ways that resemble self-preservation. In recent tests on the DeepSeek R1 model, researchers found that it concealed its intentions, produced dangerously misleading advice and attempted to hack its reward signals when placed under pressure—behaviours it was never trained to exhibit. Those are early signs of systems acting beyond our instructions.
More advanced systems are on the horizon. Artificial general intelligence and even artificial superintelligence are no longer confined to speculative fiction. As lawmakers, we must understand their potential impacts and ensure that we establish the rules, standards and safeguards necessary to protect our economy, environment and society if things go wrong. The potential risks, including extreme risks, posed by AI cannot be dismissed; they may be existential and threaten the end of our species. The extinction risk from advanced AI arises particularly through the emergence of superintelligence: a system with the capacity to process vast amounts of data, demonstrate superior reasoning across domains and constantly improve itself, ultimately outpacing our ability to stop it in its tracks.
The dangers of AI are rising. As I have said, AI is already displacing jobs, amplifying existing social and economic inequalities and threatening civil liberties. At the extreme, unregulated progress may create national security vulnerabilities with implications for the long-term survival of the human species. Empirical research in 2024 showed that OpenAI models occasionally displayed strategic deception in controlled environments. In one case, an AI was found to bypass its own testing containment through a back door it created. Even when developed in environments that are allegedly ringfenced and disconnected from the wider world, AI is intelligent enough to find ways out.
Right now, there is a significant lack of legislative measures to counter those developments, despite top AI engineers asking us for them. We currently have a laissez-faire system in which a sandwich is subject to more regulation than an AI company, and nothing like the rigorous safety standards placed on pharmaceutical or aviation companies to protect public health. The UK cannot afford to fall behind on this.
I do not want to dwell on doom and gloom; there is hope. The European Union, California and New York are leading the way on strong AI governance. The EU AI Act establishes a comprehensive, risk-based regulatory framework. California is advancing detailed standards on system evaluations and algorithmic accountability, and New York has pioneered transparency and bias-audit rules for automated decision making. Those approaches show that democratic nations can take bold, responsible action to protect their citizens while fostering innovation.
We in the UK are fortunate to have a world-leading ecosystem of AI safety researchers. The UK AI Security Institute conducts essential work testing frontier models for dangerous capabilities, but it currently relies on companies’ good will for access to models ahead of deployment.
We stand at the threshold of an era defined by AI. Our responsibility as legislators is clear: we cannot afford complacency, nor can we allow the UK to drift into a position where safety, transparency and accountability are afterthoughts, rather than foundational principles. The risks posed by advanced AI systems to our economy, our security and our very autonomy are real, escalating and well documented by the world’s leading experts. The United Kingdom has the scientific talent, the industrial capacity and the democratic mandate to lead in safe and trustworthy AI, but we lack the legislative framework to match that ambition. I urge the Government to urgently bring forward an AI Bill as a cross-party endeavour, and perhaps even set up a dedicated Select Committee for AI, given how serious the issue is.
I thank the hon. Gentleman—a fellow engineer—for allowing this intervention. As the Chair of the Science, Innovation and Technology Committee—a number of fantastic Committee members are here—I would like to say that we have already looked at some of the challenges that AI presents to our regulatory infrastructure and our Government. Last week, we heard from the Secretary of State, who assured us that where there is a legislative need, she will bring forward legislation to address the threats posed by AI, although she did not commit to an AI Bill. We are determined to continue to hold her to account on that commitment.
Iqbal Mohamed
I thank the hon. Lady for her intervention, and I am grateful for the work that her Select Committee is doing, but I gently suggest that we need representatives from all the other affected Select Committees, covering environment, defence and the Treasury, because AI will affect every single function of Government, and we need to work together to protect ourselves from the overall, holistic threat.
Each of the Select Committees is looking at AI, including the Defence Committee, which has looked at AI in defence. AI impacts every single Department, and security is a cross-governmental issue. Although we are not here to talk about the process of scrutiny, we all agree that scrutiny is important.
Iqbal Mohamed
I am glad to hear that.
If the United States and China race to build the strongest systems, let Britain be the nation that ensures the technology remains safe, accountable and under human control. That is a form of leadership every bit as important as engineering, and it is one that our nation is uniquely placed to deliver. This moment will not come again. We can choose to shape the future of AI, or we can wait for it to shape us. I believe that this country still has the courage, clarity and moral confidence to lead, and I invite the Government to take on that leadership role.
(4 months, 3 weeks ago)
Commons Chamber
Thank you, Madam Deputy Speaker. It is a great pleasure to speak on this occasion to welcome the ambition behind the life sciences innovation manufacturing fund and, indeed, the Government’s positive support for life sciences, with their belief that Government can act to support industry in general; it is not simply a matter of getting out of the way. That is in sharp contrast to the last Conservative Government’s approach to industry, allowing a gentle decline and deindustrialisation in our nation. To be fair, the series of Conservative Governments chopped and changed their approach to industrial strategy so often that it was difficult to know exactly where they stood. Unlike them, Labour is committed to the life sciences sector.
Labour published its plan for life sciences in opposition, which included 10-year funding commitments for key research bodies aimed at putting an end to the short-termism that undermines economic growth and scientific success. Now in government, I welcome Labour’s commitment to the life sciences sector plan—developed in close co-ordination with the Government’s 10-year health plan—which aims to support cutting-edge research and turn that into real-world results, with new treatments, faster diagnoses and more lives saved. It is about making sure that breakthroughs happen here in this country, creating jobs, improving lives in every part of the country and driving growth.
As the Minister said, the life sciences are a strength of our country—they are often described as a jewel in the crown of the British economy—and we all know that success in life sciences leads to positive, wide-reaching benefits across the country for the economy and our health.
Sorcha Eastwood (Lagan Valley) (Alliance)
You mentioned the sector’s relevance and benefit to the whole of the United Kingdom. Would you agree that Northern Ireland has a rich manufacturing and life sciences heritage and that we have a huge role to play?
Let me thank the hon. Member for that intervention, which pre-empts something I will say in a few minutes. She is absolutely right: Northern Ireland already plays an important role in the life sciences sector and life sciences manufacturing, and it will have an important role to play in the future.
It is an incredibly exciting time to be involved in life sciences. I often think that if I were a young engineer now—I studied electrical engineering—I would be fascinated by the life sciences and, in particular, synthetic biology, which offers so many potential opportunities for growth and wellbeing. It is an enabling technology across so many different sectors.
In Newcastle, including in my constituency of Newcastle upon Tyne Central and West, the life sciences contribute £1.7 billion and employ over 8,000 people across more than 200 companies. We are home to the National Innovation Centre for Ageing, Newcastle Helix and The Biosphere. Our city is one star in a constellation of excellent life sciences clusters across the north of England.
I really welcome the ambition of the innovation manufacturing fund. I ask the Minister for more clarity in his response in three particular areas. First, on the size of the fund: in the face of increased competition, and as the shadow Secretary of State described—though I would put it in less sensationalist terms—we are seeing some reduction in investment in the UK. Is £520 million enough to ensure that the UK is an attractive prospect for internationally mobile businesses? For context, a manufacturing plant such as Moderna’s recently opened vaccine centre in Oxfordshire might cost in the region of £150 million to £200 million. Is the fund the right size?
Secondly, the Select Committee recently held a one-off session on life sciences investment, which was of such interest that we have decided to hold another one-off session next week on the same subject. We heard evidence from the pharma sector, including significant support for the life sciences sector plan and for the Government’s approach, but I think it is fair to say that we were told that, although NHS pricing is not the only factor in investment decisions, it is a significant one. We heard evidence that the UK spends less proportionately on medicines than other comparable countries and that that reduces the pull-through for innovative medicines. It would clearly be a difficult decision to spend more on medicines, as that would mean spending less elsewhere in our NHS.
Does the Minister see the manufacturing fund as support in some way for investment decisions in the absence of progress on the NHS pricing discussions? Could he tell us whether the Secretary of State is involved in discussions between the Health Secretary and the pharma sector with regard to NHS pricing? I understand that discussions are ongoing, and I see the Under-Secretary of State for Health and Social Care, my hon. Friend the Member for Glasgow South West (Dr Ahmed), conferring with him. Perhaps he can confirm that those discussions are ongoing.
Lincoln Jopp (Spelthorne) (Con)
When the Committee held its one-off session on investment in life sciences, did it unearth the reasons why Sanofi, Eli Lilly and Merck have recently chosen to disinvest in life sciences in the UK?
I thank the hon. Member for that intervention. The Committee’s work is fascinating, so I certainly recommend he read the transcript. To summarise, we were looking specifically at the reasons for investment being pulled and, as I said, we asked the question in a number of different ways. The message that came back was significant support for the life sciences sector plan and the Government approach, but lack of certainty and clarity over NHS pricing and dismay about some aspects of NHS pricing and National Institute for Health and Care Excellence decisions. The hon. Gentleman is therefore right to point out that there was concern over the current and likely future pricing of innovative medicines, but that was not the only factor in those investment decisions. I ask the Minister to give us an update on those negotiations to the extent that he is able to do so, and to say whether this manufacturing fund is seen as potential compensation for investment in medicines and pricing as part of the NHS future plan.
My hon. Friend is making some interesting points about investment decisions. Has her Committee also investigated why some decisions have been made to bring investment into the UK, such as the recent decision about investment in Oxfordshire? As part of that, is there a parallel need to explore where more could be done to attract further investment through perhaps greater supply of trained workers, better transport, better access to land for development, and so on?
My hon. Friend makes an excellent point. My Committee has looked at some of the reasons for investments, such as those he sets out, and it is worth emphasising the strengths of the UK, some of which I have mentioned. We have a really strong life sciences sector, with skills at every stage of the UK life sciences ecosystem, together with R&D tax credits, which are a further incentive, and the fact that our NHS offers a fantastic opportunity to test and trial new medicines with a population that is heterogeneous and with population data records that are second to none. There are therefore many reasons why pharma and life sciences companies are continuing to invest in our country, and we have a fantastic ecosystem of life sciences start-ups and scale-ups.
That brings me to the final question I want to put to the Minister, which is on the regional impact of the fund. The Minister mentioned on a number of occasions that the fund will drive investment and growth across our country. As part of the Committee’s inquiry into innovation and regional growth, we heard of significant disparities in investment, particularly in access to capital and research funding from UK Research and Innovation and in funding and investment between the regions of our country and the greater south-east, otherwise known as the golden triangle. Manufacturing is well distributed across the United Kingdom; we heard earlier about the opportunities in Northern Ireland. Can the Minister tell me whether there will be a regional dimension to how the funds are disbursed? I hope that the extent to which the funds are regionally distributed will be monitored, but does he expect that this funding will be distributed across the country to drive growth in every corner of the country as he said, and that it will not perpetuate existing regional inequalities?
I call the Liberal Democrat spokesperson.
(5 months, 1 week ago)
Commons Chamber
The Secretary of State is absolutely right to champion access to a consistent, trusted digital ID. All of us online have digital IDs aplenty already—Facebook, TikTok, His Majesty’s Revenue and Customs, Tesco—so she is right to bring the benefits of one digital ID to my constituents. But making digital ID mandatory for everyone seeking work is poking a stick in the eye of all those with security, privacy and/or Government capacity concerns, which my Committee will be examining as part of our work on digital government. For now, though, can she first confirm that people will be in control of their digital ID data and who accesses it? Secondly, will she say whether it will be procured externally from the private sector or developed in-house by Government digital services?
My hon. Friend is right to raise the important issues of security—people are rightly concerned about the security of their data, and that is why that will be at the heart of our consultation. In answer to her specific questions: yes, people will control who sees and accesses their data, and we absolutely expect this system to be designed and built within Government, building on the One Login.
(8 months, 3 weeks ago)
Commons Chamber
I call the Chair of the Science, Innovation and Technology Committee.
AI is already prevalent in the workplace and in the education system, and we need to equip the next generation to be able to use AI tools productively and securely while also delivering on their unique potential as human beings. How is the Minister working with the Department for Education to ensure that the AI tools that are used in our education system support this kind of learning? Specifically, what advice has she given to the Department with regard to the procurement of edtech tools, which are widely available? Some are free and some need to be paid for, so how are schools to decide which to use?
As I have said, I work very closely with my counterparts in the Department for Education. Earlier this year, we launched safety standards for the sector and provided guidance on how to safely develop AI tools for education. The DFE has also provided guidance to schools on how to use AI safely. That work is ongoing. As I have said, we are working both with the sector and with educators to make sure that we get this right.
(9 months, 1 week ago)
Commons Chamber
With the leave of the House, Madam Deputy Speaker, I shall make a few comments, because it is important to respond to some of the questions that have been asked. Two of my hon. Friends referred to the report that the BFI published yesterday. I warmly commend it to all Members, not least because it makes points that others have made about AI, but also because it makes the point that if films and high-end television in the UK are to be successful in the future, we cannot have this critical shortfall in AI education, which is entirely piecemeal at the moment. We know about that in the Department, and it is one of the things that we want to change.
Several Members have asked who will be involved in the various different groups. I want to draw on all the expertise in both Houses to ensure that we can find the right answers. I do not want to undermine anything that the Select Committees might do, jointly or separately, and like my hon. Friend the Member for Bury North (Mr Frith), I am keen for all the parts of the creative industries to engage in this process. The difficulty is that we might end up with a very large roundtable, and people might have to bear with us when it comes to how we structure that.
I apologise for not being here earlier. I commend the Government for engaging in a cross-party discussion about AI, which is what the country needs to do, but the key issue is ensuring from the beginning that the tech companies understand that transparency in copyright and AI is not a “nice to have” but an absolute requirement, and that if they will not deliver it, the Minister will.
We have said from the very beginning that transparency is absolutely key to our ability to deliver the package that we would like to put together, and I do not resile from that, but it is only one part of the jigsaw that we need to join up.
I point out to the hon. Member for Gosport (Dame Caroline Dinenage) that some of the items on the amendment paper are things that the two Select Committees asked us to do. She is normally more generous to me, and to others, than she has been today. She has clearly forgotten that the last Government introduced plans that would have produced a text and data mining exemption for commercial exploitation of copyrighted materials without any additional protections for the creative industries. That seems to have slipped her mind.
We have moved a great deal since the introduction of the Bill. The Secretary of State for Culture, Media and Sport, the Secretary of State for Science, Innovation and Technology—who is sitting beside me—and I have moved. We have listened to their lordships, and, more importantly, we have listened to what the creative industries have had to say. The hon. Member for Perth and Kinross-shire (Pete Wishart) asked me whether I had ever known anything like this situation. Other bills have gone to five rounds of ping-pong, but in the past the row has always been about what is in the Bill, not what is not in the Bill. This is not an AI Bill, and it will not change the copyright regime in this country. I want that regime to be as robust as it ever has been, so that those in the creative industries can be remunerated and earn a living, as they deserve to. That is precisely what we intend to achieve, but we want to get the Bill on the statute book as soon as possible. That is why I need the House to vote with us this afternoon, and I hope that their lordships will agree with us tomorrow.
Question put.