(1 week, 1 day ago)
Commons Chamber
I thank the hon. Member for his point of order. The motion on the Order Paper is perfectly orderly, so Members will be invited to vote on that, not on the substance of any Bill that might come on 9 March. I think it is important that the House is clear on that.
Further to that point of order, Madam Deputy Speaker. How can I assess what is orderly for my contribution to the debate given that the substance of the motion is about process? To be frank, I do not want to speak about process; I want to speak about protections for children.
The motion is to give consideration to a Bill on the specific matter which has been outlined clearly on the Order Paper: “Protections for children from online harms”. I reassure the hon. Lady that any contribution she chooses to make on that matter would be in order.
I am grateful to the Liberal Democrats for bringing forward this debate on protecting children from online harms, although I remain uncertain as to the measures they are proposing. This debate is happening up and down the country, in homes and at school gates—indeed, wherever people gather—so it is right that we debate it here. If the Conservatives had done something during their critical 14 years of power, our children would be better protected now, but they did not, so it falls to us to take action.
I am going to speak about three things: online platforms, their history and approach; the work of my Select Committee, the Science, Innovation and Technology Committee, on algorithms; and the work of the Committee on digital childhood, all within the context of protecting children from online harms.
The key online players range in age from pre-teen—TikTok was founded in 2016—to their late 20s, as Google was founded in 1998. In human terms, these platforms are just entering or leaving adolescence, and it shows.
As hon. Members across the House may have heard me mention, I am an engineer—chartered, as it happens; thanks for asking—and my last job before entering this place was head of telecoms technology for Ofcom. I remember meeting people from a US platform, which shall remain nameless, around 2005. The company executive commented that they had come to the UK from Silicon Valley on a six-month contract to sort out Government affairs, and they could not understand why, two years later, discussions were still ongoing. Did we not realise that Government had no role in what they did?
I say that to illustrate that tech platforms have their origins in a libertarian, small/no-government tech bro bubble that has spread globally. TikTok, as a Chinese company, has a different background, but public accountability is not necessarily part of it. Unfortunately for all of us, the Conservative-Lib Dem Government of 2010 and their successors shared the view that Government should not be a part of it, which is how we arrived in 2024—20 years later—without online harms regulation, while at the same time the use of social media and life online have exploded. That is why I consider the Tory position in this debate to be a superb example of hypocrisy.
Monica Harding (Esher and Walton) (LD)
The hon. Lady is making a powerful speech about the evolution of social media platforms. I have four children; the first was born in 2004 and the last was born in 2011, so their births have spanned that evolution. Facebook began in 2004; TikTok began in 2016. If that evolution was the industrial revolution, we would be around the spinning jenny stage, with AI chatbots the next destination. Those chatbots are terribly dangerous for our children, and we need to regulate them now. That should be within the Online Safety Act.
I agree that AI chatbots are a further evolution, and I think we should learn from the lack of effective regulation under the Conservatives during that critical period in the evolution of the internet in how we approach AI. I agree with the hon. Lady that AI chatbots should be brought into the regulatory environment of the Online Safety Act.
My hon. Friend the Chair of the Select Committee is making an excellent speech. Her background in this area is really showing in the detail with which she is exploring these issues. Part of the challenge here is that we as parents are struggling to catch up with this revolution, which is gaining speed all the time. Perhaps my hon. Friend would highlight some of the challenges that parents face. For me, part of the importance of the consultation is to allow parents to think more deeply about this difficult issue; there are often different opinions from campaigners who have had the most painful experiences.
My hon. Friend makes an excellent point. It is for that exact reason that I support a consultation: this is part of a debate, and we all need to improve our understanding of the impacts of this technology. Parents are in a difficult position. I do not believe parents should have to be technology experts in order to give their children the best start in life, but unfortunately there is so much pressure in the online world that that seems to be the case right now, and that is why it is right that Government take action and consult on the action they take.
Let us think about the evolution of these technologies. I remember that when I joined Facebook in 2005 I had to use my university email address to join—that meant I had to be over 18. Some 20 years later, 13-year-olds and younger are having their lives and brains formed by almost uninhibited access to social media. In the UK, the number of social media users has gone from practically zero to four fifths of the population. I have worked with the Molly Rose Foundation, a charity established by the Russell family after their daughter Molly took her own life at the age of 14 following exposure to self-harm content online; I have spoken to the bereaved parents of children bullied to death online; and I have spoken to the Internet Watch Foundation about the horrendous images its staff see of child exploitation. The fact that the Conservatives did nothing in all those years in government is, in my view, a form of political negligence of the highest order.
As part of my Committee’s inquiry into social media and algorithms, Google, Meta, TikTok and X told us that they accepted their responsibility to be accountable to the British people through Parliament, which I thought was quite a step forward from previous utterances, and ongoing utterances, by some tech billionaires who shall remain nameless. Our inquiry found that our online safety regime should be based on principles that remain sound in the face of technological development. Social media makes many important and positive contributions, including helping to democratise access to a public voice and to connect people far and wide, but it also has significant risks—and those risks can evolve with the technology. We spoke about AI as an evolution, and one of the main failings of the Online Safety Act is that it regulates particular services rather than establishing principles that remain true and can be part of a social consensus as technology evolves.
Bobby Dean
The hon. Lady is making an excellent speech. Should one of those principles be related not only to content but to the addictive nature of these platforms? One of the changes I have witnessed on social media over time is algorithmic addiction. The greatest minds in the world are now working out the circuitry of our brains and driving content towards us so that we look at our screens for longer so that they can sell more ads. Does she agree with that point?
I really thank the hon. Member for that intervention, because that is exactly one of the recommendations of the Committee’s inquiry. As he says, the advertisement-based business models of most social media companies mean that they promote addictive content regardless of authenticity. This spills out across the entire internet via the unclear, under-regulated digital advertising market, incentivising the creation of content that will perform well on social media, as we saw during the 2024 unrest following the horrendous Southport attacks.
This is not just a social media problem, though. It is a systemic issue that promotes harmful content and undermines public trust. The Committee identified five key principles that we believe are crucial for building public trust. The first is public safety. Public safety matters; I hope it is not necessary to debate that. The second is free and safe expression, which is also very important. The third is responsibility on the part of the platforms. Right now, they have no legal responsibility for the content they amplify; they just have to follow their own processes in certain specific cases. Our fourth principle involves control, and the fifth and final principle is transparency. We made detailed recommendations on regulating the advertising-based business model so that amplification would not be incentivised in the way that was outlined by the hon. Member for Carshalton and Wallington (Bobby Dean). We also recommended a right to reset—the right of a person to remove their data from any algorithm.
Our report came out not long before the Minister took up his position. The Government accepted all our conclusions but none of our recommendations. I urge them to look again at our recommendations and to consider implementing them, or at least to respond and tell me why they are still not to be implemented. I welcome the Government’s recent actions and interventions and their readiness to intervene. As I said, the consultation is critical. I welcome the desire to promote a consensus and to take measures to ensure swift delivery of the consultation conclusions through the Children’s Wellbeing and Schools Bill. The consideration of the inclusion of AI chatbots is important, as is addressing the risky features in certain models, as well as providing support for bereaved parents. The Committee looks forward to working with the Government to try to achieve their aims. We need evidence to drive policy and regulation based on principles that the public can have confidence in.
Natasha Irons
I wanted to intervene on the point about principles, content and responsibility. I worked for Channel 4 before I came to this place, and we were regulated by Ofcom. Channel 4 did not create its own content, but was responsible for the editorialisation of that content. It was beholden to certain standards. Does she agree that we should be holding these media companies—they are not now “new media” companies, but legacy media companies—just as responsible for the content they put out on their platforms as any broadcaster?
My hon. Friend makes an important point; the insight she brings from her career in the media is critical. For many years, while the platforms were just that—platforms on which other people placed content—there was an argument that they should not be regulated and that they did not have a responsibility for the content on them, but they are at the very least active curators of that content now. Algorithms effectively form digital twins of individuals and then drive individualised content at them. That requires a responsibility. The time is right, as our Committee recommended, to ensure that platforms have responsibility for their content.
The Science, Innovation and Technology Committee will be holding a one-off session on social media age restrictions on 11 March to feed into the Government’s consultation on measures to keep children safe online and to hear from social media companies on their progress in the last year. We will also gauge the strength of the evidence for and against an age-based ban on social media, as well as any evidence relating to proposed alternatives to a ban. In doing so, we will hear from experts and representatives of those with direct experience of harms. We want to hear from both sides of the debate in the UK and will be seeking evidence from Australia on the first few months of the ban that is already in force there. We will be hearing from major social media and technology companies in a follow-up to our algorithms and misinformation inquiry, and we will ask for their views on the proposed age limits.
Finally, the work on social media age restrictions will feed into a larger inquiry on the neuroscience of digital childhood, which we will launch in the coming weeks. We want to find out how young people spending their formative years online affects their brains and what the Government should do to protect them from any negative impact. That could cover the impact of social media and other screentime on brain development, behaviour, and physical and mental health, whether positive or negative. It could also cover the physiological impact on eye development, the impact on socialisation and what actions Governments should take. There is a consensus on the need to do something, but not on what needs to be done. That is why we are seeking to provide evidence.
I always say to the platform companies that the opposite of regulation is not no regulation, but bad regulation. More regulation is coming. Several US states, such as California, have brought in new regulation on big tech. The Spanish Prime Minister has called social media a
“failed state where laws are ignored and crimes are tolerated”.
There is also the increasingly significant issue of technology sovereignty and whether we are too dependent on foreign companies for our online environment. I call myself a tech evangelist, and I am, but I also know how much an engineer costs. The starting salary of an AI engineer—if companies can find one—is well over £100,000 a year. Tech companies are not going to put them to work on protecting and keeping our children safe unless the House puts the right incentives in place. With all due respect to the Minister and the Online Safety Act, which he inherited, they are not in place now.
I appreciate the right hon. Gentleman’s intervention. [Interruption.] I am sorry to upset my hon. Friend the Member for Stoke-on-Trent Central (Gareth Snell). The Government are acting at pace, but we want to act in the right way. We must act in the right way because this is such a complex and serious issue. It is important for children to be able to seize the opportunities that being online can offer. We have heard about iPad-only schools. Parents must be confident that their children are safe—that is key. If we do not want to exclude children from age-appropriate services that benefit their wellbeing, we must act on the evidence and ensure that we strike the right balance between protecting children’s safety and wellbeing, and enabling them to use technology in positive and empowering ways.
Does my right hon. Friend share my disappointment that, in this debate on protecting children from some of the most obscene abuse, not one Reform Member is present?
(1 month, 1 week ago)
Commons Chamber
I welcome the consultation. We know that technology has changed childhood, we believe that it has changed child socialisation and we think that it may have changed brain development, perhaps even motor skills, but there is little concrete evidence beyond the individual terrible stories and, of course, the profits of the big tech platforms. That is why my Committee will soon be launching a digital childhood inquiry to examine these issues, hopefully in time to respond to the consultation.
May I, however, urge the Secretary of State not to assume that a ban will be the answer to the challenges that technology poses? We need to make tech work for all of us now. May I ask her to review her Department’s refusal to accept the recommendations of my Committee’s inquiry into social media and algorithms, particularly with regard to platform responsibility, user control, digital advertising and social media business models?
I thank my hon. Friend for that powerful and sensible question, and I welcome her Committee’s review, because those are hugely important matters. We should see this as being not only about social media, but about the use of phones and the issues affecting children in the digital world in which we now live. She will know, because I gave evidence to her Committee, that I am constantly reviewing our position on all the important points that she and the Committee raised in its last report, and that, in particular, the Minister for Digital Government and Data, who is also the Minister for Creative Industries, Media and Arts, is looking into the impact that advertising, social media and digital platforms can have. That is a firm commitment from the Government.
(1 month, 2 weeks ago)
Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record
Steve Witherden
I am in full agreement. As the hon. Member will see, one of my key asks is that we look at the funding for SDCs moving forward.
As a teacher, nothing matters more to me than ensuring that people have access to opportunity. As a drama teacher, STEM was never my strongest suit, but the importance of fostering curiosity—and, most importantly, ensuring that everyone can access it—has always been central to why I became a teacher and an MP.
Science and discovery centres (SDCs) operate in all four nations of the UK, reaching more than 5.2 million schoolchildren, families and communities through science and technology in the last year alone. Over the past two years, these organisations have worked with more than 37% of all UK schools. Fifty-five per cent of all visitors identify as women and girls, and many centres provide visits completely free of charge, enabling over 450,000 people from communities traditionally under-represented in STEM to participate in science, research and innovation each year. They are among the few places where broad and inclusive community engagement, the development of essential STEM skills among future generations, and cutting-edge scientific research all come together under one roof.
I congratulate my hon. Friend on securing this really important debate, and on his excellent and inspirational speech. The Centre for Life in Newcastle celebrated its 25th birthday last year. I have been inspired by its openness and how it supports young people from all backgrounds and different areas of the north-east to engage with life sciences. As an engineer from an impoverished background, knowing that the Centre for Life in Newcastle is opening up the huge universe of science and scientific curiosity is so reassuring.
Steve Witherden
It is lovely to hear that my hon. Friend is just as passionate about the SDC in her patch as I am about the one in mine.
Steve Witherden
I have a confession to make: after the Centre for Alternative Technology (CAT), Xplore!, in my hon. Friend’s constituency, is my second favourite SDC—I have visited it many times. I am in full agreement with him, as the House would expect.
Solefield school has brought pupils to the CAT for 40 years. Its head of science, Kevin Farmery, said:
“I can teach them all this in the science lab, but here they see it come to life. That makes a real impact.”
Dr Dai Morgan, who is now at the University of Cambridge, first visited the CAT as a child. That experience inspired him to study sustainable engineering, and he brings postgraduate students from Cambridge to the centre annually to encourage global action.
Our constituency may lack a university, but we have something better in the CAT. With its unique history, it continues to offer outstanding degree and postgraduate courses in partnership with Liverpool John Moores University and the University of East London. Currently, 700 postgraduate students are enrolled in programmes covering renewable energy, sustainable food and land use, sustainable architecture, green building, ecology and behaviour change.
The CAT’s influence extends beyond education. Its legacy includes the growth of over 50 sustainable businesses and organisations via its postgraduate students, inspired volunteers or research experiments that take place directly on site. Such organisations include Dulas, Aber Instruments, Adaptavate and IndiNature. Dulas, established at the CAT in 1982, invented a solar fridge that preserves vaccines and saves lives worldwide. IndiNature, founded by the CAT graduate Scott Simpson, was named manufacturer of the year by the UK Green Business Awards in 2025. The CAT is not just a centre; it is a catalyst for change locally, nationally and globally.
However, like many SDCs across the UK, the CAT is facing significant challenges. Unlike museums, art galleries, theatres and libraries, which can access Government and national lottery funding for their infrastructure needs, SDCs have historically been excluded from public funding. Like other publicly accessible cultural spaces, SDCs’ costs have risen significantly in recent years due to factors such as the cost of living crisis and energy prices. Unfortunately, these centres’ ability to grow revenues from their core audience to offset the increased costs is limited. They need to keep entry prices low and offer subsidised or free access to deliver their charitable mission and maintain access for underserved groups and communities.
As we have heard, most SDCs were built 25 years ago or more. Their buildings are reaching the end of their design life and need urgent repairs. Roofs are leaking, heating and cooling systems are outdated, and glazing no longer meets modern standards. At the same time, rising sustainability and health and safety requirements mean that repairs are far more expensive. These challenges are compounded by the fact that no central Government Department takes responsibility for the sector. Recent parliamentary questions have confirmed that the Department for Science, Innovation and Technology, the Department for Culture, Media and Sport and the Department for Education do not see SDCs as falling within their remits, leaving these centres at a loss.
The Association for Science and Discovery Centres has identified urgent infrastructure projects across its member organisations. Nearly £20 million is required to deliver these works, many of which must be completed within the next 12 to 18 months. Importantly, these projects would be match funded by the centres themselves, demonstrating both commitment and value for money. A December 2025 report made it clear that without that investment, many centres will be forced to close or to operate more commercially, scaling back STEM learning, outreach, and free or subsidised access for marginalised and minority groups. That would be a real loss, not only to communities but to the UK’s future skills pipeline.
The CAT faces similar pressures. Although it continues to welcome school groups, such as those from Solefield, it had to close its visitor centre to day visitors, and future Dai Morgans are currently unable to visit with their families. The visitor centre has seen no significant capital investment for over 25 years and is in desperate need of redevelopment. Unlike universities and many charities in Wales, the CAT receives no statutory core revenue funding.
An urgent example of the work that needs to be done is the “leaky roof” project. As anyone who has visited the area knows, it rains a lot in mid-Wales. The CAT requires £500,000 to keep open the Wales Institute for Sustainable Education building—an education centre that has grown graduate courses and the innovation lab, supporting councils, communities and other organisations to take action on the climate and nature emergencies. If it is forced to close, the CAT’s entire operating model would be undermined, threatening its unique hands-on climate and sustainability education programmes.
The project is not about patching roofs simply to keep buildings open; it is about preserving the science, engagement and learning that happens beneath those roofs. SDCs are powerful but undervalued. They are beacons of sustainability, education and innovation. With recognition and investment, they can flourish, supporting national climate goals, inspiring future scientists and engineers, and ensuring that science remains accessible to all.
Given that SDCs are uniquely positioned to help unlock the full potential of UK science and technology, in order to drive growth, create jobs and ensure that all citizens live healthy, secure and sustainable lives, thereby delivering on DSIT’s science and technology framework, does the Minister accept that, although the work of the centres touches on the agendas of DSIT, DCMS and the DFE, DSIT should become the lead Department responsible for this area? That is not to suggest that all funding should come from DSIT, or that cross-departmental responsibilities should be relinquished; rather, it is to suggest that his Department should take the lead in developing shared solutions.
I thank my hon. Friend for the passionate points that he is making. I want to support him by pointing out that the answer to a parliamentary question of mine in October stated that the Minister for Science, Lord Vallance, was following up
“with the Department for Culture, Media and Sport to explore a coordinated approach to supporting these centres.”
Just before Christmas, the Secretary of State for Science, Innovation and Technology wrote to me to say that
“officials from across departments with an interest in SDCs are meeting to discuss options for sustainable support.”
Does my hon. Friend agree that it is time we had an answer to the question of where sustainable support—which, as he said, DSIT should lead—would come from?
Steve Witherden
Yes, I do. I do not think that DCMS and DFE should be completely absent from the equation, but I agree that DSIT should lead.
What meaningful action does the Minister intend to take to address the funding and infrastructure challenges currently faced by science centres? Will he respond to the request from the Association for Science and Discovery Centres, supported by more than 3,100 leading scientists, academics, business leaders and educators in an open letter to the Prime Minister and the Department late last year, for £19.5 million of public funding, match funded by £19.5 million from the centres themselves, which is essential to address immediate infrastructure risks?
Does the Minister also agree that it is essential to formally recognise science centres as part of the UK’s scientific and cultural ecosystem, whether by expanding eligibility for existing funding streams or by creating a dedicated science engagement fund? Does he agree that it is unfair for SDCs to be excluded from public infrastructure funds that are available to comparable organisations, including museums and libraries?
I urge the Minister to meet the Association for Science and Discovery Centres and its members, and work with them and MPs representing science centres to find a solution to these issues. Will he collaborate with colleagues in DCMS, the DFE, English mayoral combined authorities and the devolved Governments in Wales, Scotland and Northern Ireland to ensure that SDCs and their work are adequately recognised and supported? Solutions must work across all four nations.
(1 month, 2 weeks ago)
Commons Chamber
I call the Chair of the Select Committee, Chi Onwurah.
Unlike her shadow, the Secretary of State was rightly passionate when calling out these sexually abusive images. The libertarian tech bro lobby has to accept that consent counts online, too. In her letter to me today, the Secretary of State said that the Online Safety Act was designed to deal with this, but she is being overly generous to the previous Government. The Act was designed, or fudged, to give adults some protection from illegal content on certain services, and to protect children from harmful content more generally, but not including generative AI, and without making platforms responsible for content that they share. Will my right hon. Friend now accept my Committee’s recommendations, and do more to explicitly plug the gaps in the Act, particularly regarding generative AI, as well as tackling the social media business models that incentivise the content that we are talking about?
I am genuinely grateful to my hon. Friend for all the work she and her Committee have done on this issue. I have read its work in detail since coming into post. She will know that I have already said on the issue of AI chatbots, for example, that some are covered by the Act—if they do live searches or share user-to-user content—but I have asked my officials to see where there are gaps. They have said that there are gaps, and I have said that I want to plug them, including by legislating, if that is necessary.
This is a fast-moving area. With the Online Safety Act, plus the additional measures we have taken in the Data (Use and Access) Act 2025 and that we will take in the Crime and Policing Bill, we have quite a comprehensive suite of powers here, but I know this is developing quickly, particularly around generative AI. I am always prepared to look to the facts and the evidence and go where that leads me, and if I need to take further action, I will.
(2 months, 3 weeks ago)
Westminster Hall
Iqbal Mohamed
I completely agree. We have to consider the functionality available in these tools and the way they are used—wherever regulations exist for that service in our society, the same regulations should be applied to automated tools providing that service. Clearly, controlling an automated system will be more difficult than training healthcare professionals and auditing their effectiveness.
I congratulate the hon. Member on securing this really important debate. It is certainly the case that UK law applies to AI, just as it applies online. The question is whether AI requires new regulation specifically to address the threats and concerns surrounding AI. We refrained from regulating the internet—and I should declare an interest, having worked for Ofcom at the time—in order to support innovation. Under consecutive Conservative Governments, there was a desire not to intervene in the market. The internet has largely been taken over by large consolidated companies and does not have the diversity of innovation and creativity or the safety that we might want to see.
Iqbal Mohamed
The enforcement processes that we have for existing regulations where human beings are providing that service are auditable. We do not have enforcement mechanisms for this kind of regulated service or information being provided by the internet or AI tools. There is a need to extend the scope of regulation but also the way in which we enforce that regulation for automated tools.
I am a fan of innovation, growth and progress in society. However, we cannot move forward with progress at any cost. AI poses such a significant risk that if we do not regulate at the right time, we will not have a chance to get it back under control—it might be too late. Now is the time to start looking at this seriously and supporting the AI industry so that it is a force for good in society, not a future force of destruction.
We are all facing a climate and nature emergency. AI is driving unprecedented growth in energy demand. According to the International Energy Agency, global data-centre electricity consumption is set to rise to slightly more than Japan’s total electricity consumption today. A House of Commons Library research briefing found that UK data centres currently consume 2.5% of the country’s electricity, with the sector’s consumption expected to rise fourfold by 2030. The increased demand strains the grid, slows the transition to renewables and contributes to emissions that drive climate change. AI’s growth must go hand in hand with our climate change obligations.
Members have probably heard and read about AI’s impact on the job market. One of the clearest harms we are already seeing is the loss of jobs. That is not a future worry; it is happening now. Independent analysis shows that up to 8 million UK jobs are at risk from AI automation, with admin, customer service and junior professional roles being the most exposed. Another harm that we are already facing is the explosion of AI-driven scams. Generative AI-enabled scams have risen more than 450% in a single year, alongside a major surge in breached personal data and AI-generated phishing attempts. Deepfake-related fraud has increased by thousands of per cent, and one in every 20 identity-verification failures is now linked to AI manipulation.
I move on to the ugly: the threat to the world. The idea that AI developers may lose control of the AI systems they create is not science fiction; it is the stated concern of the scientists who build this technology—the godfathers of AI, as we call them. One of them, Yoshua Bengio, has said:
“If we build AIs that are smarter than us and are not aligned with us and compete with us, then we’re basically cooked”.
Geoffrey Hinton, another godfather of AI and a winner of the Nobel prize in physics, said:
“I actually think the risk is more than 50% of the existential threat”.
Stuart Russell, the author of the standard AI textbook, says that if we pursue our current approach
“then we will eventually lose control over the machines.”
In May 2023, hundreds of AI researchers and industry leaders signed a statement declaring:
“Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war”.
That is not scaremongering; these are professional experts who are warning us to make sure that this technology does not get out of control.
Iqbal Mohamed
The hon. Member touches on a broader point: any area with specialist requirements for end users, or where a tool is used by a particular audience or demographic, must directly involve those users and relevant experts in the development, testing, verification and follow-up auditing of the effectiveness of those tools.
AI companies are racing to build increasingly capable AI with the explicit end goal of creating systems that equal or exceed the most capable human intellectual ability across all domains. AI companies are also pursuing AI that can be used to accelerate their own AI development, making it a self-developing, self-perpetuating technology. For that reason, many experts, some of whom I have quoted, say that this will lead to artificial superintelligence soon after. ASI is an AI system that significantly exceeds the upper limit of human intellectual ability across all domains. The concerns, risks and dangers of AI are current and will only get worse. We are already seeing systems behave in ways that no one designed, deceiving users, manipulating their environments and showing the beginnings of self-preserving strategies: exactly the behaviours that researchers predicted if AI developed without restraint.
There are documented examples of deception, in which an AI got a human to approve something by lying, claiming to be a person with a visual impairment. An example of manipulation can be found in Meta’s CICERO, an AI trained to play the game of “Diplomacy”, which achieved human-level performance by negotiating, forming alliances and then breaking them when it benefited. Researchers noted that it used language strategically to mislead other players. That was not a glitch; it was the system discovering manipulation as an effective strategy. It taught itself to deceive others to achieve an outcome.
Even more concerning are cases where models behave in ways that resemble self-preservation. In recent tests on the DeepSeek R1 model, researchers found that it concealed its intentions, produced dangerously misleading advice and attempted to hack its reward signals when placed under pressure—behaviours it was never trained to exhibit. Those are early signs of systems acting beyond our instructions.
More advanced systems are on the horizon. Artificial general intelligence and even artificial superintelligence are no longer confined to speculative fiction. As lawmakers, we must understand their potential impacts and ensure that we establish the rules, standards and safeguards necessary to protect our economy, environment and society if things go wrong. The potential risks, including extreme risks, posed by AI cannot be dismissed; they may be existential and could cause the end of our species. The potential extinction risk from advanced AI, particularly through the emergence of superintelligence, stems from its capacity to process vast amounts of data, to demonstrate superior reasoning across domains and to constantly seek to improve itself, ultimately outpacing our ability to stop it in its tracks.
The dangers of AI are rising. As I have said, AI is already displacing jobs, amplifying existing social and economic inequalities and threatening civil liberties. At the extreme, unregulated progress may create national security vulnerabilities with implications for the long-term survival of the human species. Empirical research in 2024 showed that an OpenAI model occasionally displayed strategic deception in controlled environments. In one case, an AI was found to bypass its own testing containment through a back door that it created. Even when developed in environments that are allegedly ringfenced and disconnected from the wider world, AI is intelligent enough to find ways out.
Right now, there is a significant lack of legislative measures to counter those developments, despite top AI engineers asking us for them. We currently have a laissez-faire system in which a sandwich is subject to more regulation than AI companies are, and nothing like the rigorous safety standards placed on pharmaceutical or aviation companies to protect public health. The UK cannot afford to fall behind on this.
I do not want to dwell on doom and gloom; there is hope. The European Union, California and New York are leading the way on strong AI governance. The EU AI Act establishes a comprehensive, risk-based regulatory framework. California is advancing detailed standards on system evaluations and algorithmic accountability, and New York has pioneered transparency and bias-audit rules for automated decision making. Those approaches show that democratic nations can take bold, responsible action to protect their citizens while fostering innovation.
We in the UK are fortunate to have a world-leading ecosystem of AI safety researchers. The UK AI Security Institute conducts essential work testing frontier models for dangerous capabilities, but it currently relies on companies’ good will for pre-deployment access to their models.
We stand at the threshold of an era defined by AI. Our responsibility as legislators is clear: we cannot afford complacency, nor can we allow the UK to drift into a position in which safety, transparency and accountability are afterthoughts rather than foundational principles. The risks posed by advanced AI systems to our economy, our security and our very autonomy are real, escalating and well documented by the world’s leading experts. The United Kingdom has the scientific talent, the industrial capacity and the democratic mandate to lead in safe and trustworthy AI, but we lack the legislative framework to match that ambition. I urge the Government to bring forward an AI Bill urgently as a cross-party endeavour, and perhaps even to set up a dedicated Select Committee on AI, given how serious the issue is.
I thank the hon. Gentleman—a fellow engineer—for allowing this intervention. As the Chair of the Science, Innovation and Technology Committee—a number of fantastic Committee members are here—I would like to say that we have already looked at some of the challenges that AI presents to our regulatory infrastructure and our Government. Last week, we heard from the Secretary of State, who assured us that where there is a legislative need, she will bring forward legislation to address the threats posed by AI, although she did not commit to an AI Bill. We are determined to continue to hold her to account on that commitment.
Iqbal Mohamed
I thank the hon. Lady for her intervention, and I am grateful for the work that her Select Committee is doing, but I gently suggest that we need representatives from all the other affected Select Committees, covering the environment, defence and the Treasury, because AI will affect every single function of Government, and we need to work together to protect ourselves from the threat as a whole.
Each of the Select Committees is looking at AI, including the Defence Committee, which has looked at AI in defence. AI impacts every single Department and raises security issues that cut across Government. Although we are not talking about the process of scrutiny, we all agree that scrutiny is important.
Iqbal Mohamed
I am glad to hear that.
If the United States and China race to build the strongest systems, let Britain be the nation that ensures the technology remains safe, accountable and under human control. That is a form of leadership every bit as important as engineering, and it is one that our nation is uniquely placed to deliver. This moment will not come again. We can choose to shape the future of AI, or we can wait for it to shape us. I believe that this country still has the courage, clarity and moral confidence to lead, and I invite the Government to take on that leadership role.
(4 months, 1 week ago)
Commons Chamber

Thank you, Madam Deputy Speaker. It is a great pleasure to speak on this occasion to welcome the ambition behind the life sciences innovation manufacturing fund and, indeed, the Government’s positive support for life sciences, with their belief that Government can act to support industry in general; it is not simply a matter of getting out of the way. That is in sharp contrast to the last Conservative Government’s approach to industry, which allowed a gentle decline and deindustrialisation in our nation. To be fair, the series of Conservative Governments chopped and changed their approach to industrial strategy so often that it was difficult to know exactly where they stood. Unlike them, Labour is committed to the life sciences sector.
Labour published its plan for life sciences in opposition, which included 10-year funding commitments for key research bodies aimed at putting an end to the short-termism that undermines economic growth and scientific success. Now that Labour is in government, I welcome its commitment to the life sciences sector plan—developed in close co-ordination with the Government’s 10-year health plan—which aims to support cutting-edge research and turn that into real-world results, with new treatments, faster diagnoses and more lives saved. It is about making sure that breakthroughs happen here in this country, creating jobs, improving lives in every part of the country and driving growth.
As the Minister said, the life sciences are a strength of our country—they are often described as a jewel in the crown of the British economy—and we all know that success in life sciences leads to positive, wide-reaching benefits across the country for the economy and our health.
Sorcha Eastwood (Lagan Valley) (Alliance)
You mentioned the sector’s relevance and benefit to the whole of the United Kingdom. Would you agree that Northern Ireland has a rich manufacturing and life sciences heritage and that we have a huge role to play?
Let me thank the hon. Member for that intervention, which pre-empts something I will say in a few minutes. She is absolutely right: Northern Ireland already plays an important role in the life sciences sector and life sciences manufacturing, and it will have an important role to play in the future.
It is an incredibly exciting time to be involved in life sciences. I often think that if I were a young engineer now—I studied electrical engineering—I would be fascinated by the life sciences and, in particular, synthetic biology, which offers so many potential opportunities for growth and wellbeing. It is an enabling technology across so many different sectors.
In Newcastle, including in my constituency of Newcastle upon Tyne Central and West, the life sciences contribute £1.7 billion and employ over 8,000 people across more than 200 companies. We are home to the National Innovation Centre for Ageing, Newcastle Helix and The Biosphere. Our city is one star in a constellation of excellent life sciences clusters across the north of England.
I really welcome the ambition of the innovation manufacturing fund, and I ask the Minister for more clarity in three particular areas in his response. First, on the size of the fund: in the face of increased competition, and as the shadow Secretary of State described—though I will put it in less sensationalist terms—we are seeing some reduction in investment in the UK. Is £520 million enough to ensure that the UK is an attractive prospect for internationally mobile businesses? By contrast, a manufacturing plant such as Moderna’s recently opened vaccine centre in Oxfordshire might cost in the region of £150 million to £200 million. Is the fund the right size?
Secondly, the Select Committee recently held a one-off session on life sciences investment, which was of such interest that we have decided to hold another one-off session next week on the same subject. We heard evidence from the pharma sector, including significant support for the life sciences sector plan and for the Government’s approach, but I think it is fair to say that we were told that, although NHS pricing is not the only factor in investment decisions, it is a significant one. We heard evidence that the UK spends less proportionately on medicines than other comparable countries and that that reduces the pull-through for innovative medicines. It would clearly be a difficult decision to spend more on medicines, as that would mean spending less elsewhere in our NHS.
Does the Minister see the manufacturing fund as support in some way for investment decisions in the absence of progress on the NHS pricing discussions? Could he tell us whether the Secretary of State is involved in discussions between the Health Secretary and the pharma sector with regard to NHS pricing? I understand that discussions are ongoing, and I see the Under-Secretary of State for Health and Social Care, my hon. Friend the Member for Glasgow South West (Dr Ahmed), conferring with him. Perhaps he can confirm that those discussions are ongoing.
Lincoln Jopp (Spelthorne) (Con)
When the Committee held its one-off session on investment in life sciences, did it unearth the reasons why Sanofi, Eli Lilly and Merck have recently chosen to disinvest in life sciences in the UK?
I thank the hon. Member for that intervention. The Committee’s work is fascinating, so I certainly recommend he read the transcript. To summarise, we were looking specifically at the reasons for investment being pulled and, as I said, we asked the question in a number of different ways. The message that came back was significant support for the life sciences sector plan and the Government approach, but lack of certainty and clarity over NHS pricing and dismay about some aspects of NHS pricing and National Institute for Health and Care Excellence decisions. The hon. Gentleman is therefore right to point out that there was concern over the current and likely future pricing of innovative medicines, but that was not the only factor in those investment decisions. I ask the Minister to give us an update on those negotiations to the extent that he is able to do so, and to say whether this manufacturing fund is seen as potential compensation for investment in medicines and pricing as part of the NHS future plan.
My hon. Friend is making some interesting points about investment decisions. Has her Committee also investigated why some decisions have been made to bring investment into the UK, such as the recent decision about investment in Oxfordshire? As part of that, is there a parallel need to explore where more could be done to attract further investment through perhaps greater supply of trained workers, better transport, better access to land for development, and so on?
My hon. Friend makes an excellent point. My Committee has looked at some of the reasons for investments, such as those he sets out, and it is worth emphasising the strengths of the UK, some of which I have mentioned. We have a really strong life sciences sector, with skills at every stage of the UK life sciences ecosystem, together with R&D tax credits, which are a further incentive, and the fact that our NHS offers a fantastic opportunity to test and trial new medicines with a population that is heterogeneous and with population data records that are second to none. So there are many reasons why pharma and life sciences companies are continuing to invest in our country, and we have a fantastic ecosystem of life sciences start-ups and scale-ups.
That brings me to the final question I want to put to the Minister, which is on the regional impact of the fund. The Minister mentioned on a number of occasions that the fund will drive investment and growth across our country. As part of the Committee’s inquiry into innovation and regional growth, we heard of significant disparities in investment, particularly in access to capital and research funding from UK Research and Innovation and in funding and investment between the regions of our country and the greater south-east, otherwise known as the golden triangle. Manufacturing is well distributed across the United Kingdom; we heard earlier about the opportunities in Northern Ireland. Can the Minister tell me whether there will be a regional dimension to how the funds are disbursed? I hope that the extent to which the funds are regionally distributed will be monitored, but does he expect that this funding will be distributed across the country to drive growth in every corner of the country as he said, and that it will not perpetuate existing regional inequalities?
I call the Liberal Democrat spokesperson.
(4 months, 2 weeks ago)
Commons Chamber

The Secretary of State is absolutely right to champion access to a consistent, trusted digital ID. All of us online have digital IDs aplenty already—Facebook, TikTok, His Majesty’s Revenue and Customs, Tesco—so she is right to bring the benefits of one digital ID to my constituents. But making digital ID mandatory for everyone seeking work is poking a stick in the eye of all those with security, privacy and/or Government capacity concerns, which my Committee will be examining as part of our work on digital government. For now, though, can she first confirm that people will be in control of their digital ID data and who accesses it? Secondly, will she say whether it will be procured externally from the private sector or developed in-house by Government digital services?
My hon. Friend is right to raise the important issue of security—people are rightly concerned about the security of their data, and that is why security will be at the heart of our consultation. In answer to her specific questions: yes, people will control who sees and accesses their data, and we absolutely expect this system to be designed and built within Government, building on the One Login.
(8 months, 1 week ago)
Commons Chamber

I call the Chair of the Science, Innovation and Technology Committee.
AI is already prevalent in the workplace and in the education system, and we need to equip the next generation to be able to use AI tools productively and securely while also delivering on their unique potential as human beings. How is the Minister working with the Department for Education to ensure that the AI tools that are used in our education system support this kind of learning? Specifically, what advice has she given to the Department with regard to the procurement of edtech tools, which are widely available? Some are free and some need to be paid for, so how are schools to decide which to use?
As I have said, I work very closely with my counterparts in the Department for Education. Earlier this year, we launched safety standards for the sector and provided guidance on how to develop AI tools safely for education. The DfE has also provided guidance to schools on how to use AI safely in schools. That work is ongoing. As I have said, we are working both with the sector and with educators to make sure that we get this right.
(8 months, 3 weeks ago)
Commons Chamber

With the leave of the House, Madam Deputy Speaker, I shall make a few comments, because it is important to respond to some of the questions that have been asked. Two of my hon. Friends referred to the report that the BFI published yesterday. I warmly commend it to all Members, not least because it echoes points that others have made about AI, but also because it makes the point that if films and high-end television in the UK are to be successful in the future, we cannot have this critical shortfall in AI education, which is entirely piecemeal at the moment. We know about that in the Department, and it is one of the things that we want to change.
Several Members have asked who will be involved in the various groups. I want to draw on all the expertise in both Houses to ensure that we can find the right answers. I do not want to undermine anything that the Select Committees might do, jointly or separately, and like my hon. Friend the Member for Bury North (Mr Frith), I am keen for all parts of the creative industries to engage in this process. The difficulty is that we might end up with a very large roundtable, and people might have to bear with us when it comes to how we structure that.
I apologise for not being here earlier. I commend the Government for engaging in a cross-party discussion about AI, which is what the country needs to do, but the key issue is ensuring from the beginning that the tech companies understand that transparency in copyright and AI is not a “nice to have” but an absolute requirement, and that if they will not deliver it, the Minister will.
We have said from the very beginning that transparency is absolutely key to our ability to deliver the package that we would like to put together, and I do not resile from that, but it is only one part of the jigsaw that we need to join up.
I point out to the hon. Member for Gosport (Dame Caroline Dinenage) that some of the items on the amendment paper are things that the two Select Committees asked us to do. She is normally more generous to me, and to others, than she has been today. She has clearly forgotten that the last Government introduced plans that would have produced a text and data mining exemption for commercial exploitation of copyrighted materials without any additional protections for the creative industries. That seems to have slipped her mind.
We have moved a great deal since the introduction of the Bill. The Secretary of State for Culture, Media and Sport, the Secretary of State for Science, Innovation and Technology—who is sitting beside me—and I have moved. We have listened to their lordships, and, more importantly, we have listened to what the creative industries have had to say. The hon. Member for Perth and Kinross-shire (Pete Wishart) asked me whether I had ever known anything like this situation. Other Bills have gone to five rounds of ping-pong, but in the past the row has always been about what is in the Bill, not what is not in the Bill. This is not an AI Bill, and it will not change the copyright regime in this country. I want that regime to be as robust as it ever has been, so that those in the creative industries can be remunerated and earn a living, as they deserve to. That is precisely what we intend to achieve, but we want to get the Bill on the statute book as soon as possible. That is why I need the House to vote with us this afternoon, and I hope that their lordships will agree with us tomorrow.
Question put.
(9 months, 2 weeks ago)
Commons Chamber

I call the Chair of the Science, Innovation and Technology Committee.
I really welcome the US-UK trade deal and the fact that the Secretary of State and the Prime Minister kept their commitment not to put online safety on the table in those negotiations. My Committee’s inquiry into social media misinformation and algorithms has heard evidence that the algorithms in social media drive the spread of misinformation, and we saw the consequences of that in the summer riots. Will the Secretary of State confirm that, as well as not watering down the Online Safety Act, he will look to strengthen it and is discussing how to do so with our allies in the US?