Online Harm: Child Protection

Emily Darlington Excerpts
Tuesday 24th February 2026

Commons Chamber
Emily Darlington (Milton Keynes Central) (Lab)

This week is Eating Disorders Awareness Week, and we must remember the acceleration of online harms. We have heard horrific accounts of ChatGPT giving young people diets of 600 calories per day, which is just appalling. We know the suffering and pain caused by seeing images tagged with the terms “ana”, “thinspiration” and other terms that should go. The promotion of such content is now a category 1 offence, and Ofcom should be weeding it out. The hon. Member for Winchester (Dr Chambers) is absolutely right to say that that measure should be extended to bots.

I thank the Chair of the Science, Innovation and Technology Committee, my hon. Friend the Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah), for her fantastic speech. We have taken this matter seriously since the very beginning of the parliamentary Session, and we have done a lot of work on it. I echo her call for Ministers to look again at the recommendations in our Committee’s “Social media, misinformation and harmful algorithms” report, which goes well beyond misinformation and into how the damage is done.

Protecting our children and young people online is extremely important. The Online Safety Act was an important step forward, but it has not been fully implemented by Ofcom, it is not proactive enough, and it is too dependent on what social media companies themselves tell Ofcom. In the spirit of consultation—I know that we will get to that—I have done my own consultation with 500-plus 14 to 16-year-olds across my Milton Keynes Central constituency. Some 91% of them have a phone, and 80% have social media profiles. However, what will surprise the House is what young people consider social media profiles to be. We consider them to be Facebook or Instagram, while they consider them to be YouTube and Roblox—two organisations not covered by the Australian model. Additionally, 74% of those 14 to 16-year-olds spend two to seven hours online a day. Let me remind the House that, at that age, the brain development of young women is close to finished, while for young men, whose brain development does not finish until they are about 25, it is nowhere near complete. We know that from the science—just to be clear, that is not an opinion. Brain development in young women and girls happens differently, so should we therefore have different rules for young women and men?

Fifty-nine per cent of the 14 to 16-year-olds have been contacted by strangers, and more than a third of those contacts were through Roblox, which is not covered by the Australian social media ban. Thirty-three per cent have been bullied, and a third of those were on Roblox. The Australian social media ban—which I assume is what the Liberal Democrats are talking about when they say they are in favour of a ban—does not cover YouTube or Roblox, and we have not even looked at whether it is effective. A ban is a blunt tool that essentially raises the flag of surrender to social media platforms and declares that there is no way of making social media safe. That is essentially what the Conservatives did when the Online Safety Act 2023 was passed: they said, “We cannot go far enough, so we are going to roll back. It is about free speech.” No, it is not about free speech. Freedom of speech was written into law in this country and spread around the world, so we understand how to protect it and limit its harm. The Online Safety Act was a missed opportunity. It also took seven years to get through this House, but we do not have seven years to wait.

There would also be unintended consequences to a ban. I had the pleasure of meeting Ian Russell the other night, and we had a really powerful discussion. My heart goes out to him, as one parent to another, given what his family have been through. He does not jump to the easy solution of a social media ban. The Molly Rose Foundation has done a brilliant briefing paper, which every MP should read, about why it does not support a ban: it wants the online world to be safe for children, but a ban does not make it so.

Matt Rodda

My hon. Friend is making an excellent speech. I commend her work in reaching out to young people; it sounds superb. The lesson may be that we should all do exactly that. I am running a survey myself. She mentioned the Molly Rose Foundation, and I have met some of its staff to discuss its work. A family in my constituency of Reading suffered a terrible incident—their son was murdered in an incident of online bullying—and they have a different view. Does my hon. Friend agree that it is important that we properly listen to the families and consider the different views in the consultation?

Emily Darlington

I absolutely do. My full sympathy goes to that family in my hon. Friend’s constituency—it is the worst thing in the world for a parent to lose a child. But we have to get this right, which is why it is right that we have a consultation. It does no child any good if we jump to a conclusion that does not actually protect children.

Although I maintain an open mind, I worry about a full ban. Some children rely on social media for connection, often including those who are exploring their sexuality—LGBTQ+ people—and those who are neurodivergent. The consequences for them could be devastating, so we need to consider their views. If young people get around the ban, as they do in Australia, they are less likely to report when they see harmful content or are being targeted on social media, because they worry that they will get in trouble for breaking the law.

A ban would create a cliff edge at 16. No matter the person’s maturity—I have already talked about the different brain development in young women and men—their skills or what they have been taught, there is a cut-off at 16. All of a sudden it does not matter, and they go into a world that is not safe. Younger children do not have their own social media profiles; they use their parents’ devices. Often, they start with a video of Peppa Pig, and all of a sudden—who knows where it ends up? A ban would not address that. So, what is the solution? Doing nothing is not an option—I think the whole House can agree on that.

Monica Harding

I was interested in the hon. Member’s survey. I have done my own very unscientific survey of young people, and all of them seem to want some form of regulation. With that in mind, we must hurry up—does the hon. Member agree?

--- Later in debate ---
Emily Darlington

I absolutely agree. Young people, particularly those in the mid-teenage years, understand this issue in a way that sometimes we do not because, quite frankly, our online experience is completely different from theirs. If Members want to test that, they should open an app such as Pinterest and compare what is fed into their Pinterest boards with what is fed into their child’s Pinterest boards. It is a completely different experience. If Members do not have children, they should ask a younger member of staff to open the same app on the different phones, and they will see a completely different world.

Kirsty Blackman

A local organisation in my constituency, CyberSafe Scotland, surveyed children about what they were being fed on TikTok. There is a road in my constituency called North Anderson Drive, and children on one side of North Anderson Drive were being fed different content to the children on the other side of it. It is not just an age thing; it is really specific, and we cannot understand what each individual person is seeing because it is different for everybody.

Emily Darlington

That is a very important point about how sophisticated the technology has become. When we ask companies to take action to stop outcomes, the technology exists to do that. We are not asking them to reinvent the wheel or come up with new technology. It already exists because they are even microtargeting two different sides of the road.

Having discussed this with experts, parents and—most importantly—young people, what do I think we need to consider? First, we need to fully and properly implement the Online Safety Act 2023. That must be done at speed, and it requires nothing from the House. It has been a request of the Secretary of State and the Minister, and I recommend that Ofcom gets on and does that as quickly as possible. We must make safe spaces for children online. How do we do that? Part of the answer is ensuring that content is related to ratings that we already understand as parents, such as those from the British Board of Film Classification. I have been asking YouTube what rating YouTube Kids has for about a year now. Is it rated U? Is it 12A? Is it 15? It cannot tell me because it does not do things on that basis.

As a parent I want to know the rating before allowing my children on an app, because parents have a role in this as well. All apps should be rated like video games. Roblox has a 5+ rating, which does not exist in video game ratings. We see ratings such as 4+ or 9+, but those are made up. At the parents’ forum that I held after the survey, one parent said that she walked in on her nine-year-old playing “guns versus knives”—on an app that is rated 5+. The ratings on apps mean nothing, yet we have video game ratings that we as parents understand, so why are they not used? Should in-app purchases ever be allowed for young children? What is the age at which in-app purchases should be allowed in a game?

We must consider the time limits for the different stages of brain development. We have guides on fruit and vegetables that recommend five a day to parents. We all know that. Schools use the same language, we use the same language, yet we have nothing to support parents in deciding how long a child should be online at different stages of brain development. I hope that the evidence that the Science, Innovation and Technology Committee collects will help inform that.

We need to change addictive and radicalising platform algorithms. To protect children from child sexual abuse images, we need to talk to those behind iOS and Android to stop the creation of self-generated child sexual abuse images—some 70% to 80% of child sexual abuse images are self-generated—and we need to stop end-to-end encrypted services from sharing them. We have technology that can do that. We should always keep the ability to ban in our pockets, but any ban should be for particular apps. We should not ban our children and young people from having an online experience that is good.

Cyber Security and Resilience (Network and Information Systems) Bill (Seventh sitting)

Emily Darlington Excerpts
David Chadwick

I think this once more comes down to state capacity and how we see the state’s role. Clearly there needs to be an expansion of the state’s powers—that is why the Bill was introduced—to mandate in writing various requirements of the companies that provide the critical infrastructure upon which our country relies. The hon. Member will remember the numerous witnesses who told us that board accountability was crucial. Some told us that in public and some in private. They are the people who are doing this job, and whom the Government are asking to do this job. That is why we should listen to them and why we will press the new clauses to a vote.

Emily Darlington (Milton Keynes Central) (Lab)

The new clauses raise a really important point about security by design implemented within companies, and within the companies that provide cyber-security technology to them. An hon. Friend of mine tabled an amendment, which we are not speaking about today, on a similar subject.

Security and safety by design is something that we talk about quite often in this area. It may not be appropriate for this Bill, but I am keen to hear how we will progress those discussions, because ultimately we do want to prevent cyber-attacks. We need to make sure that companies, small and medium-sized enterprises, major infrastructure and local government all have access to technology and infrastructure that looks at security by design in its own design right from the outset, because that is what makes us most secure.

How will we take forward those discussions, and extend the idea that already exists in legislation, through the Online Safety Act 2023, about safety by design, in order to ensure that products around cyber-security have this at their heart, and deliver the prevention mechanism that I think we all want to see—especially the small businesses and organisations that are victims of such attacks?

Dr Spencer

New clause 16 would require active board oversight of security and resilience measures and accountability for board members where they fail in those oversight duties, whereas new clause 17 would require regulated entities to carry out proportionate, periodic testing of the security and resilience of their network and information systems, and provide the results to regulatory bodies upon request.

On board accountability, as we have already discussed in this Committee, the existing regulatory model under NIS regulations has not been sufficiently effective in driving up cyber-resilience standards to meet emerging threats. Board engagement is a key part of that, but the stat I quoted previously in this Committee indicates that engagement is going in the wrong direction. What assessment has the Minister made of the potential advantages and disadvantages of direct accountability in the adoption of effective cyber-resilience measures, based on a roll-out of the NIS2 regulations?

Proportionate testing of systems may be a useful tool in detecting and managing cyber-security risk. What consideration has the Minister’s Department given to how that topic should be approached in the Secretary of State’s code of practice?

--- Later in debate ---
Kanishka Narayan

I thank the hon. Member for his point. I am also aware that the National Cyber Security Centre’s cyber assessment framework has very specific measures on appropriate testing as well. It already exists, and we want to make sure that it is an important part of specific security and resilience requirements in secondary legislation.

It is crucial that industry is consulted on the nature of any requirements related to testing. As mentioned, we intend to consult on the proposals later in the year. We will also issue a statement of strategic priorities for regulators, and will explore whether that is an appropriate vehicle for driving consistency in the behaviours of regulators in respect of their approach to testing for their sector.

Overall, any approach to going further on proportionate and regular testing must be developed alongside the full set of security and resilience requirements, and co-ordinated and communicated with a wider package of implementing measures. That will allow the impact of options to be assessed, and provide the industry with clarity on the overall approach, including how the components fit together.

The shadow Minister asked about the consideration of NIS2 requirements. We have looked at NIS2 provisions, and variability in member states’ implementation of it, as part of a wider set of considerations on which we will be consulting regarding secondary legislation on governance.

My hon. Friend the Member for Milton Keynes Central made an incredibly important point about security by design, which I very much take into account. The Government Digital Service is already working on a secure by design standard. We want to make sure that it is as robust as possible, and extend it across not just the public sector but parts of the private sector. I will make sure that security by design remains at the heart of the Government’s cyber action plan, as well as that of the private sector.

Emily Darlington

I thank the Minister for that commitment. Would he consider setting up a meeting between GDS and those MPs who have expertise in this area, so that we can share our expertise and reassure ourselves that this is going in the right direction and at the speed that is necessary?

Kanishka Narayan

My hon. Friend has extensive expertise, from which I benefit extensively. I will be keen to make sure that the Government Digital Service does so too.

In the light of those commitments, I kindly ask the hon. Member for Brecon, Radnor and Cwm Tawe not to press the new clauses.

Cyber Security and Resilience (Network and Information Systems) Bill (Fifth sitting)

Emily Darlington Excerpts
Emily Darlington (Milton Keynes Central) (Lab)

I have a few questions for the Minister. I appreciate the clarity that the Bill brings to many of the services in its scope. I would like to understand how the definition of “incidents” will relate to hardware vulnerabilities that are discovered within a company, as we heard from some of the people who gave evidence to the Committee. It is unclear in the Bill. Perhaps it will be further defined in secondary legislation.

I want to understand how an incident in which someone discovers a vulnerability in hardware—such as in a system-in-package—is reported, and how that information is then delivered by the regulator to other companies in the sector that may have similar technology, and to the other regulators, which may also want to flag that technology as a particular vulnerability. Is that defined as an “incident” or is it defined somewhere else in the Bill? I am a bit confused and am looking for some clarity.

Kanishka Narayan

Having been promoted from a position of mere confidence to faith, I will tackle questions from the hon. Member for Runnymede and Weybridge first and foremost. On the question of thresholds of incident, the Bill sets out the severity of the sorts of incidents that we expect reporting obligations to apply to, and at the same time it ensures that it is proportionate in understanding that sector-specific thresholds ought to be precisely that—sector specific, set closely with relevant entities in that sector, and working with the expertise of the relevant regulators. For that reason, it has not been specified more fully on the face of the Bill.

On information sharing, not only is there provision for the specific sets of purposes for which information sharing ought to take place between regulators, but there is a further check on the proportionality of that, through a particular requirement, to ensure that information that is shared in incident contexts is done precisely for the purposes set out in the Bill, and in a way that is proportionate.

My hon. Friend the Member for Milton Keynes Central raised the question of hardware impacts. While the focus of the Bill is primarily on network and information systems, the test, as I think of it, would look at whether any compromise in network and information systems related to a piece of hardware triggers the severity of the impact, or potential impact, to be reportable. In the event that it is reportable, in its severity and potential impact, it will require notification—to the regulator and, when customers are directly impacted in the way that is set out in the Bill, also to the customers. The test is focused on whether network and information systems are engaged, and whether the impact of any incident is likely to be severe enough, in light of the thresholds set out in the Bill.

--- Later in debate ---
Dr Spencer

Clause 18, which the Government seek to modify through amendments 14 to 18, creates new pathways for information sharing between regulators, public authorities and Government Departments. It also creates a power for NIS enforcement authorities to share information with relevant overseas authorities for specified purposes. The new regime is intended to remove gaps and ambiguities in the existing framework governing the sharing of information obtained in the course of competent authorities’ functions and the oversight role of the NCSC, and to create legal certainty in this domain.

In turn, it is anticipated that greater information sharing will assist with the detection of crime, enforcement activity and awareness of emerging cyber-risks and with ascertaining the effectiveness of the NIS regulations in building UK cyber-resilience. In particular, the Bill creates a new gateway to ensure that NIS regulators can share information with UK public authorities, and vice versa, as well as sharing and receiving information from organisations outside of the NIS framework, for example other regulators or bodies such as Companies House.

The Bill strengthens safeguards on how information can be used once it has been shared under the NIS regulations by restricting onward disclosure. More effective information sharing will be vital if competent authorities are to keep up to date with emerging risks and build resilience in their sectors, and the new measures were broadly welcomed by regulators in our oral evidence session.

However, industry bodies such as techUK have called for further detail on the new information-sharing regime. What steps are the Government taking to ensure that regulators share responsibility for protecting sensitive data, and that information-sharing processes are coherent, proportionate and secure? Could the Minister elaborate on the discussions he has had with regulators on those matters, and on how secure information sharing will work in practice?

Finally, on the detail of the text in Government amendment 14, proposed new paragraph (aa)(ii) refers to persons

“otherwise in connection with…any other matter relating to cyber security and resilience,”.

Given that this is an information-sharing power, that seems a remarkably broad “any other matter” provision. What disclosures that are not already covered in the Bill does the Minister conceive will come up in that scope? What guidance or consultation will the Minister produce to make sure that such powers are proportionate and not at risk of abuse?

Emily Darlington

Again, I welcome the Government amendments and clause 18; they are important to enabling us to share our vulnerabilities in an appropriate way with those people who may be involved. However, some of the aspects of those vulnerabilities that security services—GCHQ, His Majesty’s Government Communications Centre and others—raised with us relate particularly to not only foreign interference, but the potential for interference through technology embedded in our networks. How does the Minister see the measures working within our co-operation with different foreign nations, particularly during these volatile times?

Kanishka Narayan

In response to the shadow Minister’s first question about ensuring sensitive handling of shared information and proportionality, all information handled by regulators ought to be treated carefully and with awareness of its importance. The regulators have to act reasonably, and the NIS regulations specifically require information obtained from inspections to be held securely. Of course, data protection laws apply to regulators as well. Alongside that, regulators will be required to consider the relevance and proportionality of sharing their information to the purposes set out in the Bill; as I have mentioned, the Bill includes specific purposes for why information might be shared.

Cyber Security and Resilience (Network and Information Systems) Bill (Sixth sitting)

Emily Darlington Excerpts
Dr Spencer

Clause 43 grants the Secretary of State powers to issue directions to regulated entities where there is a risk to national security, or where an action must be taken in the interests of national security. Directions can include requirements relating to the management of systems, the yielding of information and the removal or modification of goods and services. The Secretary of State may also require a regulated entity to engage the services of a skilled person to comply with directions issued. The Secretary of State has wide discretion to dispense with providing reasons for directions or consulting with the affected parties on the basis of national security considerations.

Clause 44 clarifies that the Secretary of State’s directions under part 4 prevail if there is a conflict between those directions and another statutory requirement. The exercise of these powers by the Secretary of State could have far-reaching consequences for businesses, which may experience interruption to their commercial activities, as well as the potentially considerable time and expense in adhering to a request made on national security grounds.

I have spoken on several occasions in the House and in this Committee about the critical risks posed to our cyber-security and national security by hostile state actors and their affiliates. It is, of course, right that the Secretary of State should have this power, but it should be used only in extremis. Like other extensive powers granted to the Secretary of State under part 3, it must be subject to oversight and guardrails. A report to Parliament, which may well be redacted, on the exercise of functions under part 4 will not be sufficient to ensure that this power is used proportionately. Has the Department considered introducing an obligation for the Secretary of State to report to the Intelligence and Security Committee when she exercises powers under part 4?

We discussed the Chinese super-embassy earlier. Later in the Committee’s proceedings, I will talk about an Opposition new clause that would deal with that problem effectively.

Emily Darlington (Milton Keynes Central) (Lab)

As the Minister will be aware, I have spoken consistently of my concern about our reliance on hardware and tech that comes from potentially non-favourable state actors abroad. That also relates to Government procurement, which I have raised before, as the Minister will know.

The Committee has already discussed how local government and Government Departments are not covered by this legislation, and how there is a separate strategy and document. Can the Minister expand on how protections against a reliance on foreign tech within critical infrastructure, in either the private or the public sector, are being dealt with in the Bill or in the strategy that has been published for the public sector? How will that be continually reviewed as our global geopolitical situation remains unstable?

Kanishka Narayan

I will start by addressing amendment 27, moved by the hon. Member for Brecon, Radnor and Cwm Tawe, which would add to the non-exhaustive list of requirements that could be included in a national security direction. It specifies that a direction could include requirements to

“remove, disable or modify hardware, software or other facilities”.

I reassure him that the Bill, as currently drafted, allows the Secretary of State to impose those types of requirements. Clause 43(3)(f) specifies that a direction may include

“a requirement relating to removing, disabling or modifying goods or facilities or modifying services”.

That already encompasses the types of requirements specified in amendment 27.

Furthermore, clause 43(3) lists the requirements that may “in particular” be included in a direction. The list is therefore not exhaustive, and for good reason. It is not possible or desirable to specify every action that might be needed to address a national security risk. That would restrict the Government’s potential avenues to address urgent national security threats, and would risk the legislation being too narrow to address novel threats to the UK’s national security.

Cyber Security and Resilience (Network and Information Systems) Bill (First sitting)

Emily Darlington Excerpts
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Kanishka Narayan)

Q Thank you very much to both of you for your insights today. The question on my mind is related, in part, to the point that Jen raised. There are a range of levers at the Government’s disposal in thinking about and acting on cyber-security. I am interested in your thoughts on which parts of the economy ought to be in the scope of regulation and legislative measures, and where effective measures that sit outside of regulation and legislation—guidance being one from a range of non-regulatory measures—would be better suited.

Jen Ellis: Again, that is a hugely complex question to cover in a short amount of time. One of the challenges that we face in the UK is that ours is an economy of 99% small and medium-sized businesses. It is hard to think about how to place more burdens on small and medium businesses, what they can reasonably get done and what resources are available. That said, that is the problem that we have to deal with; we have to figure out how to make progress.

There is also a challenge here, in that we tend to focus a lot on the behaviour of the victim. It is understandable why—that is the side that we can control—but we are missing the middle piece. There are the bad guys, who we cannot control but who we can try to prosecute and bring to task; and there are the victims, who we can control, and we focus a lot on that—CSRB focuses on that side. Then there is the middle ground of enablers. They are not intending to be enablers, but they are the people who are creating the platforms, mediums and technology. I am not sure that we are where we could be in thinking about how to set a baseline for them. We have a lot of voluntary codes, which is fantastic—that is a really good starting point—but the question is how much value the voluntary approach has and how much behavioural change it actually drives. What you see is that the organisations that are already doing well and taking security seriously are following the voluntary codes because they were already investing, but there is a really long tail of organisations that are not.

Any policy approach, legislation or otherwise, comes down to the fact that you can build the best thing in the world, but you need a plan for adoption or the engagement piece—what it looks like to go into communities and see how people are wrestling with this stuff and the challenges that are blocking adoption. You also need to think about how to address and remove those challenges, and, where necessary, how to ensure appropriate enforcement, accountability and transparency. That is critical, and I am not sure that we see a huge amount of that at the moment. That is an area where there is potential for growth.

With CSRB, the piece around enforcement is going to be critical, and not just for the covered entities. We are also giving new authorities to the regulators, so what are we doing to say to them, “We expect you to use them, to be accountable for using them and to demonstrate that your sector is improving”? There need to be stronger conversations about what it looks like to not meet the requirements. We should be looking more broadly, beyond just telling small companies to do more. If we are going to tell small companies to do more, how do we make it something that they can prioritise, care about and take seriously, in the same way that health and safety is taken seriously?

David Cook: To achieve the outcome in question, given the practicalities of a supply chain on which smaller entities rely, I can see the benefit of bringing those small entities in scope, but the legislation could be rather more forthright on how the supply chain is dealt with on a contractual basis. In reality, we see that when a smaller entity tries to contract with a much larger entity—an IT outsourced provider, for example—it may find pushback if the contractual terms that it asks for would help it but are not required under legislation.

Where an organisation can rely on the GDPR, which has very specific requirements as to what contracts should contain, or the Digital Operational Resilience Act, which is a European financial services law and is very prescriptive as to what a contract must contain, any kind of entity doing deals and entering into a contract cannot really push back, because the requirements are set out in stone. The Bill does not have a similar requirement as to what a contract with providers might look like.

Pushing that requirement into the negotiation between, for example, a massive global IT outsourced provider and a much smaller entity means either that we will see piecemeal clauses that do not always achieve the outcomes you are after, or that we will not see those clauses in place at all because of the commercial reality. Having a similarly prescriptive set of requirements for what that contract would contain means that anybody negotiating could point to the law and say, “We have to have this in place, and there’s no wriggle room.” That would achieve the outcome you are after: those small entities would all have identical contracts, at least as a baseline.

Emily Darlington (Milton Keynes Central) (Lab)

Q I want to go back to basics and get a bit of insight from you. What cyber risks are businesses currently facing, and how do you feel the Bill addresses those risks?

David Cook: The original NIS regulations came out of a directive from 2016, so this is 10 years old now, and the world changes quickly, especially when it comes to technology. Not only is this supply chain vulnerability systemic, but it causes a significant risk to UK and global businesses. Ransomware groups, threat actors or cyber-criminals—however you want to badge that—are looking for a one-to-many model. Rather than going after each organisation piecemeal, if they can find a route through one organisation that leads to millions, they will always follow it. At the moment, they are out of scope.

The reality is that those organisations, which are global in nature, often do not pay due regard to UK law because they are acting all over the world and we are one of many jurisdictions. They are the threat vector that is allowing an attack into an organisation, but it then sits with the organisations that are attacked to deal with the fallout. Often, although they do not get away scot-free, they are outside legislative scrutiny and can carry on operating as they did before. That causes a vulnerability. The one-to-many attack route is a vulnerability, and at the moment the law is lacking in how it is equipped to deal with the fallout.

Jen Ellis: In terms of what the landscape looks like, our dialogue often has a huge focus on cyber-crime and we look a lot at data protection and that kind of thing. Last year, we saw the impact of disruptive attacks, but in the past few years we have also heard a lot more about state-sponsored attacks.

I do not know how familiar everyone in the room is with Volt Typhoon and Salt Typhoon; they were widespread nation-state attacks that were uncovered in the US. We are not immune to such attacks; we could just as easily fall victim to them. We should take the discovery of Volt Typhoon as a massive wake-up call to the fact that although we are aware of the challenge, we are not moving fast enough to address it. Volt Typhoon particularly targeted US critical infrastructure, with a view to being able to massively disrupt it at scale should a reason to do so arise. We cannot have that level of disruption across our society; the impacts would be catastrophic.

Part of what the CSRB is looking to do is to take NIS and update it to make sure that it is covering the relevant things, but I also hope that we will see a new level of urgency and an understanding that the risks are very prevalent and are coming from different sources with all sorts of different motivations. There is huge complexity, which David has spoken to, around the supply chain. We really need to see the critical infrastructure and the core service providers becoming hugely more vigilant and taking their role as providers of a critical service very seriously when it comes to security. They need to think about what they are doing to be part of the solution and to harden and protect the UK against outside interference.

David Cook: By way of example, NIS1 talks about reporting to the regulator if there is a significant impact. What we are seeing with some of the attacks that Jen has spoken about is pre-positioning, whereby a criminal or a threat actor sits on the network and the environment and waits for the day when they are going to push the big red button and cause an attack. That is outside NIS1: if that sort of issue were identified, it would not be reportable to the regulator. The regulator would therefore not have any visibility of it.

NIS2 and the Bill talk about something being identified that is caused by, or is capable of causing, severe operational disruption. That widens the ambit of visibility and allows the UK state, as well as regulators, to understand what is going on in the environment more broadly, because if there are trends—if a number of organisations report to a regulator that they have found that pre-positioning—they know that a malicious actor is planning something. The footprints are there.

Freddie van Mierlo (Henley and Thame) (LD)

Q I want to take a step back and ask a broader question about why this legislation is necessary. I think we agree that it is, but why are companies not already adhering to very high cyber-security standards? Surely it is in their commercial interests to do so; last year we saw the massive impact on JLR, M&S and the Co-op of failing to do so. Why might the state need to step in and mandate that companies be cyber-secure?

Jen Ellis: You have covered a lot of territory there; I will try to break it down. If you look at the attacks last year, all the companies you mentioned were investing in cyber-security. There is a difficulty here, because there is no such thing as being bullet-proof or secure. You are always trying to raise the barriers as high as you can and make it harder for attackers to be successful. The three attacks you mentioned were highly targeted attacks. The example of Volt Typhoon in the US was also highly targeted. These are attackers who are highly motivated to go after specific entities and who will keep going until they get somewhere. It is really hard to defend against stuff like that. What you are trying to do is remove the chances of all the opportunistic stuff happening.

So, first, we are not going to become secure as such, but we are trying to minimise the risk as much as possible. Secondly, it is really complex to do it; we saw last year the examples of companies that, even though they had invested, still missed some things. Even in the discussions that they had had around cyber-insurance, they had massively underestimated the cost of the level of disruption that they experienced. Part of it is that we are still trying to figure out how things will happen, what the impacts will be and what that will look like in the long term.

There is also a long tail of companies that are not investing, or not investing enough. Hopefully, this legislation will help with that, but more importantly, you want to see regulators engaging on the issue, talking to the entities they cover and going on a journey with them to understand what the risks are and where they need to get to. If you are talking about critical providers and essential services, it is really hard for an organisation—in its own mind or in being answerable to its board or investors—to justify spend on cyber-security. If you are a hospital saying that you are putting money towards security programmes rather than beds or diagnostics, that is an incredibly difficult conversation to have. One of the good things about CSRB, hopefully, is that it will legitimise choices and conversations in which people say, “Investing time and resources into cyber-security is investing time and resources into providing a critical, essential service, and it is okay to make those pay-off choices—they have to be made.”

Part of it is that when you are running an organisation, it is so hard to think about all the different elements. The problem with cyber-security—we need to be clear about this—is that with a lot of things that we ask organisations to do, you say, “You have to make this investment to get to this point,” and then you move on. So they might take a loan, the Government might help them in some way, or they might deprioritise other spending for a set period so that they can go and invest in something, get up to date on something or build out something; then they are done, and they can move back to a normal operating state.

Security is not that. It is expensive, complex and multifaceted. We are asking organisations of all sizes in the UK, many of which are not large, to invest in perpetuity. We are asking them to increase investment over time and build maturity. That is not a small ask, so we need to understand that there are very reasonable dynamics at play here that mean that we are not where we need to be. At the same time, we need a lot more urgency and focus. It is really important to get the regulators engaged; get them to prioritise this; have them work with their sectors, bring their sectors along and build that maturity; and legitimise the investment of time and resources for critical infrastructure.

--- Later in debate ---
Chris Vince

Q Thank you for coming along. Chris has touched on this already, but the Government’s impact assessment of the Bill said that the UK was falling behind its international partners. You all have experience of working globally. Could you comment on that and whether you agree with it?

Matt Houlihan: I am very happy to. Two main comparators come to mind. One is the EU, and we have talked quite a bit about NIS2 and the progress that has made. NIS2 takes a slightly different approach from that of the UK Government, in that it outlines, I think, 18 different sectors, up from seven under NIS1, so it has a notably wide scope.

Although NIS2 is an effective piece of legislation, its implementation remains patchy across the EU. Something like 19 of the 27 EU member states have implemented it in their national laws to date. There is clearly a bit of work still to do there. There is also some variation in how NIS2 is being implemented, which we feel as an international company operating right across the European Union. As has been touched on briefly, there is now a move, through what are called omnibus proposals, to simplify the reporting requirements and other elements of cyber-security and privacy laws across the EU, which is a welcome step.

I mentioned in a previous answer the work that Australia has been doing, and the Security of Critical Infrastructure Act 2018—SOCI—was genuinely a good standard and has set a good bar for expectations around the world. The Act has rigorous reporting requirements and caveats and guardrails for Government step-in powers. It also covers things like ransomware, which we know the UK Home Office is looking at, and Internet of Things security, which the UK Government recently looked at. Those are probably the two comparators. We hope that the CSRB will take the UK a big step towards that, but as a lot of my colleagues have said, there is a lot of work to do in terms of seeing the guidance and ensuring that it is implemented effectively.

Chris Anley: On the point about where we are perhaps falling behind: on streamlining of reporting, we have already mentioned Australia and the EU, where work is in progress. On protection of defenders, other territories are already benefiting from those protections—the EU and the US, and I mentioned Portugal especially. As a third and final point, Australia is an interesting one, as it is providing a cyber-safety net to small and medium-sized enterprises, offering cyber expertise from the Government to enable smaller entities to get up to code and achieve resilience where they lack the personnel and funding.

Emily Darlington

Q A huge thank you to the panel. Many of my colleagues have already asked the questions I had, so I appreciate you talking about futureproofing for quantum, the international regulatory environment and the use of standards alongside regulation to drive up quality. You all have a huge number of UK clients, and I want to ask you how good cyber culture gets embedded, and what the role of the Bill is within that. To pick up on Ben's point about security by design within his own firm, do you think that is well understood among your colleagues in the UK? How do we get the balance right between what is in the regulation and what should be done through a standards model, working with the British Standards Institution and others?

Dr Ian Levy: The previous set of witnesses talked about board responsibility around cyber-security. In my experience, whether a board is engaged or not is a proxy indicator for whether they are looking at risk management properly, and you cannot change corporate culture through regulation—not quickly. There is something to be done around incentives to ensure that companies are really looking at their responsibilities across cyber-security. As the previous panellists have said, this is not just a technical thing.

One of the things that is difficult to reconcile in my head—and always has been—is trying to levy national security requirements on companies that are not set up to do that. In this case I am not talking about Amazon Web Services, because AWS invests hugely in security. We have a default design principle around ensuring that the services are secure and private by design. But something to consider for the Bill is not accidentally putting national security requirements on those entities that cannot possibly meet them.

When I was in government, in the past we accidentally required tiny entities, which could not possibly do so, to defend themselves against the Russians in cyber-space. If you translate that to any other domain—for example, saying that a 10-person company should defend itself against Russian missiles—it is insane, yet we do it in cyber-space. Part of the flow-down requirements that we see for contracting, when there is a Bill like this one, ends up putting those national security requirements on inappropriate entities. I really think we need to be careful how we manage that.

Matt Houlihan: Can I make two very quick points?

The Chair

Very briefly—yes.

Matt Houlihan: My first point is on the scale of the challenge. Cisco released a cyber-security readiness index, a survey of 8,000 companies around the world, including in the UK, in which we graded companies by their cyber maturity. In the UK, 8% of companies—these are large companies—were in the mature bracket, which shows the scale of the challenge.

The other point I want to make relates to its being a cyber-security and resilience Bill, and the “resilience” bit is really important. We need to focus on what that means in practice. There are a lot of cyber measures that we need to put in place, but resilience is about the robustness of the technology being used, as well as the cyber-security measures, the people and everything else that goes with it. Looking at legacy technology, for example—obsolete technology, which is more at risk—should also be part of the standards and, perhaps, the regulatory guidance that is coming through. I know that the public sector is not part of the Bill, but I mention the following to highlight the challenge: over a year ago, DSIT published a report that showed, I think, that 28% of Government systems were in the legacy, unsupported, obsolete bracket. That highlights the nature of the challenge in this space.

Cyber Security and Resilience (Network and Information Systems) Bill (Second sitting)

Emily Darlington Excerpts
David Chadwick

Q Do you know how you would do that information sharing at the moment?

Ian Hulme: As we have already explained, the current regs do not allow us to share the information, which is a bit of a barrier for us. In the future, certainly, we will be working together to try to figure it out. I think that there is also a role for DSIT in that.

Natalie Black: First, we currently have a real problem in that information sharing is much harder than it should be. The Bill makes a big difference in addressing that point, not only among ourselves but with DSIT and NCSC. Secondly, we think that there is an opportunity to improve information reporting, particularly incident reporting, and we would welcome working with DSIT and others—I have mentioned the Digital Regulation Cooperation Forum—to help us find a way to make it easier for industry, because the pace at which we need to move means that we want to ensure that there is no unnecessary rub in the system.

Emily Darlington (Milton Keynes Central) (Lab)

Q I have a question for Ian Hulme. In your role at the ICO, you are clearly looking at data security. Data is obviously one of the main goals of cyber-attacks, data issues cut across every sector, and you are looking at a really broad range of data, from individual identifiers to names, addresses, bank accounts or whatever it might be. This could happen in any sector. How does the Bill give you additional powers to take action, particularly on attacks co-ordinated through AI or by foreign actors, and do you think it is sufficient for what you feel we will be facing in the next five years?

Ian Hulme: We need to think about this as essentially two different regimes. The requirements under data protection legislation to report a data breach are well established, and we have teams, systems and processes that manage all that. There are some notable cases that have been in the public domain in recent months where we have levied fines against organisations for data breaches.

The first thing to realise is that we are still talking about only quite a small sub-sector—digital service providers, including cloud computing service providers, online marketplaces, search engines and, when they are eventually brought into scope, MSPs. A lot of MSPs provide services for a lot of data controllers, so, as I explained, improving the resilience and security of information networks should help to make data more secure in the future.

Lincoln Jopp (Spelthorne) (Con)

Q One of my favourite aphorisms is, “Institutions get the behaviours they reward.” This morning we heard a plea from Amazon Web Services that, when a regulator deals with a company in the event of a cyber-security attack, it should remember that it is dealing with a victim.

I have dealt with the ICO before. Maybe it was the company that I worked in and led, but there was a culture there that, if you had a data breach, you told the ICO. There was no question about it. How are you going to develop your reactions and the behaviours you reward in order to encourage a set of behaviours and cultures of openness within the corporate sector, bearing in mind that, as was said this morning, by opening that door, companies could be opening themselves up to a hefty fine?

Stuart Okin: In the energy sector, we have that culture. It is one of safety and security, and the chief executives and the heads of security really lean into it and understand that particular space. There are many different forums where they communicate and share that type of information with each other and with us. Incident response is really the purview of DESNZ rather than us, but they will speak to us about that from a regulatory perspective.

Ian Hulme: From the ICO’s perspective, we receive hundreds of data-breach reports. The vast majority of those are dealt with through information and guidance to the impacted organisation. Only a very small number go through to enforcement activity, and only in the most serious cases—where failures are so egregious that, from a regulatory perspective, it would be a failure on our part not to take action.

I anticipate that is the approach we will take in the future when dealing with the incident reporting regime that the Bill sets out. Our first instinct would be to collaborate with organisations. Only in the most egregious cases would I imagine that we would look to exercise the full range of our powers.

Natalie Black: From Ofcom’s point of view, we have a long history, particularly in the telecoms sector, of dealing with a whole range of incidents, but I certainly hear your point about the victim. When I have personally dealt with some of these incidents, often you are dealing with a chief executive who has woken up that morning to the fact that they might lose their job and they have very stressed-out teams around them. It is always hard to trust the initial information that is coming out because no one really knows what is going on, certainly for the first few hours, so it is the maturity and experience that we would want to bring to this expanded role when it comes to data centres.

Ultimately, the best regulatory relationships I have seen are those where there is a lot of trust and openness—where a regulator is not going to overreact, really understands what is going on and is very purposeful about what it is trying to achieve. From Ofcom’s point of view, it is always about protecting consumers and citizens, particularly with one eye on security, resilience and economic growth. The experience we have had over the years means that we can come to those conversations with a lot of history, a lot of perspective and, to be honest, a bit of sympathy, because sometimes those moments are very difficult for everyone involved.

--- Later in debate ---
Tim Roca

Q From the other perspective—I am thinking about a UK Government in the future overreaching—do you think there is any risk from this legislation?

Chung Ching Kwong: It is always a double-edged sword when it comes to regulating against threats. The more that the Secretary of State or the Government are allowed to go into systems and hold powers to turn off, or take over, certain things, the more there is a risk that those powers will be abused, to a certain extent, or cause harm unintentionally. There is always a balance to be struck between giving more protection to privacy for ordinary users and giving power to the Government so that they can act. Obviously, for critical infrastructure like the power grid and water, the Government need control over those things, but for communications and so on, there is, to a certain extent, a question about what the Government can and cannot do. But personally I do not see a lot of concerns in the Bill.

Emily Darlington

Q I want to move from software to hardware that is particularly vulnerable to potential cyber-attack, particularly from the integration of Chinese tech into SIPs, possibly making them vulnerable to cyber-attack by someone who knows the code into those bits of hardware. Should we be doing more to protect against that vulnerability? Should that be covered by the Bill?

Chung Ching Kwong: It should definitely be covered by the Bill, because if we are not regulating to protect hardware as well, we will get hardware that arrives with an attack already embedded. An example in the context of China is the Lenovo Superfish scandal in 2015, in which pre-installed adware hijacked the HTTPS certificate—the thing that is there to protect your communication with a website, so that nobody can see the activity between you and that website. The Superfish injection made that communication transparent, and that was done before the product even came out of the factory. This is not a problem that a software solution can fix. If you were sourcing a Lenovo laptop, for example, the laptop, on arrival, would already be a security breach, and a privacy breach in that sense. We should definitely take it a step further and regulate hardware as well, because a lot of the time that is what state-sponsored attacks target as an attack surface.
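The certificate-hijacking mechanism described in that evidence can be sketched briefly. The following is a hypothetical Python check—the function names and the trusted-issuer allowlist are illustrative, not a real tool—that flags a TLS certificate signed by an issuer outside an expected set, which is the visible symptom of a Superfish-style interception proxy re-signing every site with its own root CA:

```python
# Hypothetical sketch: detect a Superfish-style interception proxy by checking
# whether the certificate a host presents was signed by an issuer we expect.
# A MITM proxy such as Superfish re-signs every site with its own root CA, so
# the issuer name is the giveaway. Certificate dicts here follow the shape
# returned by Python's ssl.SSLSocket.getpeercert().

def issuer_common_name(cert: dict) -> str:
    """Pull the issuer commonName out of a getpeercert()-style dict."""
    for rdn in cert.get("issuer", ()):        # issuer is a tuple of RDNs
        for key, value in rdn:                # each RDN is (key, value) pairs
            if key == "commonName":
                return value
    return ""

def looks_intercepted(cert: dict, trusted_issuers: set) -> bool:
    """True if the certificate's issuer is not on our allowlist."""
    return issuer_common_name(cert) not in trusted_issuers

# The Superfish CA presented itself under its own name, so a synthetic
# certificate in that shape is flagged immediately:
fake_cert = {"issuer": ((("commonName", "Superfish, Inc."),),)}
print(looks_intercepted(fake_cert, {"DigiCert Global Root CA"}))  # True
```

In practice, pre-installed interception defeats even this kind of check when the inspection tool runs on the compromised machine itself, which is the witness's point: the fix cannot be purely software.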

The Chair

That brings us nicely to the end of the time allotted for the Committee to ask questions. On behalf of the Committee, I thank our witness for her evidence.

Examination of Witness

Professor John Child gave evidence.

--- Later in debate ---
Dr Allison Gardner

Q I am just thinking that if you are putting liability on someone, you need to make sure that they can apply the regulation in a simple and effective manner and ensure that it is enforced, so they do not carry the full burden of liability.

Richard Starnes: True, but I would submit that under the Companies Act that liability is already there for all the directors; it just has not been used that way.

Emily Darlington

Q I note your interest in how the Bill will affect smaller businesses. There is not much detail in the Bill, but how do you think the code of practice could create an environment that lifts everyone’s security up without prescribing too great a burden?

Richard Starnes: You just stepped on one of my soapbox issues. I would like to see the code of practice become part of the annual Companies House registrations for every registered company. To me, this is an attestation that, “We understand cyber-security, we’ve had it put in front of us, and we have to address it in some way.”

One of the biggest problems, which Andy talked about earlier, is that we have all these wonderful things that the Government are doing with regard to cyber-security, down to the micro-level companies, but there are 5.5 million companies in the United Kingdom that are not enterprise-level companies, and the vast majority of them have 25 employees or fewer. How do we get to these people and say, “This is important. You need to look at this”? This is a societal issue. The code of practice and having it registered through Companies House are the way to do that. We need to start small and move big. Only 3% of businesses are involved in Cyber Essentials, which is just that: the essentials. It is the baseline, so we need to start there.

David Chadwick

Q We have heard concerns about definitions, particularly regarding incident reporting. What are your observations on the Bill as it stands, and those definitions?

Richard Starnes: Throughout my career, I have been involved in cyber incidents from just about day one. One of the biggest problems that you run into in the first 72 hours, for example, is actually determining whether you have been breached. Just because it looks bad does not mean it is bad. More often than not, you have had indicators of compromise, and you have gone through the entire chain, which has taken you a day, or maybe two or three days, of very diligent work with very clever people to determine that, no, you have not been breached; it was a false positive that was difficult to track down. Do you want to open the door to a regulator coming in, only to find out it is a false positive?

You are also going to have a very significant problem with the volume of alerts that you get under a 24-hour notification requirement, because there is going to be an abundance of caution, particularly with new legislation. Everybody and his brother is going to be saying, “We think we’ve got a problem.” Alternatively, if they do not, then you have a different issue.

Social Media: Non-consensual Sexual Deepfakes

Emily Darlington Excerpts
Monday 12th January 2026

Commons Chamber
Emily Darlington (Milton Keynes Central) (Lab)

I thank the Secretary of State for her absolutely clear message that what X is doing, through the use of Grok, is illegal. That is as much the platform’s responsibility as it is the user’s. I am afraid that there is less confidence in Ofcom’s ability to enforce the Online Safety Act as it stands, or in the improvements being made. Does she agree with the many people across the country who believe that we need to see real action from Ofcom by the end of this week, or we will judge Ofcom’s leadership as failing the British public?

Liz Kendall

My hon. Friend is a powerful champion on this issue. I am a feminist; I believe in deeds, not words. The deeds and the action will provide the proof that the very tough legislation already in place must be implemented—British rule of law. Ofcom needs to act, and swiftly.

AI Safety

Emily Darlington Excerpts
Wednesday 10th December 2025

Westminster Hall

Emily Darlington (Milton Keynes Central) (Lab)

It is a pleasure to serve under your chairship, Ms Butler. I thank the hon. Member for Dewsbury and Batley (Iqbal Mohamed) for securing this important debate.

It would be remiss of me, as the MP for Milton Keynes Central, not to acknowledge the opportunities of AI. One in three jobs in Milton Keynes is in tech, often in the edge technologies or edge AIs that are driving the economic growth we want. However, we will not see take-up across businesses unless we have the safest AI, so we must listen to the British Standards Institution, which is located in Milton Keynes and is working on standards for some of these things.

Nevertheless, I have many concerns. The Molly Rose Foundation has raised many issues around AI chatbots, not all of which are covered by current legislation. It has documented how Alexa instructed a 10-year-old to touch a live electrical wire, and how Snapchat’s My AI told a 13-year-old how to lose their virginity to a 31-year-old—luckily, it was actually an adult posing as a 13-year-old. We have seen other examples involving suicide, and of chatbots claiming that Hitler had the answers to climate change, and research has found that many children are unable to realise that chatbots are not human. AI algorithms also shadow-ban women and content about women’s health, as others have mentioned.

The tech is there to make AI safe, but there is little incentive for companies to do so at the moment. The Online Safety Act goes some way, but not far enough. Our priorities must be to tackle the creativity and copyright issues; deepfakes and the damage they do, in particular, to young girls and women; and the misinformation and disinformation that is being spread and amplified by algorithms because it keeps people online longer, making companies money. We must also protect democracy, children, minorities and women.

How do we do that? I hope the Minister is listening. For me, it is about regulation and standards—standards are just as important as regulation—and transparency. The Science, Innovation and Technology Committee has called for transparency on AI algorithms and AI chatbots, but we have yet to see real transparency. We must also have more diversity in tech—I welcome the Secretary of State’s initiatives on that—and, finally, given the world we are in, we must have a clear strategy for the part that sovereignty in AI plays in our security and our economic future.

Dawn Butler (in the Chair)

Order. I would like to try to allow two minutes at the end for the Member in charge to wind up the debate. Will the Front Benchers take that into account, please?

--- Later in debate ---
Kanishka Narayan

My hon. Friend brings deep expertise from her past career. If she feels there are particular absences in the legislation on equalities, I would be happy to take a look, though none has been pointed out to me to date.

The Online Safety Act 2023 requires platforms to manage harmful and illegal content risks, and offers significant protection against harms online, including those driven by AI services. We are supporting regulators to ensure that those laws are respected and enforced. The AI action plan commits to boosting AI capabilities through funding, strategic steers and increased public accountability.

There is a great deal of interest in the Government’s proposals for new cross-cutting AI regulation, not least shown compellingly by my right hon. Friend the Member for Oxford East (Anneliese Dodds). The Government do not speculate on legislation, so I am not able to predict future parliamentary sessions, although we will keep Parliament updated on the timings of any consultation ahead of bringing forward any legislation.

Notwithstanding that, the Government are clearly not standing still on AI governance. The Technology Secretary confirmed in Parliament last week that the Government will look at what more can be done to manage the emergent risks of AI chatbots, raised by my hon. Friend the Member for York Outer (Mr Charters), my right hon. Friend the Member for Oxford East, my hon. Friend the Member for Milton Keynes Central and others.

Alongside those comments, the Technology Secretary urged Ofcom to use its existing powers to ensure that AI chatbots in scope of the Act are safe for children. Further to the clarifications I have provided previously across the House, if hon. Members have a particular view on where there are exemptions or gaps in the Online Safety Act on AI chatbots that correlate with risk, we would welcome any contribution through the usual correspondence channels.

Emily Darlington

Will the Minister give way?

Kanishka Narayan

I have about two minutes, so I will continue the conversation with my hon. Friend outside.

We will act to ensure that AI companies make their own products safe. For example, the Government are tackling the disgusting harm of child sexual exploitation and abuse with a new offence to criminalise AI models that have been optimised for that purpose. The AI Security Institute, which I was delighted to hear praised across the House, works with AI labs to make their products safer and has tested over 30 models at the frontier of development. It is uniquely the best in the world at developing partnerships, understanding security risks and innovating safeguards. Findings from AISI testing are used to strengthen model safeguards in partnership with AI companies, improving safety in areas such as cyber-tasks and biological weapon development.

The UK Government do not act alone on security. In response to the points made by the hon. Members for Ceredigion Preseli (Ben Lake), for Harpenden and Berkhamsted, and for Runnymede and Weybridge, it is clear that we are working closely with allies to raise security standards, share scientific insights and shape responsible norms for frontier AI. We are leading discussions on AI at the G7, the OECD and the UN. We are strengthening our bilateral relationships on AI for growth and security, including AI collaboration as part of recent agreements with the US, Germany and Japan.

I will take the points raised by the hon. Members for Dewsbury and Batley, for Winchester (Dr Chambers) and for Strangford, and by my hon. Friend the Member for York Outer, on health advice and how we can ensure that the quality of NHS advice is privileged in wider AI chatbot engagement. I will also look further at the important points made by my hon. Friend the Member for Congleton and my right hon. Friend the Member for Oxford East on British Sign Language standards in AI.

To conclude, the UK is realising the opportunities of transformative AI while ensuring that growth does not come at the cost of security and safety. We do this by stimulating AI safety assurance markets, empowering our regulators, ensuring our laws are fit for purpose, and driving change through AISI and diplomacy.

Digital ID

Emily Darlington Excerpts
Monday 13th October 2025


Commons Chamber
Liz Kendall

We absolutely will not. If the hon. Gentleman would like to write to me with more detail about areas and groups of people in his constituency who are digitally excluded, I will make a commitment to doing everything possible to tackle that problem.

Emily Darlington (Milton Keynes Central) (Lab)

I think that many Members have fundamentally misunderstood the proposal. It is actually about putting power in the hands of the citizen, not the state. The state already holds this information; digital ID will allow citizens to access it. On fraud, £11.4 billion was lost in scams last year, and £1.8 billion per year is lost due to identity theft. Does the Secretary of State see a role for digital ID in cracking down on the growing problem of fraud and identity theft?

Liz Kendall

I absolutely do. The countries that have introduced digital ID have found that it helps to tackle fraud. People can lose physical forms of identity, which can then be used by other people. The scheme will help to tackle that problem, as well as making services more effective and efficient.

Oral Answers to Questions

Emily Darlington Excerpts
Wednesday 25th June 2025


Commons Chamber
Chris Bryant

One really important part of the industrial strategy we published on Monday and the sector plans within it is that we identified a problem many people in the UK face, which is that they have a really good idea but cannot take it to market because they do not have access to finance, in particular to capital, unless they are in London—and sometimes unless they are a man. We want to change all that, which is why we have said categorically that we are giving the British Business Bank much more significant power to be able to invest in these sectors. That will mean we are a powerhouse in precisely the way the hon. Member wants.

Emily Darlington (Milton Keynes Central) (Lab)

7. What assessment he has made of the potential impact of funding for health science and innovation on the UK’s global influence.

The Minister for Data Protection and Telecoms (Chris Bryant)

From the development of vaccines to the discovery of the structure of DNA, British medical innovation has played a fundamental role in changing the lives of people globally and extending the UK’s global influence. Our industrial strategy and forthcoming life sciences sector plan will put the UK at the very centre of global efforts.

Emily Darlington

As the Minister will know, Gavi and the Global Fund not only provide global vaccine programmes and programmes to save lives from malaria and HIV, but provide us with biosecurity and jobs in the UK, not least over 500 research and development jobs and funding for the Institute of Tropical Medicine. What assessment has he made of the impact were the UK to reduce its efforts in that regard?

Chris Bryant

Gavi, the Vaccine Alliance, is absolutely essential, not only for other countries in the world, where we have managed to save many lives by introducing vaccines, but for UK innovation. We are fully committed to Gavi. We will be producing our life sciences sector plan soon, and we want to celebrate the sector, which represents 6,800 businesses and £100 billion of turnover every year.