(2 weeks, 4 days ago)
Public Bill Committees
Kanishka Narayan
I thank the hon. Member for his point. I am also aware that the National Cyber Security Centre’s cyber assessment framework has very specific measures on appropriate testing as well. It already exists, and we want to make sure that it is an important part of specific security and resilience requirements in secondary legislation.
It is crucial that industry is consulted on the nature of any requirements related to testing. As mentioned, we intend to consult on the proposals later in the year. We will also issue a statement of strategic priorities for regulators, and will explore whether that is an appropriate vehicle for driving consistency in the behaviours of regulators in respect of their approach to testing for their sector.
Overall, any approach to going further on proportionate and regular testing must be developed alongside the full set of security and resilience requirements, and co-ordinated and communicated with a wider package of implementing measures. That will allow the impact of options to be assessed, and provide the industry with clarity on the overall approach, including how the components fit together.
The shadow Minister asked about the consideration of NIS2 requirements. We have looked at NIS2 provisions, and variability in member states’ implementation of it, as part of a wider set of considerations on which we will be consulting regarding secondary legislation on governance.
My hon. Friend the Member for Milton Keynes Central made an incredibly important point about security by design, which I very much take into account. The Government Digital Service is already working on a secure by design standard. We want to make sure that it is as robust as possible, and extend it across not just the public sector but parts of the private sector. I will make sure that security by design remains at the heart of the Government’s cyber action plan, as well as that of the private sector.
Emily Darlington
I thank the Minister for that commitment. Would he consider setting up a meeting between GDS and those MPs who have expertise in this area, so that we can share our expertise and reassure ourselves that this is going in the right direction and at the speed that is necessary?
Kanishka Narayan
My hon. Friend has extensive expertise, from which I benefit extensively. I will be keen to make sure that the Government Digital Service does so too.
In the light of those commitments, I kindly ask the hon. Member for Brecon, Radnor and Cwm Tawe not to press the new clauses.
(1 month ago)
Public Bill Committees
Emily Darlington (Milton Keynes Central) (Lab)
I have a few questions for the Minister. I appreciate the clarity that the Bill brings to many of the services in its scope. I would like to understand how the definition of “incidents” will relate to hardware vulnerabilities that are discovered within a company, as we heard from some of the people who gave evidence to the Committee. It is unclear in the Bill. Perhaps it will be further defined in secondary legislation.
I want to understand how an incident in which someone discovers a vulnerability in hardware—such as in a system-in-package—is reported, and how that information is then delivered by the regulator to other companies in the sector that may have similar technology, and to the other regulators, which may also want to flag that technology as a particular vulnerability. Is that defined as an “incident” or is it defined somewhere else in the Bill? I am a bit confused and am looking for some clarity.
Kanishka Narayan
Having been promoted from a position of mere confidence to faith, I will tackle questions from the hon. Member for Runnymede and Weybridge first and foremost. On the question of thresholds of incident, the Bill sets out the severity of the sorts of incidents that we expect reporting obligations to apply to, and at the same time it ensures that it is proportionate in understanding that sector-specific thresholds ought to be precisely that—sector specific, set closely with relevant entities in that sector, and working with the expertise of the relevant regulators. For that reason, it has not been specified more fully on the face of the Bill.
On information sharing, not only is there provision for the specific sets of purposes for which information sharing ought to take place between regulators, but there is a further check on the proportionality of that, through a particular requirement, to ensure that information that is shared in incident contexts is done precisely for the purposes set out in the Bill, and in a way that is proportionate.
My hon. Friend the Member for Milton Keynes Central raised the question of hardware impacts. While the focus of the Bill is primarily on network and information systems, the test, as I think of it, would look at whether any compromise in network and information systems related to a piece of hardware triggers the severity of the impact, or potential impact, to be reportable. In the event that it is reportable, in its severity and potential impact, it will require notification—to the regulator and, when customers are directly impacted in the way that is set out in the Bill, also to the customers. The test is focused on whether network and information systems are engaged, and whether the impact of any incident is likely to be severe enough, in light of the thresholds set out in the Bill.
Emily Darlington
Again, I welcome the Government amendments and clause 18; they are important to enabling us to share our vulnerabilities in an appropriate way with those people who may be involved. However, some of the aspects of those vulnerabilities that security services—GCHQ, His Majesty’s Government Communications Centre and others—raised with us relate particularly to not only foreign interference, but the potential for interference through technology embedded in our networks. How does the Minister see the measures working within our co-operation with different foreign nations, particularly during these volatile times?
Kanishka Narayan
In response to the shadow Minister’s first question about ensuring sensitive handling of shared information and proportionality, all information handled by regulators ought to be treated carefully and with awareness of its importance. The regulators have to act reasonably, and the NIS regulations specifically require information obtained from inspections to be held securely. Of course, data protection laws apply to regulators as well. Alongside that, regulators will be required to consider the relevance and proportionality of sharing their information to the purposes set out in the Bill; as I have mentioned, the Bill includes specific purposes for why information might be shared.
(1 month ago)
Public Bill Committees
Emily Darlington (Milton Keynes Central) (Lab)
As the Minister will be aware, I have spoken consistently of my concern about our reliance on hardware and tech that comes from potentially non-favourable state actors abroad. That also relates to Government procurement, which I have raised before, as the Minister will know.
The Committee has already discussed how local government and Government Departments are not covered by this legislation, and how there is a separate strategy and document. Can the Minister expand on how protections against a reliance on foreign tech within critical infrastructure, in either the private or the public sector, are being dealt with in the Bill or in the strategy that has been published for the public sector? How will that be continually reviewed as our global geopolitical situation remains unstable?
Kanishka Narayan
I will start by addressing amendment 27, moved by the hon. Member for Brecon, Radnor and Cwm Tawe, which would add to the non-exhaustive list of requirements that could be included in a national security direction. It specifies that a direction could include requirements to
“remove, disable or modify hardware, software or other facilities”.
I reassure him that the Bill, as currently drafted, allows the Secretary of State to impose those types of requirements. Clause 43(3)(f) specifies that a direction may include
“a requirement relating to removing, disabling or modifying goods or facilities or modifying services”.
That already encompasses the types of requirements specified in amendment 27.
Furthermore, clause 43(3) lists the requirements that may “in particular” be included in a direction. The list is therefore not exhaustive, and for good reason. It is not possible or desirable to specify every action that might be needed to address a national security risk. That would restrict the Government’s potential avenues to address urgent national security threats, and would risk the legislation being too narrow to address novel threats to the UK’s national security.
(1 month, 1 week ago)
Public Bill Committees
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Kanishka Narayan)
Q
Jen Ellis: Again, that is a hugely complex question to cover in a short amount of time. One of the challenges that we face in the UK is that ours is a 99% small and medium-sized business economy. It is hard to think about how to place more burdens on small and medium businesses, what they can reasonably get done and what resources are available. That said, that is the problem that we have to deal with; we have to figure out how to make progress.
There is also a challenge here, in that we tend to focus a lot on the behaviour of the victim. It is understandable why—that is the side that we can control—but we are missing the middle piece. There are the bad guys, whom we cannot control but whom we can try to prosecute and bring to task; and there are the victims, whom we can control, and we focus a lot on that—CSRB focuses on that side. Then there is the middle ground of enablers. They are not intending to be enablers, but they are the people who are creating the platforms, mediums and technology. I am not sure that we are where we could be in thinking about how to set a baseline for them. We have a lot of voluntary codes, which is fantastic—that is a really good starting point—but the question is how much value the voluntary approach has and how much behavioural change it actually requires. What you see is that the organisations that are already doing well and taking security seriously are following the voluntary codes because they were already investing, but there is a really long tail of organisations that are not.
Any policy approach, legislation or otherwise, comes down to the fact that you can build the best thing in the world, but you need a plan for adoption or the engagement piece—what it looks like to go into communities and see how people are wrestling with this stuff and the challenges that are blocking adoption. You also need to think about how to address and remove those challenges, and, where necessary, how to ensure appropriate enforcement, accountability and transparency. That is critical, and I am not sure that we see a huge amount of that at the moment. That is an area where there is potential for growth.
With CSRB, the piece around enforcement is going to be critical, and not just for the covered entities. We are also giving new authorities to the regulators, so what are we doing to say to them, “We expect you to use them, to be accountable for using them and to demonstrate that your sector is improving”? There need to be stronger conversations about what it looks like to not meet the requirements. We should be looking more broadly, beyond just telling small companies to do more. If we are going to tell small companies to do more, how do we make it something that they can prioritise, care about and take seriously, in the same way that health and safety is taken seriously?
David Cook: To achieve the outcome in question, which is about the practicalities of a supply chain that smaller entities rely on, I can see the benefit of bringing those small entities in scope, but there could be something rather more forthright in the legislation on how the supply chain is dealt with on a contractual basis. In reality, we see that when a smaller entity tries to contract with a much larger entity—an IT outsourced provider, for example—it may find pushback if the contractual terms that it asks for would help it but are not required under legislation.
Where an organisation can rely on the GDPR, which has very specific requirements as to what contracts should contain, or the Digital Operational Resilience Act, which is a European financial services law and is very prescriptive as to what a contract must contain, any kind of entity doing deals and entering into a contract cannot really push back, because the requirements are set out in stone. The Bill does not have a similar requirement as to what a contract with providers might look like.
Pushing that requirement into the negotiation between, for example, a massive global IT outsourced provider and a much smaller entity means either that we will see piecemeal clauses that do not always achieve the outcomes you are after, or that we will not see those clauses in place at all because of the commercial reality. Having a similarly prescriptive set of requirements for what that contract would contain means that anybody negotiating could point to the law and say, “We have to have this in place, and there’s no wriggle room.” That would achieve the outcome you are after: those small entities would all have identical contracts, at least as a baseline.
Emily Darlington (Milton Keynes Central) (Lab)
Q
David Cook: The original NIS regulations came out of a directive from 2016, so this is 10 years old now, and the world changes quickly, especially when it comes to technology. Not only is this supply chain vulnerability systemic, but it causes a significant risk to UK and global businesses. Ransomware groups, threat actors or cyber-criminals—however you want to badge that—are looking for a one-to-many model. Rather than going after each organisation piecemeal, if they can find a route through one organisation that leads to millions, they will always follow it. At the moment, they are out of scope.
The reality is that those organisations, which are global in nature, often do not pay due regard to UK law because they are acting all over the world and we are one of many jurisdictions. They are the threat vector that is allowing an attack into an organisation, but it then sits with the organisations that are attacked to deal with the fallout. Often, although they do not get away scot-free, they are outside legislative scrutiny and can carry on operating as they did before. That causes a vulnerability. The one-to-many attack route is a vulnerability, and at the moment the law is lacking in how it is equipped to deal with the fallout.
Jen Ellis: In terms of what the landscape looks like, our dialogue often has a huge focus on cyber-crime and we look a lot at data protection and that kind of thing. Last year, we saw the impact of disruptive attacks, but in the past few years we have also heard a lot more about state-sponsored attacks.
I do not know how familiar everyone in the room is with Volt Typhoon and Salt Typhoon; they were widespread nation-state attacks that were uncovered in the US. We are not immune to such attacks; we could just as easily fall victim to them. We should take the discovery of Volt Typhoon as a massive wake-up call to the fact that although we are aware of the challenge, we are not moving fast enough to address it. Volt Typhoon particularly targeted US critical infrastructure, with a view to being able to massively disrupt it at scale should a reason to do so arise. We cannot have that level of disruption across our society; the impacts would be catastrophic.
Part of what NIS is doing and what the CSRB is looking to do is to take NIS and update it to make sure that it is covering the relevant things, but I also hope that we will see a new level of urgency and an understanding that the risks are very prevalent and are coming from different sources with all sorts of different motivations. There is huge complexity, which David has spoken to, around the supply chain. We really need to see the critical infrastructure and the core service providers becoming hugely more vigilant and taking their role as providers of a critical service very seriously when it comes to security. They need to think about what they are doing to be part of the solution and to harden and protect the UK against outside interference.
David Cook: By way of example, NIS1 talks about reporting to the regulator if there is a significant impact. What we are seeing with some of the attacks that Jen has spoken about is pre-positioning, whereby a criminal or a threat actor sits on the network and the environment and waits for the day when they are going to push the big red button and cause an attack. That is outside NIS1: if that sort of issue were identified, it would not be reportable to the regulator. The regulator would therefore not have any visibility of it.
NIS2 and the Bill talk about something being identified that is caused by or is capable of causing severe operational disruption. It widens the ambit of visibility and allows the UK state, as well as regulators, to understand what is going on in the environment more broadly, because if there are trends—if a number of organisations report to a regulator that they have found that pre-positioning—they know that a malicious actor is planning something. The footprints are there.
(3 months ago)
Westminster Hall
Kanishka Narayan
My hon. Friend brings deep expertise from her past career. If she feels there are particular absences in the legislation on equalities, I would be happy to take a look, though that has not been pointed out to me to date.
The Online Safety Act 2023 requires platforms to manage harmful and illegal content risks, and offers significant protection against harms online, including those driven by AI services. We are supporting regulators to ensure that those laws are respected and enforced. The AI action plan commits to boosting AI capabilities through funding, strategic steers and increased public accountability.
There is a great deal of interest in the Government’s proposals for new cross-cutting AI regulation, not least shown compellingly by my right hon. Friend the Member for Oxford East (Anneliese Dodds). The Government do not speculate on legislation, so I am not able to predict future parliamentary sessions, although we will keep Parliament updated on the timings of any consultation ahead of bringing forward any legislation.
Notwithstanding that, the Government are clearly not standing still on AI governance. The Technology Secretary confirmed in Parliament last week that the Government will look at what more can be done to manage the emergent risks of AI chatbots, raised by my hon. Friend the Member for York Outer (Mr Charters), my right hon. Friend the Member for Oxford East, my hon. Friend the Member for Milton Keynes Central and others.
Alongside the comments the Technology Secretary made, she urged Ofcom to use its existing powers to ensure AI chatbots in scope of the Act are safe for children. Further to the clarifications I have provided previously across the House, if hon. Members have a particular view on where there are exceptions or spaces in the Online Safety Act on AI chatbots that correlate with risk, we would welcome any contribution through the usual correspondence channels.
Kanishka Narayan
I have about two minutes, so I will continue the conversation with my hon. Friend outside.
We will act to ensure that AI companies are able to make their own products safe. For example, the Government are tackling the disgusting harm of child sexual exploitation and abuse with a new offence to criminalise AI models that have been optimised for that purpose. The AI Security Institute, which I was delighted to hear praised across the House, works with AI labs to make their products safer and has tested over 30 models at the frontier of development. It is uniquely the best in the world at developing partnerships, understanding security risks, and innovating safeguards, too. Findings from AISI testing are used to strengthen model safeguards in partnership with AI companies, improving safety in areas such as cyber-tasks and biological weapon development.
The UK Government do not act alone on security. In response to the points made by the hon. Members for Ceredigion Preseli (Ben Lake), for Harpenden and Berkhamsted, and for Runnymede and Weybridge, it is clear that we are working closely with allies to raise security standards, share scientific insights and shape responsible norms for frontier AI. We are leading discussions on AI at the G7, the OECD and the UN. We are strengthening our bilateral relationships on AI for growth and security, including AI collaboration as part of recent agreements with the US, Germany and Japan.
I will take the points raised by the hon. Members for Dewsbury and Batley, for Winchester (Dr Chambers) and for Strangford, and by my hon. Friend the Member for York Outer (Mr Charters) on health advice, and how we can ensure that the quality of NHS advice is privileged in wider AI chatbot engagement, as well as the points made by my hon. Friend the Member for Congleton and my right hon. Friend the Member for Oxford East on British Sign Language standards in AI, which are important points that I will look further at.
To conclude, the UK is realising the opportunities for transformative AI while ensuring that growth does not come at the cost of security and safety. We do this by stimulating AI safety assurance markets, empowering our regulators, ensuring our laws are fit for purpose, and driving change through AISI and diplomacy.