Cyber Security and Resilience (Network and Information Systems) Bill (First sitting) Debate
Public Bill Committees
Bradley Thomas (Bromsgrove) (Con)
Q
Jen Ellis: For sure, it should not come down to whether you are public or private; it should be about impact. Figuring out how to measure that is challenging. I will leave that problem with policymakers—you’re welcome. I do not think it is about the number of employees. We have to think about impact in a much more pragmatic way. In the tech sector, relatively small companies can have a very profound impact because they happen to be the thing that is used by everybody. Part of the problem with security is that you have small teams running things that are used ubiquitously.
We have to think a little differently about this. We have seen outages in recent years that are not necessarily maliciously driven, but have demonstrated to us how reliant we are on technology and how widespread the impact can be, even of something like a local managed service provider. In Germany, one provider that happened to run managed services for a whole region's local government went down, and it knocked out all local services for some time. You are absolutely right: we should be looking at privately held companies as well. We should be thinking about impact, but measuring impact and figuring out who is in scope and who is not will be really challenging. We will have to start looking down the supply chain, where it gets a lot more complex.
Tim Roca (Macclesfield) (Lab)
Q
Jen Ellis: As a starting point, I will clarify that I am a fellow at RUSI. I work closely with Jamie, but I do not work for RUSI. I also take no responsibility for Jamie’s comments.
On the comparisons, David alluded to the fact that Europe is a little bit ahead of us. NIS2, the update to NIS1, came into force three years ago with a transposition deadline: member states had until October 2024 to implement it. My understanding is that not everybody has implemented it especially effectively as yet; there is some lag across the member states. I do not think we are too far out of step with what NIS2 includes. However, we are talking about primary legislation now; a lot of the detail will be in the secondary legislation. We do not necessarily know exactly how those two things will line up against each other.
The UK seems to be taking a bit of a different approach. The EU has very deliberately tried to make the detail as clearly mandated as possible, because it wants all the member states to adopt the same baseline of requirements, which is a change from NIS1. By contrast, it seems as though the UK wants to give the regulators a little bit of flexibility to “choose their own adventure”. I am not sure that is the best approach. We might end up with a pretty disparate set of experiences, which could be really confusing for organisations that are covered by more than one competent authority.
The main things that NIS2 and CSRB are looking at are pretty aligned; there is a lot of focus on the same things. It is about expanding scope to make sure that we keep up with what we believe “essential” now looks like, and there is a lot of focus on increased incident reporting and information sharing. Again, the devil will be in the detail in the secondary legislation.
The other thing I would say goes back to the earlier question about what is happening internationally. The jurisdictions that David mentioned, like Australia and the EU, are really proactive on cyber policy, as is the UK. They are taking a really holistic view, which David alluded to in his introduction, and are really looking at how all the pieces fit together. I am not sure that it is always clear that the UK is doing the same. I think there is an effort to do so, and UK policymakers are looking at lots of different areas to work on, but the view of how it all fits together may not be as clear. One area where we are definitely behind is legislating around vendor behaviour and what we expect from the people who are making and selling technology.
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Kanishka Narayan)
Q
Jen Ellis: Again, that is a hugely complex question to cover in a short amount of time. One of the challenges that we face in the UK is that ours is an economy of 99% small and medium-sized businesses. It is hard to think about how to place more burdens on small and medium businesses, what they can reasonably get done and what resources are available. That said, that is the problem that we have to deal with; we have to figure out how to make progress.
There is also a challenge here, in that we tend to focus a lot on the behaviour of the victim. It is understandable why, because that is the side we can control, but we are missing the middle piece. There are the bad guys, whom we cannot control but can try to prosecute and take to task; and there are the victims, whom we can control, and we focus a lot on that side, as CSRB does. Then there is the middle ground of enablers. They do not intend to be enablers, but they are the people creating the platforms, media and technology. I am not sure that we are where we could be in thinking about how to set a baseline for them. We have a lot of voluntary codes, which is fantastic and a really good starting point, but the question is what a voluntary code is really worth and how much behavioural change it actually drives. What you see is that the organisations that are already doing well and taking security seriously follow the voluntary codes because they were already investing, but there is a really long tail of organisations that are not.
Any policy approach, legislation or otherwise, comes down to the fact that you can build the best thing in the world, but you need a plan for adoption or the engagement piece—what it looks like to go into communities and see how people are wrestling with this stuff and the challenges that are blocking adoption. You also need to think about how to address and remove those challenges, and, where necessary, how to ensure appropriate enforcement, accountability and transparency. That is critical, and I am not sure that we see a huge amount of that at the moment. That is an area where there is potential for growth.
With CSRB, the piece around enforcement is going to be critical, and not just for the covered entities. We are also giving new authorities to the regulators, so what are we doing to say to them, “We expect you to use them, to be accountable for using them and to demonstrate that your sector is improving”? There need to be stronger conversations about what it looks like not to meet the requirements. We should be looking more broadly, beyond just telling small companies to do more. If we are going to tell small companies to do more, how do we make it something that they can prioritise, care about and take seriously, in the same way that health and safety is taken seriously?
David Cook: On the outcome in question, which concerns the practicalities of a supply chain that smaller entities rely on, I can see the benefit of bringing those small entities in scope, but the legislation could be rather more forthright about how the supply chain is dealt with on a contractual basis. In reality, we see that when a smaller entity tries to contract with a much larger entity, such as an outsourced IT provider, it may find pushback if the contractual terms that it asks for would help it but are not required under legislation.
Where an organisation can rely on the GDPR, which has very specific requirements as to what contracts should contain, or the Digital Operational Resilience Act, a European financial services law that is very prescriptive as to what a contract must contain, the other party cannot really push back, because the requirements are set in stone. The Bill has no similar requirement as to what a contract with providers should look like.
Pushing that requirement into the negotiation between, for example, a massive global IT outsourced provider and a much smaller entity means either that we will see piecemeal clauses that do not always achieve the outcomes you are after, or that we will not see those clauses in place at all because of the commercial reality. Having a similarly prescriptive set of requirements for what that contract would contain means that anybody negotiating could point to the law and say, “We have to have this in place, and there’s no wriggle room.” That would achieve the outcome you are after: those small entities would all have identical contracts, at least as a baseline.
Cyber Security and Resilience (Network and Information Systems) Bill (Second sitting) Debate
Public Bill Committees
Bradley Thomas
Q
Ian Hulme: At the moment, to give you a few broad numbers, our teams are around 15 people, and we anticipate doubling that. In the future, with self-funding, we will be a bit more in control of our own destiny. It is a significant uplift from our perspective.
Natalie Black: The challenge is that the devil is in the detail. Until that detail has been worked through in secondary legislation, we will have to reserve our position, so that we can give you accurate numbers in due course. From Ofcom’s point of view, it is about adding tens rather than significant numbers. I do not think we are that far off the ICO.
But I want to emphasise that this is about quality, not necessarily quantity. Companies want to work with expert regulators who really know what they are doing. Ofcom is building on the work we are already doing under the Telecommunications (Security) Act 2021. It will be a question of reinforcing that team, rather than setting up a separate one. We want to get the best, high-quality individuals who know how to talk to industry and really know cyber-security, to make sure people have a good experience when engaging with us.
Ian Hulme: To add to that, the one challenge we will face as a group is that we are all fishing in the same pond for skills. MSPs and others will also be fishing in that pond from the sector side. There needs to be recognition that there is going to be a skills challenge in this implementation.
Stuart Okin: To pick up specifically on the numbers, we have a headcount of 43 dedicated to cyber regulation. That also includes the investment side. We also have access to the engineering team, the engineering directorate, which is a separate team. There is also our enforcement directorate, as well as the legal side of things. The scope changes proposed in the Bill cover just large load controllers and the supply chain, so these will be small numbers in comparison. Unlike my colleagues, we are not expecting a big uplift in resourcing.
Tim Roca (Macclesfield) (Lab)
Q
Ian Hulme: There are two angles to that. From a purely planning and preparation perspective, it is incredibly difficult, without having seen the detail, to know precisely what is expected of MSPs and IDSPs in the future, and therefore what the regulatory activity will be. That is why, when I am answering questions for colleagues, it is difficult to be precise about those numbers.
Equally, we are hearing from industry that it wants that precision as well. What is the expectation on it regarding incident reporting? What does “significant impact” mean? Similarly, with the designation of critical suppliers, precision is needed around the definitions. From a regulatory perspective, without that precision, we will probably find ourselves in a series of cases arguing about definitions. To give an example, if the definition of an MSP is vague, and we say to an MSP that we think it is in scope and it says, “No, we are not,” then a lot of our time and attention will be taken up with those types of arguments and disputes. Precision will be key for us.
Tim Roca
Q
Ian Hulme: There is a balance to be struck between primary and secondary legislation. When something is written on the face of the Bill and circumstances change, as they will in a fast-moving sector, it is incredibly difficult to update. What we are hearing, and saying, is that more precision around some of the definitions will be critical.
Natalie Black: I strongly agree with Ian. A regulator is only as good as the rules that it enforces. If you want us to hold the companies to account, we need to be absolutely clear on what you are asking us to do. The balance is just about right in terms of primary and secondary, particularly because the secondary vehicle gives us the opportunity to ensure that there is a lot of consultation. The Committee will have heard throughout the day—as we do all the time from industry—that that is what industry is looking for. They are looking for periods of business adjustment—we hear that loud and clear—and they really want to be involved in the consultation period. We also want to be involved in looking at what we need to take from the secondary legislation into codes of practice and guidance.
Q
Natalie Black: That is a great question, and I am not at all surprised that you have asked it, given everything that is going on at the moment. As well as being group director for infrastructure and connectivity, I am also the executive member of the board, sitting alongside our chief executive officer, so from first-hand experience I can say that Ofcom really recognises how fast technology is changing. I do not think there is another sector that is really at the forefront of change in this way, apart from the communications sector. There are a lot of benefits to being able to sit across all that, because many of the stakeholders and issues are the same, and our organisation is learning to evolve and adapt very quickly with the pace of change. That is why the Bill feels very much like a natural evolution of our responsibility in the security and resilience space.
We already have substantial responsibilities under NIS and the Telecommunications (Security) Act 2021. We are taking on these additional responsibilities, particularly over data centres, but we already know some of the actors and issues. We are using our international team to understand the dynamics that are affecting the Online Safety Act, which will potentially materialise in the security and resilience world. As a collective leadership team, we look across these issues together. The real value comes from joining the dots. In the current environment, that is where you can make a real difference.
Bradley Thomas
Q
Chung Ching Kwong: The US is probably a good example. Executive Order 14028, signed in May 2021, requires any software vendor selling to the US federal Government to provide something called a software bill of materials, or SBOM. That is essentially a list of ingredients, but for software, so you can see exactly what components the software is made of. A lot of the time, people who code are quite lazy; they will pull in different components that are available in online repositories to form a piece of software that we use. By having vendors provide an SBOM, whenever anything happens or any kind of vulnerability is detected, you can very easily find out whether you are affected.
That followed an incident in 2021 in which a small, free piece of open-source code called Log4j was found to have a critical vulnerability. It was buried inside thousands of commercial software products. Without that list of ingredients, it would be very difficult for people who had been using the software to find out whether they were exposed, because, first, they may not have the technical capability and, secondly, they would not even know whether their software contained that component. This is one of the things the US is doing to mitigate the risks when it comes to software.
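To illustrate the kind of lookup an SBOM enables, here is a minimal sketch in Python, assuming a CycloneDX-style JSON bill of materials; the file name and the version check are illustrative assumptions rather than anything mandated by the Executive Order. It scans the listed components and flags any Log4j version in the range affected by the 2021 vulnerability.

```python
import json

# Hypothetical SBOM file in CycloneDX JSON format; the path is illustrative.
SBOM_PATH = "product-sbom.cdx.json"


def is_vulnerable_log4j(name: str, version: str) -> bool:
    """Flag log4j-core 2.x versions below 2.15.0 (the Log4Shell range)."""
    if name != "log4j-core":
        return False
    try:
        major, minor, *_ = (int(part) for part in version.split(".")[:2])
    except ValueError:
        return True  # unparseable version string: treat as suspect
    return major == 2 and minor < 15


with open(SBOM_PATH) as f:
    sbom = json.load(f)

# CycloneDX lists each "ingredient" under "components", with a name and version.
for component in sbom.get("components", []):
    name = component.get("name", "")
    version = component.get("version", "")
    if is_vulnerable_log4j(name, version):
        print(f"Potentially vulnerable component: {name} {version}")
```

A pass like this over a vendor-supplied SBOM is what lets an organisation answer "does anything we run contain this component?" without having to reverse-engineer the software itself.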
Something that is not entirely in the scope of the Bill but is also worth considering is the US’s Uyghur Forced Labour Prevention Act. That is designed to prevent goods made with forced labour from entering the supply chain. The logic of preventing forced labour is probably something that the UK can consider. Because the US realised that it could not inspect every factory in Xinjiang to prove forced labour, it flipped the script: the law creates a rebuttable presumption that all goods from that region are tainted, so the burden of proof is now on the importer to prove, with clear and convincing evidence, that their supply chain is clean.
A similar logic could be considered when it comes to this Bill to protect cyber-security. Any entities that are co-operating with the PLA—the People’s Liberation Army—for example, should be considered as compromised or non-trustworthy until proven otherwise. That way, you are not waiting until problems happen, when you realise, “Oh, this is actually tainted,” but you prevent it before it happens. That is the comparison that I would make.
Tim Roca
Q
Thank you for speaking to us today. May I turn the conversation a little on its head? We have been talking about national security and the threat from China and others. You were an activist in Hong Kong and made a great deal of effort to fight the Chinese Communist party’s invasion of privacy—privacy violations using the national security law—and other things. Do you see any risk in this legislation as regards civil liberties and privacy? We have had a bit of discussion about how much will go into secondary legislation and how broad the Secretary of State’s powers might be.
Chung Ching Kwong: The threat to privacy, especially to my community, the Hong Kong diaspora in this country, lies in the fact that, under clause 9, we will be allowing remote access for maintenance, patches, updates and so on. If we are dealing with Chinese vendors and Chinese providers, we will have to allow, under the Bill, certain kinds of remote access for those firms to maintain the operation of software across different infrastructures. As a Hongkonger, I would be worried, because I do not know what kind of tier 2 or tier 3 supplier will have access to all that data, and whether it will be transmitted back to China or otherwise fall into the wrong hands. Even though we are not talking specifically about personal data, personal data is definitely in scope. Especially for people with bounties on their heads, I imagine that it will be a huge worry that there might be more legitimate access to data than there is right now under the Data Protection Act.
Tim Roca
Q
Chung Ching Kwong: It is always a double-edged sword when it comes to regulating against threats. The more that the Secretary of State or the Government are allowed to go into systems and hold powers to turn off, or take over, certain things, the more there is a risk that those powers will be abused, to a certain extent, or cause harm unintentionally. There is always a balance to be struck between giving more protection to privacy for ordinary users and giving power to the Government so that they can act. Obviously, for critical infrastructure like the power grid and water, the Government need control over those things, but for communications and so on, there is, to a certain extent, a question about what the Government can and cannot do. But personally I do not see a lot of concerns in the Bill.
Emily Darlington
Q
Chung Ching Kwong: It should definitely be covered by the Bill, because if we are not regulating to protect hardware as well, we will get hardware that already comes embedded with, for example, an attack at the opcode level. Examples in the context of China include the Lenovo Superfish scandal in 2015, in which pre-installed ad software hijacked the HTTPS certificate, the mechanism that protects your communication with a website so that nobody else sees what is happening between you and that site. The Superfish injection made that communication visible, and it was done before the product even came out of the factory. This is not a problem that a software solution can fix. If you were sourcing a Lenovo laptop, for example, the laptop would, on arrival, already be a security breach, and a privacy breach in that sense. We should definitely take it a step further and regulate hardware as well, because a lot of the time that is the attack surface that state-sponsored attacks target.
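To illustrate the mechanism being described, here is a short Python sketch (the host name is purely illustrative) that opens a TLS connection and prints the issuer of the certificate actually presented. On a clean machine the issuer is a public certificate authority; on a machine with an injected local root of the kind Superfish installed, the issuer shown would be that local root instead, which is one way this sort of interception is spotted.

```python
import socket
import ssl

# Illustrative host only; any HTTPS site demonstrates the same check.
HOST = "example.com"
PORT = 443

# Validate the chain against the local trust store, as a browser would.
context = ssl.create_default_context()

with socket.create_connection((HOST, PORT)) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        cert = tls.getpeercert()

# "issuer" identifies who signed the certificate that the connection trusted.
issuer = {key: value for rdn in cert["issuer"] for key, value in rdn}
print("Certificate issuer:", issuer.get("organizationName", "<unknown>"))

# A normal connection shows a public certificate authority here. With an
# injected local root (as in the 2015 Superfish case), the issuer would be
# that root instead, a strong sign that HTTPS traffic is being intercepted
# before it leaves the device.
```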
Bradley Thomas
Q
Kanishka Narayan: This is a great question. There are two things on my mind. One is that the Government have published a cyber action plan, the crux of which is to make sure that, from the point of view of understanding, principles, accountability and, ultimately, skills, there is significant capability in the public sector. The second thing to say is that we have a very broad-based plan on skills more generally across the cyber sector, public and private. For example, I am really proud of the fact that, through the CyberFirst programme, some 415,000 students, I think, right across the country have been upskilled in cyber-security. It is deeply important that the public sector stands up to the test of hiring them and makes the attractiveness of the sector clear to them as well. There is a broad-based plan and a specific one for the public sector in the Government context.
Tim Roca
Q
Kanishka Narayan: That is a great question. Broadly, the Bill takes a risk-based and outcomes-focused approach, rather than a technology-specific one. I think that is the right way to go about it. As we have heard today and beyond, there are some areas where frontier technology—new technology such as AI and quantum, which we talked about earlier today—will pose specific risks. There are other areas where the prevalence of legacy systems and legacy database architectures will present particular risks as well.
The Bill effectively says that the sum total of those systems, and their ultimate impact on an organisation's risk exposure, is where regulators should place their emphasis. I would expect individual regulators to pay heed to the prevalence of legacy systems and technical debt as a source of risk in their particular sectors, and, as a result, to the mitigations that ought to be put in place. I think that being technology agnostic is the right approach in this context.
Lincoln Jopp
Q
Kanishka Narayan: Do you mean operators of essential services, or critical suppliers, as in the third party element?