Cyber Security and Resilience (Network and Information Systems) Bill (First sitting) Debate

Department: Department for Science, Innovation & Technology

The Chair

Thank you. I am going to bring Allison Gardner in, because she has been waiting. You have two minutes, Allison.

Dr Allison Gardner (Stoke-on-Trent South) (Lab)

Q I have a quick question. You mentioned vulnerabilities earlier, and you mentioned, Jen, the complexities of implementing cyber-security plans. As well as technological factors, human factors, not the least of which is the lack of skills, play a key role in cyber-resilience. How would or could the Bill address the human element in cyber-security?

Jen Ellis: That is a great question, and a tricky one. We talk a lot about training and security awareness, and unfortunately I think it becomes yet another tick box: you start a job and watch your little sexual harassment training video, then you watch your cyber-security training video, and probably the former sticks with you better than the latter. I think we have to change that. We have to change that dynamic.

I go back to my last answer, which was that I think one of the strengths of the Bill is that, hopefully, it will enable the regulators to engage much more on this topic and therefore to engage their covered entities much more. That is what we need to see. We need to see the leadership in organisations engage with the topic of cyber-security, not as a chore, as a tick-box exercise or as that headline they read about JLR, but actually as something that matters to their organisation—as something they are going to engage with at a board and executive team level, all the way down through the organisation. Cultural change comes from the top, typically, and we need to see that level of change.

I do not think that there is anything specific in the legislation, as it is currently written, that says, “And this,” in flashing lights, “is going to change the human factors piece.” I think that the devil will be in the detail of the secondary legislation, and then in what the regulators specifically ask for. But there does need to be a general shift in the culture, whereby as sectors generally we start to talk more about this as a requirement. The financial services sector has talked about security for a long time—it has been a reality for it—but I am not sure how true that is, at breadth, in something like the water industry.

I hope that that will change. I hope that we will start to see those conversations, at the top levels and then all the way down, becoming more of a cultural norm. Unfortunately, you cannot create culture change quickly. When it comes to talking about human factors, it is about people becoming much more aware of it and thinking more about it. That will take time—

The Chair

Order. Thank you very much, but I have to cut you off there.

Jen Ellis: Sorry for taking too long.

--- Later in debate ---
Kanishka Narayan

Q Thank you all very much for making time. I have an implementation-focused question, perhaps directed at Stuart, but open to all. In practice, it would be helpful to understand how frequently a single company might provide several of the services in scope: MSP services, cloud hosting, data centre support and cyber-security services. What ability might we have to identify which parts of an organisation are in scope for particular services and which are not?

Stuart McKean: You are going to hear the word “complex” a lot in this session. It is hugely complex. I would almost say that everyone likes to dabble. Everyone has little bits of expertise. Certain companies might be cloud-focused, or focused on toolsets; there is a whole range of skillsets. Of course, the larger organisations have multiple teams, multiple scopes and much more credibility in operating in different areas. As that flows down, in many cases it becomes more difficult to really unpick the supply chain.

For example, if I am a managed service provider delivering a cloud service from a US hyperscaler, who is responsible? Am I, as the managed service provider, ultimately on the hook, even though I might be using a US-based hyperscaler? That is not just to pick on the hyperscalers, by the way—it could be a US software-based system or a set of tools that I am using. There are a whole range of parts that need to become clearer, because otherwise the managed service community will be saying, “Well, is that my responsibility? Do I have to deliver that?”.

You then get into how the legislation interacts with procurement, because requirements will flow down. Although a small business like mine might not be in scope directly, the reality is that the primes and Government Departments that are funding work will flow those requirements down on to the smaller MSPs, so when it comes to implementing and meeting the legislation, we will have to follow those rules.

Dr Gardner

Q It is interesting that you mentioned complexity and skilled teams. Sanjana, you talked about the need for more skill and responsibility, and how distributed responsibility across supply chains is a big deal. That comes down to a duty of care on the people who are procuring these services. The annual Cyber Security Breaches Survey found that board-level responsibility for cyber has declined in recent years. What explains that, and how could it be improved? As a quick supplementary question, do you think there should be a statutory duty for companies to have a board member responsible for cyber risk? Jill, I will go to you first.

Jill Broom: Historically, cyber has not been viewed by boards as a business risk, but as a technical problem to be addressed by the technical teams, rather than as a valuable, fundamental enabler of your business and a commercial advantage, because you are secure and resilient. That has been the problem. It is about changing that culture and thinking about how we get boards to think about this.

I think a fair amount of work is happening; I know the Government have written to the FTSE 350 companies to ask them to put the cyber governance code of practice into play. That is just to make cyber a board-level responsibility, and also to take account of things such as what they need to do in their supply chain.

Dr Gardner

Q But do you think there should be a statutory duty to have a board member responsible?

Jill Broom: Some of our members have pointed out that the number of organisations under cyber-regulations is very small, and it is only going to increase a small amount with the advent of this particular Bill. Similarly, in the different jurisdictions there are duties at the board level. There is an argument for it. The key thing is that we need to be mindful of it being risk-based, and also that there are organisations that could be disproportionately affected by this. I think it needs a little more testing, particularly with our members, as to whether a statutory requirement is needed.

Bradley Thomas

Q Two questions: first, for a bit of context, could the witnesses give us an idea of the objectives of cyber-attacks? Are we seeing objectives based around disruption or around extortion, either monetary or for intellectual property? Perhaps we could have a perspective on whether that differs depending on the origin of the organisation conducting the cyber-attack. Secondly, around the reporting model, is there a view on whether the model proposed in the Bill is beneficial, and whether it risks a fragmented approach, particularly if companies operate in a sector that is regulated under the jurisdiction of two regulators? Do you think that a more universal, singular reporting model would be beneficial in ensuring as strong a response as possible?

Dr Sanjana Mehta: May I weigh in on the second question first? It is good to note that the definition of a reportable incident has expanded in the current legislation. One of the concerns raised in the post-implementation reviews of the previous regulatory regime was that regulated entities were under-reporting. We note that the Bill has now expanded the definition to include incidents that could have an adverse impact on the security and operations of network and information systems, in addition to those incidents that are having or have had a negative impact.

While that is clear on the one hand—some factors have been provided, such as the number of customers affected, the geographical reach and the duration of the incident—what is not clear at the moment is the thresholds linked with those factors. In the absence of those thresholds, our concern is that regulated entities may be tempted to over-report rather than under-report, thereby creating more demand on the efforts of the regulators.

We must think about regulatory capacity to deal with all the reports that come through to them, and to understand what might be the trade-offs on the regulated entities, particularly if an entity is regulated by more than one competent authority. For those entities, it would mean reporting to multiple authorities. For organisations that are small or medium-sized enterprises, there is a real concern that the trade-offs may result in procedural compliance over genuine cyber-security and resilience. We call on the Government for immediate clarification of the thresholds linked to those factors.

Jill Broom: I would like to come in on that point. Our members would agree with it. Companies need to be clear about what needs to be reported, when it needs to be reported and where they need to report it. A bit of clarity is required on that, certainly around definitions. As Sanjana said, it is good to see that the definition is expanding, but definitions such as “capable of having” a significant impact remain unclear for industry. We need more clarity there, because otherwise we risk capturing absolutely everything that is out there, when we really want to focus on what is most important for us to be aware of. Determining materiality is essential before making any report.

In terms of the where and the how, we are also in favour of a single reporting platform, because that reduces friction around the process, and it allows businesses, ultimately, to know exactly where they are going. They do not need to report here for one regulator and there for another. It is a streamlined process, and it makes the regime as easy as possible to deal with, so it helps incentivise people to act upon it.

I have another point to add about the sequencing of alignment with other potential regulation. We know that, for example, the Government’s ransomware proposals include incident-reporting requirements, and they are expected to come via a different legislative vehicle. We need to be careful not to add any additional layers of complexity or other user journeys into an already complex landscape.

--- Later in debate ---
The Chair

Please, Gentlemen, do not feel obliged to answer each question.

Dr Ian Levy: On the diverse networks and where they are hosted, it is important to be clear that resilience changes as scale changes. When it comes to the statistical model used to talk about resilience for a national system, if you have, say, three physical data centres in the UK connected by a redundant ring, that has a well-understood statistical model, but as you get bigger and bigger and more diverse, the statistics change, so the way you analyse resilience changes. That is not specific to Amazon Web Services; it applies to any large-scale system.

The way that we talk about resilience needs to be thought through carefully. I would urge you to consider outcomes and talk about availability and resilience to particular events. If somebody drives a JCB into a data centre, that can have a big impact in a national-scale resilience model, but in a hyperscale model it will not.

We need to be clear about what the regulation is trying to do. If you look at us as a data centre operator, it is very different from someone who is providing co-location services. We provide our data centres for the sole purposes of providing our services, which have a very particular resilience model that is very different from somebody sticking their own racks in a third-party data centre. Some of the terms need to be better defined.

In terms of balancing growth, regulation, oversight and so on, it is a fallacy to put specific technologies into legislation, except in very specific circumstances. We talked about post-quantum cryptography and AI. They will affect resilience, but probably not in the way we think they will today, so I would caution against putting specific technology definitions on the face of the Bill.

Matt Houlihan: On the cross-border question, very quickly, there are clearly a lot of jurisdictions looking at legislation in this space. There is absolutely an opportunity in the UK to look at things, such as mutual recognition agreements, that would simplify the international regulatory landscape, but there is also the opportunity for the UK to lead in this space as a very well-respected and cyber-capable country.

Touching on getting the balance right on growth and security, we have seen some useful moves recently from the UK Government and previous Governments on codes of practice, which are voluntary in nature but help engage companies, as the recent software security code of practice did with my company and Chris’s. Techniques like that offer a nice balance: they engage companies while getting the message around growth absolutely right.

Dr Gardner

Q I have so many questions, some of which have been touched on; I will limit myself. I was interested in the CyberUp campaign that you mentioned. What other measures, both legislative and non-legislative, could the UK Government take to enhance the cyber-resilience of the UK’s critical national infrastructure? In terms of resilience, is there any requirement to look a bit more deeply at failsafes and non-technical failsafes that we might need, because we are always going to get that?

My second question is for Ben. In combining AI and cyber, you are combining technologies that come with their own unique risks with cyber-security. I am interested in how you mitigate against that. I am intrigued because, when you talk about AI, I assume you are not talking about straightforward machine learning.

Chris Anley: In terms of what other things we could do, we have talked about voluntary codes. The value of voluntary codes was questioned in an earlier session, but the World Health Organisation best practice guide on handwashing, which is entirely voluntary, saved millions of lives in the recent pandemic. It is important to bear in mind that codes that help you to protect yourself are definitely valuable.

Other actions that are already taking place, and that we may want to extend on the basis of solid evidence and data, are the Cyber Essentials scheme, for example, and the various codes of practice. The cyber governance code of practice for boards was mentioned earlier, along with the Government outreach and the attempt to get boards to recognise that cyber risk is a business risk and an existential threat. We talked about the cyber assessment framework and how that is likely to be the scope within which this Bill is implemented. So we do not necessarily need to do something new. The scope of the Bill, as we said, is 0.1% of the UK private sector. There is scope to expand the existing things that we are doing, especially Cyber Essentials, raising the bar for small and medium-sized enterprises across the economy. There is a lot that we are already doing that we have the scope to expand, but obviously that must be done prudently and on the basis of solid evidence.

Dr Gardner

Q Ben, are you combining two risks?

Ben Lyons: That is something we think very deeply about. We see AI as helping to mitigate some of the risks from cyber-security by making it possible to detect attacks more quickly, understand what might be causing them, and respond at pace. We are an AI-native company and we have thought deeply about how to ensure that the technology is both secure and responsible. We are privacy-preserving by design. We take our AI to the organisation’s environment to build an understanding of what normality looks like for them, rather than building vast data lakes of customer data. We go to a lot of effort to ensure that the information surfaced by AI is interpretable to human beings, so that it uplifts human professionals and enables them to do more with the time they have. We are accredited to a range of standards, like ISO 27001 and ISO 42001, which is a standard for AI management. We have released a white paper on how we approach responsible AI in cyber-security, which I would be happy to share with you to give a bit more detail.

Chris Vince

Q Thank you for coming along. Chris has touched on this already, but the Government’s impact assessment of the Bill said that the UK was falling behind its international partners. You all have experience of working globally. Could you comment on that and whether you agree with it?

Matt Houlihan: I am very happy to. Two main comparators come to mind. One is the EU, and we have talked quite a bit about NIS2 and the progress that has made. NIS2 does take a slightly different approach to that of the UK Government, in that it outlines, I think, 18 different sectors, up from seven under NIS1. There is that wide scope in terms of NIS2.

Although NIS2 is an effective piece of legislation, its implementation remains patchy across the EU. Something like 19 of the 27 EU member states have implemented it in their national laws to date. There is clearly a bit of work still to do there. There is also some variation in how NIS2 is being implemented, which we feel as an international company operating right across the European Union. As has been touched on briefly, there is now a move, through what are called omnibus proposals, to simplify the reporting requirements and other elements of cyber-security and privacy laws across the EU, which is a welcome step.

I mentioned in a previous answer the work that Australia has been doing, and the Security of Critical Infrastructure Act 2018—SOCI—was genuinely a good standard and has set a good bar for expectations around the world. The Act has rigorous reporting requirements and caveats and guardrails for Government step-in powers. It also covers things like ransomware, which we know the UK Home Office is looking at, and Internet of Things security, which the UK Government recently looked at. Those are probably the two comparators. We hope that the CSRB will take the UK a big step towards that, but as a lot of my colleagues have said, there is a lot of work to do in terms of seeing the guidance and ensuring that it is implemented effectively.

Chris Anley: On the point about where we are perhaps falling behind: on the streamlining of reporting, we have already mentioned Australia and the EU, where it is in progress. On the protection of defenders, other territories are already benefiting from those protections—the EU, the US and, as I mentioned, Portugal especially. As a third and final point, Australia is an interesting one, as it is providing a cyber-safety net to small and medium-sized enterprises: cyber expertise from the Government to enable smaller entities that lack the personnel and funding to get up to code and achieve resilience.

Cyber Security and Resilience (Network and Information Systems) Bill (Second sitting) Debate

Department: Department for Science, Innovation & Technology

Chris Vince (Harlow) (Lab/Co-op)

Q I declare an interest. My father-in-law is Professor Robin Bloomfield, a professor of software and system dependability at City St George’s, University of London, and I have a large data centre in my constituency. My question is probably shorter than that. Why is it important to give regulators flexibility to implement guidance for the sectors they cover?

Stuart Okin: In the energy sector, we tend to use operational technology rather than IT systems. That might mean technology without a screen, so an embedded system. It is therefore important to be able to customise our guidance. We do that today. We use the cyber assessment framework as a baseline, and we have a 335-page overlay on our website to explain how that applies to operational technology in our particular space. It is important to be able to customise accordingly; indeed, we have added physical elements to the cyber assessment framework, which is incredibly important. We welcome that flexibility being maintained in the Bill.

Ian Hulme: Just to contrast with colleagues from Ofcom and Ofgem, ICO’s sector is the whole economy, so it is important that we are able to produce guidance that speaks to all the operators in that sector. Because our sector is much bigger, we currently have something like 550 trust service providers registered, and that will grow significantly with the inclusion of managed service providers. So guidance will be really important to set expectations from a regulatory perspective.

Natalie Black: To round this off, at the end of the day we always have to come back to the problem we are trying to solve, which is ensuring cyber-security and resilience. As you will have heard from many others today, cyber is a threat that is always evolving. The idea that we can have a stagnant approach is for the birds. We need to be flexible as regulators. We need to evolve and adapt to the threat, and to the different operators we will engage with over the next couple of years. Collectively, we all appreciate that flexibility.

Dr Allison Gardner (Stoke-on-Trent South) (Lab)

Q I should point out that I once worked for the NHS AI and Digital Regulations Service and have also worked for a number of different regulators, including the ICO, so I have experience of the joys and frustrations of cross-regulatory working. We have heard evidence of the challenges experienced by businesses when they have to go to different regulators—I think it is as many as 14—and deal with the conflicting guidance they are often given and the varying skillsets within each regulator. There were calls for one portal for incident reporting.

The ICO is a horizontal regulator working across all sectors. In your experience, would a single cyber regulator be a good idea? What would be the benefits and the challenges? I will allow Ofcom and Ofgem to jump in and defend themselves.

Ian Hulme: I suppose the challenge with having a single regulator is that—like ourselves, as a whole-economy regulator—it will have to prioritise and direct its resources at the issues of highest harm and risk. One benefit of a sectoral approach is that we understand our sectors at a deeper level; we certainly work together quite closely on a whole range of issues, and my teams have been working with Natalie and Stuart’s teams on the Bill over the last 18 months, thinking about how we can collaborate better and co-ordinate our activities. It is really pleasing to see that that has been recognised in the Bill with the provisions for information sharing. That is going to be key, because the lack of information-sharing provisions in the current regs has been a bit of a hindrance. There are pros and cons, but a single regulator will need to prioritise its resources, so you may not get the coverage you would with a sectoral approach.

Natalie Black: Having worked in this area for quite some time, I would add that the challenge with a single regulator is that you end up with a race to the bottom, and minimum standards you can apply everywhere. However, with a tailored approach, you can recognise the complexity of the cyber risk and the opportunity to target specific issues—for example, prepositioning and ransomware. That said, we absolutely recognise the challenge for operators and companies in having to bounce between regulators. We hear it all the time, and you will see a real commitment from us to do something about it.

Some of that needs to sit with the Department for Science, Innovation and Technology, which is getting a lot of feedback from all of us about how we need it to co-ordinate and make things as easy as possible for companies—many of which are important investors in our economy, and we absolutely recognise that. We are also doing our bit through the UK Regulators Network and the Digital Regulation Cooperation Forum to find the low-hanging fruit where we can make a difference. To give a tangible example, we think there should be a way to do single reporting of incidents. We do not have the answer for that yet, but that is something we are exploring to try and make companies’ lives easier. To be honest, it will make our lives easier as well, because it wastes our time having to co-ordinate across multiple operators.

Bradley Thomas (Bromsgrove) (Con)

Q What additional resources will you need in order to implement and enforce the requirements of the Bill?

Ian Hulme: Again, to contrast the ICO’s position with that of other colleagues, we have a much larger sector, as it currently exists, and we will have a massively larger sector again in the future. We are also funded slightly differently. The ICO is grant in aid funded from Government, so we are dependent on Government support.

To move from a reactive footing, which is our position at the moment—that is the Government’s guidance to competent authorities and to the ICO specifically—to a proactive footing with a much expanded sector will need a significant uplift in our skills and capability, as well as system development in order to register and ingest intelligence from MSPs and relevant digital service providers in the future.

From our perspective at the ICO, we need significant support from DSIT so that we can transition into the new regulatory regime. It will ultimately be self-funding—it is a sustainable model—but we need continued support during the transition period.

--- Later in debate ---
Sarah Russell

Q Professor Child, I note that you are very supportive of legal reform in quite a number of areas. With emphasis on the Computer Misuse Act, surely the reality is that the Crown Prosecution Service will never conclude that it is in the best interests of the country to prosecute any of the behaviours that people are concerned about, which we recognise as positive and helpful. Is there a need for legal reform?

Professor John Child: Yes. It is not the easiest criminal law tale, if you like. If there were a problem of overcriminalisation in the sense of prosecutions, penalisation, high sentences and so on, the solution would be to look at a whole range of options, including prosecutorial discretion, sentencing or whatever it might be, to try to solve that problem. That is not the problem under the status quo. The current problem is purely the original point of criminalisation. Think of an industry carrying out potentially criminalised activity. Even if no one is going to be prosecuted, the chilling effect is that either the work is not done or it is done under the veil of potential criminalisation, which leads to pretty obvious problems in terms of insurance for that kind of industry, the professionalisation of the industry and making sure that reporting mechanisms are accurate.

We have sat through many meetings with the CPS and those within the cyber-security industry who say that the channels of communication—that back and forth of reporting—are vital. However, a necessary step before that communication can happen is the decriminalisation of basic practices. No industry can effectively be told on the one hand, “What you are doing is vital,” but on the other, “It is a criminal offence, and we would like you to document it and report it to us in an itemised fashion over a period of time.” It is just not a realistic relationship to engender.

The cyber-security industry has evolved in a fragmented way both nationally and internationally, and the only way to get those professionalisation and cyber-resilience pay-offs is by recognising that the criminal law is a barrier—not because it is prosecuting or sentencing, but because of its very existence. It does not allow individuals to say, “If, heaven forbid, I were prosecuted, I can explain that what I was doing was nationally important. That is the basis on which I should not be convicted, not because of the good will of a prosecutor.”

Dr Gardner

Q I have a couple of unconnected questions. We have asked a couple of times whether senior board members should have a legal, statutory responsibility for cyber. The arguments in favour are that cyber is otherwise not seen as a priority, and that culture change has to be top-down. However, there are issues with smaller companies bearing a responsibility that is diffused along the supply chain. Also, boards that tend to focus on providing returns for shareholders may not be investing in this complex arena. I am interested in your thoughts on whether the Bill does enough to make senior executives responsible for their organisations’ cyber-security.

Professor John Child: I think the Bill does a lot of things quite effectively. It modernises in a sensible way and it allows for the recognition of change in type of threat. This goes back to my criminalisation point. Crucially, it also allows modernisation and flexibility to move through into secondary legislation, rather than us relying purely on the maturations of primary legislation.

In terms of board-level responsibility, I cannot speak too authoritatively on the civil law aspects, but drawing on my criminal law background, there is something in that as well. At the moment, the potential for criminalisation applies very much to those who gain unauthorised access to another person’s system. That is the way the criminal law works. We also have potential for corporate liability that can lead all the way up to board rooms, but only if you have a directing mind—so only if a board member is directing that specific activity, which is unlikely, apart from in very small companies.

You can have a legal regime that says, whether through accreditation or simple public interest offences, that there are certain activities that involve unauthorised access to another person’s system, which may be legitimate or indeed necessary. However, we want a professional culture within that; we do not want that outsourced to individuals around the world. You can then build in sensible corporate liability based on consent or connivance, which goes to individuals in the boardroom, or a failure-to-prevent model of criminalisation, which is more popular when it comes to financial crimes. That is where you say, “If this exists in your sector, as an industry and as a company, you can be potentially liable as an entity if you do not make sure these powers are used responsibly, and if you essentially outsource to individuals in order to avoid personal liabilities”.

Dr Gardner

Q Thank you—that was quite detailed. I have a very quick question: what measures would you want the Government to take to enhance the cyber-resilience of the UK’s critical national infrastructure? I am interested in your thoughts on requirements for failsafes and risk management, and indeed on the non-technical resilience measures that would be needed in case of complete failure.

Professor John Child: Again, I have to draw back to the criminal law aspects. I think the Bill does the things it needs to do well; certainly, from the conversations I have had with those in cyber-security and so on, these are welcome steps in the right direction.

However, when you look at critical national infrastructure, although you can create layers of civil responsibility and regulation—which is entirely sensible—most of that will filter down to individuals doing cyber-security and resilience work. It is about empowering those individuals; within a state apparatus, that is one thing, but even with regulators and in-house cyber-security experts, individuals are working only within the confines of what they are allowed to do under the criminal law, as well as the civil regulatory system.

The reason I have been asked here, and what a lot of my work has focused on, is this: if you filter responsibility down to individuals doing security work for national as well as commercial infrastructure, you need to empower them to do that work effectively. The current law does not do that; it creates the problem of either doing that work under the veil of criminalisation, or not doing it, with work being outsourced to places where you do not have the back-and-forth communication and reporting regime you would need.

Dr Gardner

I think you are touching on the old problem of where liability lies when you have this long supply chain of diffused responsibility, but thank you.

Dave Robertson

Q Thank you, Professor, for coming along. You said that when the Computer Misuse Act was written in 1990, not many people were doing cyber-security work. You attested that the criminalisation element was negative for a number of reasons. Obviously, since then, a private sector has grown up in this area. I am struggling to marry those two pieces of information together. Can you give us an impression of other jurisdictions and of international comparators where things may be different, and whether they have been able to get ahead of us in building a more thriving sector? Are we particularly lagging behind in the OECD? Are other countries ahead of us because they do not have the measures we do?

Professor John Child: That is a good question. It is certainly fair to say that all jurisdictions are somewhat in flux about how to deal with cyber threats, which are mushrooming in ways people would not have expected—certainly not in 1990, but even many years after.

The various international conventions—the OECD, the Budapest convention and so on—require regulation and criminalisation, but those are not nearly as wide as the blanket approach that was taken in this country. Some comparative civil law jurisdictions in the rest of Europe start from a slightly different place, in that they did not necessarily take the maximalist approach to criminalisation we did.

In a number of jurisdictions, you do not have direct criminalisation of all activities, regardless of the intention of the actor, in the same way that we do. So we are starting from a slightly different position. Having said that, we do see a number of jurisdictions making positive strides in this direction, because they need to; indeed, we see that at European Union level as well, where directives are being created to target this area of concern.

There are a few examples. We wrote a comparative report, incidentally, which is openly available. In terms of some highlights from that, there is a provision in French law, for example, where, despite mandatory prosecution being the general model within French criminal law, there is a carve-out relating to cyber-security and legitimate actors, where there is not the same requirement to prosecute. In the Netherlands, there was a scandal around hacking of keycards for public transport. That was done for responsible reasons, and there was a backlash in relation to prosecution there. There were measures taken in terms of prosecutorial discretion. Most recently, in Portugal, we saw a specific cyber-security defence created within the criminal law just last year.

In the US, it varies between states. In a lot of states, you have quite an unhelpful debate between minimalist and maximalist positions, where they either want to have complete hack-back on the one hand or no action at all on the other, but you have a slightly more tolerant regime in terms of prosecution.

So there are varying degrees, but certainly that is the direction of travel. For sensible, criminal law reasons that I would speak to, as well as the commercial benefits that come with a sector that is allowed to do its work properly, and the security benefits, that is certainly the direction of travel.

--- Later in debate ---
David Chadwick

Q Thank you for joining us. Reporting of several recent cyber-attacks has one thing in common: there were often insufficient security measures in place. British Airways in 2018 is just one example. Reportedly, the average tenure of a chief information security officer is 18 months. From your perspective, what do CISOs need from the Bill to help strengthen their hand when they are saying to a board, “This is what I need to do to keep our organisation secure”?

Richard Starnes: On what you say about the 18-month tenure, one of the problems is stress. A lot of CISOs are burning out and moving to companies that they consider to have boards that are more receptive to what they do for a living. Some companies get it. Some companies support their CISOs, and maybe have them reporting at a level parallel to the CIO, or chief information officer. A big discussion among CISOs is that having a CISO reporting to a CIO is a conflict of interest. A CISO is essentially a governance position, so you wind up having to govern your boss, which I would submit is a bit of a challenge.

How do we help CISOs? First, with stringent application of regulatory instruments. We should also look at or discuss the idea of having C-level or board-level executives specifically liable for not doing proper risk governance of cyber-security—that is something that I think needs to be discussed. Section 172 of the Companies Act 2006 states that you must act in the best interests of your company. In this day and age, I would submit that not addressing cyber-risk is a direct attack on your bottom line.

Dr Gardner

Q You have answered the question I was about to ask. I may ask an addendum to that, but first I want to clarify something. If you put liability on an individual board member, that is going to cause problems. Do you think that there should be a statutory responsibility for the company to have a board member responsible for cyber-risk, and that the responsibility and accountability should sit at company level?

Richard Starnes: I think this should flow from the board to the C-level executives. Most boards have a risk committee of some sort, and I think the chair of the risk committee would be a natural place for that responsibility to sit, but there has to be somebody who is ultimately responsible. If the board does not take it seriously, the C-levels will not, and if the C-levels will not, the rest of the company will not.

Dr Gardner

Q You mentioned stringent application of the regulatory regime. Could you explain the reasons for the lack of enforcement under the current NIS guidelines? Do you feel that the regulatory regime should be streamlined?

Richard Starnes: That is a very broad question.

Dr Gardner

I know, sorry. I collapsed it down from quite a few.

Richard Starnes: There is any number of different reasons. You have 12 competent authorities, at last count, with varying funding models and access to talent. Those could vary quite a bit, depending on those factors. I am not really sure how to answer that question.

Dr Gardner

Q I am just thinking that if you are putting liability on someone, you need to make sure that they can apply the regulation in a simple and effective manner and ensure that it is enforced, so they do not carry the full burden of liability.

Richard Starnes: True, but I would submit that under the Companies Act that liability is already there for all the directors; it just has not been used that way.

Emily Darlington

Q I note your interest in how the Bill will affect smaller businesses. There is not much detail in the Bill, but how do you think the code of practice could create an environment that lifts everyone’s security up without prescribing too great a burden?

Richard Starnes: You just stepped on one of my soapbox issues. I would like to see the code of practice become part of the annual Companies House registrations for every registered company. To me, this is an attestation that, “We understand cyber-security, we’ve had it put in front of us, and we have to address it in some way.”

One of the biggest problems, which Andy talked about earlier, is that we have all these wonderful things that the Government are doing with regard to cyber-security, down to the micro-level companies, but there are 5.5 million companies in the United Kingdom that are not enterprise-level companies, and the vast majority of them have 25 employees or fewer. How do we get to these people and say, “This is important. You need to look at this”? This is a societal issue. The code of practice and having it registered through Companies House are the way to do that. We need to start small and move big. Only 3% of businesses are involved in Cyber Essentials, which is just that: the essentials. It is the baseline, so we need to start there.

--- Later in debate ---
Chris Vince

Q Thank you for joining us remotely from Scotland. I have a question for Stewart about data protection. In my Harlow constituency we have just got a new electronic patient registration scheme; what risks do you see in the increased use of technology like that in the NHS? Does the Bill help to address some of the risks?

Stewart Whyte: Anything that increases or improves our processes in the NHS for a lot of the procured services that we take in, and anything that is going to strengthen the framework between the health board or health service and the suppliers, is welcome for me. One of our problems in the NHS is that the systems we put in are becoming more and more complex. Being able to risk assess them against a particular framework would certainly help from our perspective. A lot of our suppliers, and a lot of our systems and processes, are procured from elsewhere, so we are looking for anything at all within the health service that will improve the process and the links with third party service providers.

Dr Gardner

Q I am interested in who you report to should you identify a cyber-incident. I am talking about not just data breaches but wider ones that can affect operational systems. Which regulators do you deal with? If it is multiple regulators, do you feel there is a case for having one distinct regulator to cover cyber-resilience and manage that quite difficult landscape?

Brian Miller: That is a great question. I will touch on a few different parts, because I might have slightly different information from what you have heard previously. On reporting—Stewart will deal with the data protection element of reporting into the Information Commissioner’s Office—we report to the Scottish Health Competent Authority. It is important that we have an excellent relationship with the people there. To put that in context, I was speaking to them yesterday regarding our transition to the CAF, as part of our new compliance for NHS Greater Glasgow and Clyde. If there was a reportable incident, we would report into the SHCA. The thresholds are really well defined against the confidentiality, integrity and availability triad—it will be patient impact and stuff like that.

Organisationally, we report up the chain to our director of digital services, and we have an information governance steering group. Our senior information risk officer is the director of digital, and the chief information security officer role sits with our director of digital. We report nationally, and we work really closely with National Services Scotland’s Cyber Security Centre of Excellence, which does a lot of our threat protection and secure operations, 24/7, 365 days a year. We work with the Scottish Government through the Scottish Cyber Co-ordination Centre and what are called CREW—cyber resilience early warning—notices for a lot of threat intelligence. If something met the threshold, we would report to the SHCA. Stewart, do you want to come in on the data protection officer?

Stewart Whyte: We would report to the Information Commissioner, and within 72 hours we also report to the Scottish Government information governance and data protection team. We would risk assess the breaches and determine whether they meet the threshold for reporting. Not every data breach is required to be reported.

From the reporting perspective, it would be helpful to report into one individual organisation. I noticed that in the reporting requirements we are looking at doing it within 24 hours, which could be quite difficult, because sometimes we do not know everything about the breach within that time. We might need more information to be able to risk assess it appropriately. Making regulators aware of the breach as soon as possible is always going to be a good thing.

Lincoln Jopp

Q To come back to Dr Spencer’s original question about the scope of the legislation, the current situation, as I understand it, is that there is a carve-out for small and medium-sized enterprises because we do not want to put too much regulatory burden on them, but, under the new proposed legislation, operators of essential services that are SMEs will be designated by their regulator. That brings us back to the question of which regulator that would be. Do you currently use that designation for operators of essential services, or would you have to do a piece of work, presumably looking at a number of different regulators’ points of view, to designate the operators of essential services?

Brian Miller: We would work with the Scottish Health Competent Authority as our regulator; I cannot speak for other regulators and what that might look like. We are doing work on what assurance for critical suppliers outside the Bill looks like just now, and we are working across the boards in Scotland on identifying critical suppliers. Outside of that, for any suppliers or any new services, we will assess the risk individually, based on the services they are providing.

The Bill is really valuable for me, particularly when it comes to managed service provision. One of the questions I was looking at is: what has changed since 2018? The biggest change for me is that identity has gone to the cloud, because of video conferencing and stuff like that. When identity went to the cloud, it then involved managed service providers and data centres. We have put additional controls around that, because the network perimeter extended out into the cloud. We might want to take advantage of those controls for new things that come online, integrating with national identity, but we need to be assured that the companies integrating with national identity are safe. For me, the Bill will be a terrific bit of legislation that will help me with that—if that makes sense.

--- Later in debate ---
The Chair

You don’t? Okay, I call Allison Gardner.

Dr Gardner

Q I have loads. Before I come to the question I was going to ask, I want to pick you up on the worry about information sharing. I have worked across regulators, and they seemed to be really confident about information sharing, but I know that is not always the case. There is some protection of turf, and other Acts might prohibit that information sharing. Could you expand on that area of concern? What would be your recommendation?

Carla Baker: My comment on information sharing was about what else the Government could do. It was not necessarily specifically to do with the Bill. If you want me to elaborate on the wider issue of information sharing, I am happy to.

Dr Gardner

Particularly between regulators, and how that would work.

Carla Baker: I cannot necessarily talk in much detail about information sharing across regulators; I can talk more about information sharing across the technology industry.

Dr Gardner

Q Okay, I am glad I clarified, because that is quite interesting.

I will ask my actual question; I am trying to get my head around this. You recommend mandating that company boards be accountable for mitigating cyber-risks. As we know from the annual Cyber Security Breaches Survey, board responsibility for cyber has declined in recent years, which links to whether there should be a statutory duty. I am a little worried about small and microbusinesses having to deal with that regulatory burden, especially if they are designated as critical suppliers. I am trying to marry those two things together, along with the concern about where liability sits, because we are very dependent on service providers. I do not know if that makes any sense to you, but could you clarify my thinking?

Chris Parker: It is a concern. I will start with a small point about a statutory requirement, certainly for large companies. I personally believe—and I am pretty sure that most industry people I speak to would say this—that it would be very surprising if we did not have cyber-focused people on boards and in wider governance, as we would in a financial services company, where people who are expert in financial risk are able to govern appropriately. As we get smaller and smaller in scale, that is much harder to do.

The good news is that there are some brilliant—and I really mean that—resources available from probably the most underused website in the world, but the best one, which is the National Cyber Security Centre website. It has some outstanding advice for boards and governance on there. You can effectively make a pack and write a checklist, even if you are a very small company with a board of two people, and go through your own things and make sure your checklists are there.

The data and the capability are there to give support. Whether it is signposted enough, and whether we are helping at a local level to make sure that people are aware of those things, is perhaps something we could do better in this country. But I am sure that industry will do its part, as we do, to share and reinforce resources like that website, to guide good governance for SMEs especially.

Carla Baker: That board-level accountability is really important, and it is crucial for cyber-security. I think it is getting better—from the senior execs that I speak to in industry, there is more understanding—but generally speaking, there is a view that cyber-security is an IT issue, not a business issue. I am sure you have heard throughout the day about understanding the risks we have seen around vulnerabilities, and the incidents that have affected the retail or manufacturing sectors. Those are substantial incidents that have impacted the UK and have systemic knock-on effects. Organisations have to understand the serious nature of cyber-security, and therefore put more emphasis on cyber at the board level.

Should we be mandating board-level governance? That is a useful question for this debate to seek information and input on, but the burden on SMEs has to be risk-based and proportionate, however it is framed.

Dr Gardner

Q Very quickly—I apologise if I am taking too much time—accountability is slightly different from liability. In the case of a cyber-breach that has caused harm, where would you see the liability lying?

Chris Parker: That is a harder question. There is precedent here. We can think back to the precedents that this great building has set: after the Clapham train disaster, the Corporate Manslaughter and Corporate Homicide Act 2007 put responsibility very firmly on boards, evolving from the Health and Safety at Work etc. Act 1974. We are not there yet, but do not forget that we are starting to legislate, as is everyone else in Europe and America on this journey.

I believe that we will see a requirement at some point in the future. We all hope that the requirement is not driven by something terrible, but by a sensible, wise methodology through which we can work out how to ensure that people are liable and accept their liability. We have seen health and safety statements from CEOs put up at every office in this country, for good reason, and that sort of evolution may well be the next phase.

Carla and I talk about this a lot, but we have to be careful about how much we put into this Bill. We have to get the important bit about critical national infrastructure under way, and then we can address it all collaboratively at the next stage to deal with very important issues such as that.

Lincoln Jopp

Q I want to come back to that point. Chris, you said something like, “SMEs find it very difficult, if not impossible, to bear the regulatory burden, so we have to be very careful when designating SMEs as operators of essential services.” To me, that says that you think the Bill, as currently drafted, will place too much of a regulatory burden on SMEs. Is that correct?

Chris Parker: I was referring to strategic and critical suppliers, which is a list of Government suppliers. Our point is that the level of governance and regulatory requirement inside an organisation is demanding, and it really is. It requires quite a lot of work and resource, and if we put that on to too small a supplier, on the basis that we think it is on the critical path, I would advocate a different system for risk management of that organisation, rather than bringing it into the regulatory scope of a cyber-resilience Bill. The critical suppliers should be the larger companies. If we start that way in legislation and then work down—the Bill is designed to be flexible, which is excellent—we can try to get there.

As a last point on flexibility—this is perhaps very obvious to us but less so to people who are less aware of the Bill—there is a huge dynamic going on here where you have a continuum, a line, at one end of which you have the need for clarity, which comes from business. At the other you have a need for flexibility, which quite rightly comes from the Government, who want to adjust and adapt quite quickly to secure the population, society and the economy against a changing threat. That continuum has an opposing dynamic, so the CRB has a big challenge. We must therefore not be too hard on ourselves in finding exactly where to be on that line. Some things will go well, and some will just need to be looked at after a few years of practice—I really believe that. We are not going to get it all right, because of the complexities and different dynamics along that line.

Carla Baker: This debate about whether SMEs should be involved or regulated in this space has been around since we were discussing GDPR back in 2018. It comes down to the systemic nature of the supplier. You can look at the designation of critical dependencies. I am sure you have talked about this, but for example, an SME software company selling to an energy company could be deemed a critical supplier by a regulator, and it is then brought into scope. However, I think it should be the SMEs that are relevant to the whole sector, not just to one organisation. If they are systemic and integral to a number of different sectors, or a number of different organisations within a sector, it is fair enough that they are potentially brought into scope.

It is that risk-based approach again. But if it is just one supplier, one SME, that is selling to one energy company up in the north of England, is it risk-based and proportionate that they are brought into scope? I think that is debatable.

--- Later in debate ---
The Chair

Dr Allison Gardner, you have two minutes.

Dr Gardner

Q I will be quick. Much of my question was already asked. I will just say that proportionality is a known principle within regulation and I take that into account. I want to push on an issue that was raised. When you are dealing with different regulators with a cross-regulatory theme, you often get conflicting guidelines. It is a big headache for people. Again, you get the gaps and the duplication. To ensure my understanding, who will oversee making sure that the regulators align with each other to make it easier for people working within the sectors? Otherwise, they will go to one regulator and it will say one thing, and another will say another thing.

Kanishka Narayan: It is an important point. We know that the quality of current regulation for cyber-security varies across regulators. As an earlier panellist said, there is virtue in the fact that we have not set an effective cap on where regulators can go by having a single standard. At the same time, we need to make sure that we are raising a consistent floor of quality and proportionality judgments.

First, there is obviously constant oversight of each regulator through the lead Departments. In my case, for example, we consistently engage with Ofcom on a range of areas, including this one, to ensure the quality of regulation and that proportionality judgment is appropriately applied. Secondly, there is a clear commitment in the Bill for the Secretary of State to report back, on a five-year basis, on the overall implementation of the regime proposed in the Bill. That will be when we can get a global view of how the whole system is working.

The Chair

That brings us to the end of the time allotted for the Committee to ask questions, and to the end of the sitting. On behalf of the Committee, I thank the Minister for his evidence.

Ordered, That further consideration be now adjourned. —(Taiwo Owatemi.)