Cyber Security and Resilience (Network and Information Systems) Bill (Second sitting) Debate

Department: Department for Science, Innovation & Technology
David Chadwick

Q Do you know how you would do that information sharing at the moment?

Ian Hulme: As we have already explained, the current regs do not allow us to share the information, which is a bit of a barrier for us. In the future, certainly, we will be working together to try to figure it out. I think that there is also a role for DSIT in that.

Natalie Black: First, we currently have a real problem in that information sharing is much harder than it should be. The Bill makes a big difference in addressing that point, not only among ourselves but with DSIT and NCSC. Secondly, we think that there is an opportunity to improve information reporting, particularly incident reporting, and we would welcome working with DSIT and others—I have mentioned the Digital Regulation Cooperation Forum—to help us find a way to make it easier for industry, because the pace at which we need to move means that we want to ensure that there is no unnecessary rub in the system.

Emily Darlington (Milton Keynes Central) (Lab)

Q I have a question for Ian Hulme. In your role at the ICO, you are clearly looking at data security. Data is obviously one of the main goals of cyber-attacks. Data issues cut across every sector, and you are looking at a really broad range of data, from individual identifiers to names, addresses, bank accounts or whatever it might be. This could happen in any sector. How does the Bill give you additional powers to take action, particularly on attacks co-ordinated through AI or by foreign actors, and do you think it is sufficient for what you feel we will be facing in the next five years?

Ian Hulme: We need to think about this as essentially two different regimes. The requirements under data protection legislation to report a data breach are well established, and we have teams, systems and processes that manage all that. There are some notable cases that have been in the public domain in recent months where we have levied fines against organisations for data breaches.

The first thing to realise is that we are still talking about only quite a small sub-sector—digital service providers, including cloud computing service providers, online marketplaces, search engines and, when they are eventually brought into scope, MSPs. A lot of MSPs will provide services for a lot of data controllers, so, as I explained, if you have resilient and secure information networks, that should help to make data more secure in the future.

Lincoln Jopp (Spelthorne) (Con)

Q One of my favourite aphorisms is, “Institutions get the behaviours they reward.” We heard a plea from Amazon Web Services this morning that, when a regulator deals with a company in the event of a cyber-security attack, it should remember that it is dealing with a victim.

I have dealt with the ICO before. Maybe it was the company that I worked in and led, but there was a culture there that, if you had a data breach, you told the ICO. There was no question about it. How are you going to develop your responses and the behaviours you reward in order to encourage a culture of openness within the corporate sector, bearing in mind that, as was said this morning, by opening that door, companies could be opening themselves up to a hefty fine?

Stuart Okin: In the energy sector, we have that culture. It is one of safety and security, and the chief executives and the heads of security really lean into it and understand that particular space. There are many different forums where they communicate and share that type of information with each other and with us. Incident response is really the purview of DESNZ rather than us, but they will speak to us about that from a regulatory perspective.

Ian Hulme: From the ICO’s perspective, we receive hundreds of data-breach reports. The vast majority of those are dealt with through information and guidance to the impacted organisation. Only a very small number go through to enforcement activity, and only in the most serious cases, where the failures are so egregious that, from a regulatory perspective, it would be a failure on our part not to take action.

I anticipate that is the approach we will take in the future when dealing with the incident reporting regime that the Bill sets out. Our first instinct would be to collaborate with organisations. Only in the most egregious cases would I imagine that we would look to exercise the full range of our powers.

Natalie Black: From Ofcom’s point of view, we have a long history, particularly in the telecoms sector, of dealing with a whole range of incidents, but I certainly hear your point about the victim. When I have personally dealt with some of these incidents, often you are dealing with a chief executive who has woken up that morning to the fact that they might lose their job and they have very stressed-out teams around them. It is always hard to trust the initial information that is coming out because no one really knows what is going on, certainly for the first few hours, so it is the maturity and experience that we would want to bring to this expanded role when it comes to data centres.

Ultimately, the best regulatory relationships I have seen are those where there is a lot of trust and openness that the regulator is not going to overreact, is really going to understand what is going on and is very purposeful about what it is trying to achieve. From Ofcom’s point of view, it is always about protecting consumers and citizens, particularly with one eye on security, resilience and economic growth. The experience we have had over the years means that we can come to those conversations with a lot of history, a lot of perspective and, to be honest, a bit of sympathy, because sometimes those moments are very difficult for everyone involved.

--- Later in debate ---
Tim Roca

Q From the other perspective—I am thinking about a UK Government in the future overreaching—do you think there is any risk from this legislation?

Chung Ching Kwong: It is always a double-edged sword when it comes to regulating against threats. The more that the Secretary of State or the Government are allowed to go into systems and hold powers to turn off, or take over, certain things, the more there is a risk that those powers will be abused, to a certain extent, or cause harm unintentionally. There is always a balance to be struck between giving more protection to privacy for ordinary users and giving power to the Government so that they can act. Obviously, for critical infrastructure like the power grid and water, the Government need control over those things, but for communications and so on, there is, to a certain extent, a question about what the Government can and cannot do. But personally I do not see a lot of concerns in the Bill.

Emily Darlington

Q I want to move from software to hardware that is especially vulnerable to potential cyber-attack, particularly through the integration of Chinese tech into SIPs, which could make them vulnerable to attack by someone who knows the code in those bits of hardware. Should we be doing more to protect against that vulnerability? Should that be covered by the Bill?

Chung Ching Kwong: It should definitely be covered by the Bill, because if we are not regulating to protect hardware as well, we will get hardware that is already embedded with, for example, an opcode attack. One example in the context of China is the Lenovo Superfish scandal in 2015, in which pre-installed ad software hijacked the HTTPS certificate that is there to protect your communication with a website, so that nobody can see what activity is happening between you and the website. The Superfish injection made that communication visible, and it was done before the product even came out of the factory. This is not a problem that a software solution can fix. If you were sourcing a Lenovo laptop, for example, the laptop, upon arrival, would be a security breach, and a privacy breach in that sense. We should definitely take it a step further and regulate hardware as well, because a lot of the time that is what state-sponsored attacks target as an attack surface.

The Chair

That brings us nicely to the end of the time allotted for the Committee to ask questions. On behalf of the Committee, I thank our witness for her evidence.

Examination of Witness

Professor John Child gave evidence.

--- Later in debate ---
Dr Allison Gardner

Q I am just thinking that if you are putting liability on someone, you need to make sure that they can apply the regulation in a simple and effective manner and ensure that it is enforced, so they do not carry the full burden of liability.

Richard Starnes: True, but I would submit that under the Companies Act that liability is already there for all the directors; it just has not been used that way.

Emily Darlington

Q I note your interest in how the Bill will affect smaller businesses. There is not much detail in the Bill, but how do you think the code of practice could create an environment that lifts everyone’s security without imposing too great a burden?

Richard Starnes: You just stepped on one of my soapbox issues. I would like to see the code of practice become part of the annual Companies House registrations for every registered company. To me, this is an attestation that, “We understand cyber-security, we’ve had it put in front of us, and we have to address it in some way.”

One of the biggest problems, which Andy talked about earlier, is that we have all these wonderful things that the Government are doing with regard to cyber-security, down to the micro-level companies, but there are 5.5 million companies in the United Kingdom that are not enterprise-level companies, and the vast majority of them have 25 employees or fewer. How do we get to these people and say, “This is important. You need to look at this”? This is a societal issue. The code of practice and having it registered through Companies House are the way to do that. We need to start small and move big. Only 3% of businesses are involved in Cyber Essentials, which is just that: the essentials. It is the baseline, so we need to start there.

David Chadwick

Q We have heard concerns about definitions, particularly regarding incident reporting. What are your observations on the Bill as it stands, and on those definitions?

Richard Starnes: Throughout my career, I have been involved in cyber incidents from just about day one. One of the biggest problems that you run into in the first 72 hours, for example, is actually determining whether you have been breached. Just because it looks bad does not mean it is bad. More often than not, you have had indicators of compromise, and you have gone through the entire chain, which has taken you a day, or maybe two or three days, of very diligent work with very clever people, to determine that, no, you have not been breached; it was a false positive that was difficult to track down. Do you want to open the door to a regulator coming in and then finding out it is a false positive?

You are also going to have a very significant problem with the number of alerts that you get with a 24-hour notification requirement, because there is going to be an air of caution, particularly with new legislation. Everybody and his brother is going to be saying, “We think we’ve got a problem.” Alternatively, if they do not, then you have a different issue.