Public Bill Committees
The Parliamentary Under-Secretary of State for Science, Innovation and Technology (Kanishka Narayan)
Q
Jen Ellis: Again, that is a hugely complex question to cover in a short amount of time. One of the challenges that we face in the UK is that we are a 99% small and medium-sized business economy. It is hard to think about how to place more burdens on small and medium businesses, what they can reasonably get done and what resources are available. That said, that is the problem that we have to deal with; we have to figure out how to make progress.
There is also a challenge here, in that we tend to focus a lot on the behaviour of the victim. It is understandable why—that is the side that we can control—but we are missing the middle piece. There are the bad guys, whom we cannot control but whom we can try to prosecute and take to task; and there are the victims, whom we can control, and we focus a lot on that—CSRB focuses on that side. Then there is the middle ground of enablers. They are not intending to be enablers, but they are the people who are creating the platforms, mediums and technology. I am not sure that we are where we could be in thinking about how to set a baseline for them. We have a lot of voluntary codes, which is fantastic—that is a really good starting point—but it is about the value of the voluntary approach and how much it requires behavioural change. What you see is that the organisations that are already doing well and taking security seriously are following the voluntary codes because they were already investing, but there is a really long tail of organisations that are not.
Any policy approach, legislation or otherwise, comes down to the fact that you can build the best thing in the world, but you need a plan for adoption or the engagement piece—what it looks like to go into communities and see how people are wrestling with this stuff and the challenges that are blocking adoption. You also need to think about how to address and remove those challenges, and, where necessary, how to ensure appropriate enforcement, accountability and transparency. That is critical, and I am not sure that we see a huge amount of that at the moment. That is an area where there is potential for growth.
With CSRB, the piece around enforcement is going to be critical, and not just for the covered entities. We are also giving new authorities to the regulators, so what are we doing to say to them, “We expect you to use them, to be accountable for using them and to demonstrate that your sector is improving”? There need to be stronger conversations about what it looks like not to meet the requirements. We should be looking more broadly, beyond just telling small companies to do more. If we are going to tell small companies to do more, how do we make it something that they can prioritise, care about and take seriously, in the same way that health and safety is taken seriously?
David Cook: To achieve the outcome in question, which is about the practicalities of a supply chain where smaller entities are relying on it, I can see the benefit of bringing those small entities in scope, but there could be something rather more forthright in the legislation on how the supply chain is dealt with on a contractual basis. In reality, we see that when a smaller entity tries to contract with a much larger entity—an IT outsourced provider, for example—it may find pushback if the contractual terms that it asks for would help it but are not required under legislation.
Where an organisation can rely on the GDPR, which has very specific requirements as to what contracts should contain, or the Digital Operational Resilience Act, which is a European financial services law and is very prescriptive as to what a contract must contain, any entity doing deals and entering into a contract cannot really push back, because the requirements are set in stone. The Bill does not have a similar requirement as to what a contract with providers might look like.
Pushing that requirement into the negotiation between, for example, a massive global IT outsourced provider and a much smaller entity means either that we will see piecemeal clauses that do not always achieve the outcomes you are after, or that we will not see those clauses in place at all because of the commercial reality. Having a similarly prescriptive set of requirements for what that contract would contain means that anybody negotiating could point to the law and say, “We have to have this in place, and there’s no wriggle room.” That would achieve the outcome you are after: those small entities would all have identical contracts, at least as a baseline.
Emily Darlington (Milton Keynes Central) (Lab)
Q
David Cook: The original NIS regulations came out of a directive from 2016, so this is 10 years old now, and the world changes quickly, especially when it comes to technology. Not only is this supply chain vulnerability systemic, but it causes a significant risk to UK and global businesses. Ransomware groups, threat actors or cyber-criminals—however you want to badge that—are looking for a one-to-many model. Rather than going after each organisation piecemeal, if they can find a route through one organisation that leads to millions, they will always follow it. At the moment, they are out of scope.
The reality is that those organisations, which are global in nature, often do not pay due regard to UK law because they are acting all over the world and we are one of many jurisdictions. They are the threat vector that is allowing an attack into an organisation, but it then sits with the organisations that are attacked to deal with the fallout. Often, although they do not get away scot-free, they are outside legislative scrutiny and can carry on operating as they did before. That causes a vulnerability. The one-to-many attack route is a vulnerability, and at the moment the law is lacking in how it is equipped to deal with the fallout.
Jen Ellis: In terms of what the landscape looks like, our dialogue often has a huge focus on cyber-crime and we look a lot at data protection and that kind of thing. Last year, we saw the impact of disruptive attacks, but in the past few years we have also heard a lot more about state-sponsored attacks.
I do not know how familiar everyone in the room is with Volt Typhoon and Salt Typhoon; they were widespread nation-state attacks that were uncovered in the US. We are not immune to such attacks; we could just as easily fall victim to them. We should take the discovery of Volt Typhoon as a massive wake-up call to the fact that although we are aware of the challenge, we are not moving fast enough to address it. Volt Typhoon particularly targeted US critical infrastructure, with a view to being able to massively disrupt it at scale should a reason to do so arise. We cannot have that level of disruption across our society; the impacts would be catastrophic.
Part of what the CSRB is looking to do is to take NIS and update it to make sure that it covers the relevant things, but I also hope that we will see a new level of urgency and an understanding that the risks are very prevalent and are coming from different sources with all sorts of different motivations. There is huge complexity, which David has spoken to, around the supply chain. We really need to see the critical infrastructure and the core service providers becoming hugely more vigilant and taking their role as providers of a critical service very seriously when it comes to security. They need to think about what they are doing to be part of the solution and to harden and protect the UK against outside interference.
David Cook: By way of example, NIS1 talks about reporting to the regulator if there is a significant impact. What we are seeing with some of the attacks that Jen has spoken about is pre-positioning, whereby a criminal or a threat actor sits on the network and the environment and waits for the day when they are going to push the big red button and cause an attack. That is outside NIS1: if that sort of issue were identified, it would not be reportable to the regulator. The regulator would therefore not have any visibility of it.
NIS2 and the Bill talk about something being identified that is caused by or is capable of causing severe operational disruption. It widens the ambit of visibility and allows the UK state, as well as regulators, to understand what is going on in the environment more broadly, because if there are trends—if a number of organisations report to a regulator that they have found that pre-positioning—they know that a malicious actor is planning something. The footprints are there.
Freddie van Mierlo (Henley and Thame) (LD)
Q
Jen Ellis: You have covered a lot of territory there; I will try to break it down. If you look at the attacks last year, all the companies you mentioned were investing in cyber-security. There is a difficulty here, because there is no such thing as being bullet-proof or secure. You are always trying to raise the barriers as high as you can and make it harder for attackers to be successful. The three attacks you mentioned were highly targeted attacks. The example of Volt Typhoon in the US was also highly targeted. These are attackers who are highly motivated to go after specific entities and who will keep going until they get somewhere. It is really hard to defend against stuff like that. What you are trying to do is remove the chances of all the opportunistic stuff happening.
So, first, we are not going to become secure as such, but we are trying to minimise the risk as much as possible. Secondly, it is really complex to do; last year we saw examples of companies that, even though they had invested, still missed some things. Even in the discussions that they had had around cyber-insurance, they had massively underestimated the cost of the level of disruption that they experienced. Part of it is that we are still trying to figure out how things will happen, what the impacts will be and what that will look like in the long term.
There is also a long tail of companies that are not investing, or not investing enough. Hopefully, this legislation will help with that, but more importantly, you want to see regulators engaging on the issue, talking to the entities they cover and going on a journey with them to understand what the risks are and where they need to get to. If you are talking about critical providers and essential services, it is really hard for an organisation—in its own mind or in being answerable to its board or investors—to justify spend on cyber-security. If you are a hospital saying that you are putting money towards security programmes rather than beds or diagnostics, that is an incredibly difficult conversation to have. One of the good things about CSRB, hopefully, is that it will legitimise choices and conversations in which people say, “Investing time and resources into cyber-security is investing time and resources into providing a critical, essential service, and it is okay to make those pay-off choices—they have to be made.”
Part of it is that when you are running an organisation, it is so hard to think about all the different elements. The problem with cyber-security—we need to be clear about this—is that with a lot of things that we ask organisations to do, you say, “You have to make this investment to get to this point,” and then you move on. So they might take a loan, the Government might help them in some way, or they might deprioritise other spending for a set period so that they can go and invest in something, get up to date on something or build out something; then they are done, and they can move back to a normal operating state.
Security is not that. It is expensive, complex and multifaceted. We are asking organisations of all sizes in the UK, many of which are not large, to invest in perpetuity. We are asking them to increase investment over time and build maturity. That is not a small ask, so we need to understand that there are very reasonable dynamics at play here that mean that we are not where we need to be. At the same time, we need a lot more urgency and focus. It is really important to get the regulators engaged; get them to prioritise this; have them work with their sectors, bring their sectors along and build that maturity; and legitimise the investment of time and resources for critical infrastructure.
Chris Vince
Q
Matt Houlihan: I am very happy to. Two main comparators come to mind. One is the EU, and we have talked quite a bit about NIS2 and the progress that it has made. NIS2 does take a slightly different approach from that of the UK Government, in that it outlines, I think, 18 different sectors, up from seven under NIS1, so there is that wide scope in terms of NIS2.
Although NIS2 is an effective piece of legislation, its implementation remains patchy across the EU. Something like 19 of the 27 EU member states have implemented it to date in their national laws, so there is clearly a bit of work still to do there. There is also some variation in how NIS2 is being implemented, which we feel as an international company operating right across the European Union. As has been touched on briefly, there is now a move, through what are called omnibus proposals, to simplify the reporting requirements and other elements of cyber-security and privacy laws across the EU, which is a welcome step.
I mentioned in a previous answer the work that Australia has been doing, and the Security of Critical Infrastructure Act 2018—SOCI—was genuinely a good standard and has set a good bar for expectations around the world. The Act has rigorous reporting requirements and caveats and guardrails for Government step-in powers. It also covers things like ransomware, which we know the UK Home Office is looking at, and Internet of Things security, which the UK Government recently looked at. Those are probably the two comparators. We hope that the CSRB will take the UK a big step towards that, but as a lot of my colleagues have said, there is a lot of work to do in terms of seeing the guidance and ensuring that it is implemented effectively.
Chris Anley: On the point about where we are perhaps falling behind: on streamlining of reporting, we have already mentioned Australia and the EU, where that work is in progress. On protection of defenders, other territories are already benefiting from those protections—the EU, the US and, as I mentioned, Portugal especially. As a third and final point, Australia is an interesting one, as it is providing a cyber-safety net to small and medium-sized enterprises, offering cyber expertise from the Government to enable smaller entities to get up to code and achieve resilience where they lack the personnel and funding.
Emily Darlington
Q
Dr Ian Levy: The previous set of witnesses talked about board responsibility around cyber-security. In my experience, whether a board is engaged or not is a proxy indicator for whether they are looking at risk management properly, and you cannot change corporate culture through regulation—not quickly. There is something to be done around incentives to ensure that companies are really looking at their responsibilities across cyber-security. As the previous panellists have said, this is not just a technical thing.
One of the things that is difficult to reconcile in my head—and always has been—is trying to levy national security requirements on companies that are not set up to do that. In this case I am not talking about Amazon Web Services, because AWS invests hugely in security. We have a default design principle around ensuring that the services are secure and private by design. But something to consider for the Bill is not accidentally putting national security requirements on those entities that cannot possibly meet them.
When I was in government, in the past we accidentally required tiny entities, which could not possibly do so, to defend themselves against the Russians in cyber-space. If you translate that to any other domain—for example, saying that a 10-person company should defend itself against Russian missiles—it is insane, yet we do it in cyber-space. Part of the flow-down requirements that we see for contracting, when there is a Bill like this one, ends up putting those national security requirements on inappropriate entities. I really think we need to be careful how we manage that.
Matt Houlihan: Can I make two very quick points?
The Chair
Very briefly—yes.
Matt Houlihan: My first point is on the scale of the challenge. In Cisco’s own research, we released a cyber-security readiness index, a survey of 8,000 companies around the world, including in the UK, in which we graded companies by their cyber maturity. In the UK, 8% of companies—these are large companies—were in the mature bracket, which shows the scale of the challenge.
The other point I want to make relates to its being a cyber-security and resilience Bill, and the “resilience” bit is really important. We need to focus on what that means in practice. There are a lot of cyber measures that we need to put in place, but resilience is about the robustness of the technology being used, as well as the cyber-security measures, the people and everything else that goes with it. Looking at legacy technology, for example—obsolete technology, which is more at risk—should also be part of the standards and, perhaps, the regulatory guidance that is coming through. I know that the public sector is not part of the Bill, but I mention the following to highlight the challenge: over a year ago, DSIT published a report that showed, I think, that 28% of Government systems were in the legacy, unsupported, obsolete bracket. That highlights the nature of the challenge in this space.
David Chadwick
Q
Ian Hulme: As we have already explained, the current regs do not allow us to share the information, which is a bit of a barrier for us. In the future, certainly, we will be working together to try to figure it out. I think that there is also a role for DSIT in that.
Natalie Black: First, we currently have a real problem in that information sharing is much harder than it should be. The Bill makes a big difference in addressing that point, not only among ourselves but with DSIT and NCSC. Secondly, we think that there is an opportunity to improve information reporting, particularly incident reporting, and we would welcome working with DSIT and others—I have mentioned the Digital Regulation Cooperation Forum—to help us find a way to make it easier for industry, because the pace at which we need to move means that we want to ensure that there is no unnecessary rub in the system.
Emily Darlington (Milton Keynes Central) (Lab)
Q
Ian Hulme: We need to think about this as essentially two different regimes. The requirements under data protection legislation to report a data breach are well established, and we have teams, systems and processes that manage all that. There are some notable cases that have been in the public domain in recent months where we have levied fines against organisations for data breaches.
The first thing to realise is that we are still talking about only quite a small sub-sector—digital service providers, including cloud computing service providers, online marketplaces, search engines and, when they are eventually brought into scope, MSPs. A lot of MSPs will provide services for a lot of data controllers so, as I explained, if you have resilience and security in information networks, that should help to make data more secure in the future.
Lincoln Jopp (Spelthorne) (Con)
Q
I have dealt with the ICO before. Maybe it was the company that I worked in and led, but there was a culture there that, if you had a data breach, you told the ICO. There was no question about it. How are you going to develop your reactions, and the behaviours you reward, to encourage a culture of openness within the corporate sector, bearing in mind that, as was said this morning, by opening that door, companies could be opening themselves up to a hefty fine?
Stuart Okin: In the energy sector, we have that culture. It is one of safety and security, and the chief executives and the heads of security really lean into it and understand that particular space. There are many different forums where they communicate and share that type of information with each other and with us. Incident response is really the purview of DESNZ rather than us, but they will speak to us about that from a regulatory perspective.
Ian Hulme: From the ICO’s perspective, we receive hundreds of data-breach reports. The vast majority of those are dealt with through information and guidance to the impacted organisation. Only a very small number go through to enforcement activity, and only in the most egregious cases—where failures are so serious that, from a regulatory perspective, it would be a failure on our part not to take action.
I anticipate that that is the approach we will take in the future when dealing with the incident reporting regime that the Bill sets out. Our first instinct would be to collaborate with organisations. Only in the most egregious cases would I imagine that we would look to exercise the full range of our powers.
Natalie Black: From Ofcom’s point of view, we have a long history, particularly in the telecoms sector, of dealing with a whole range of incidents, but I certainly hear your point about the victim. When I have personally dealt with some of these incidents, often you are dealing with a chief executive who has woken up that morning to the fact that they might lose their job and they have very stressed-out teams around them. It is always hard to trust the initial information that is coming out because no one really knows what is going on, certainly for the first few hours, so it is the maturity and experience that we would want to bring to this expanded role when it comes to data centres.
Ultimately, the best regulatory relationships I have seen are those where there is a lot of trust and openness, and confidence that a regulator is not going to overreact. They are really going to understand what is going on and are very purposeful about what they are trying to achieve. From Ofcom’s point of view, it is always about protecting consumers and citizens, particularly with one eye on security, resilience and economic growth. The experience we have had over the years means that we can come to those conversations with a lot of history, a lot of perspective and, to be honest, a bit of sympathy, because sometimes those moments are very difficult for everyone involved.
Tim Roca
Q
Chung Ching Kwong: It is always a double-edged sword when it comes to regulating against threats. The more that the Secretary of State or the Government are allowed to go into systems and hold powers to turn off, or take over, certain things, the more there is a risk that those powers will be abused, to a certain extent, or cause harm unintentionally. There is always a balance to be struck between giving more protection to privacy for ordinary users and giving power to the Government so that they can act. Obviously, for critical infrastructure like the power grid and water, the Government need control over those things, but for communications and so on, there is, to a certain extent, a question about what the Government can and cannot do. But personally I do not see a lot of concerns in the Bill.
Emily Darlington
Q
Chung Ching Kwong: It should definitely be covered by the Bill, because if we are not regulating to protect hardware as well, we will get hardware that is already embedded with, for example, an opcode-level attack. An example in the context of China is the Lenovo Superfish scandal in 2015, in which pre-installed ad software hijacked the HTTPS certificate—the mechanism that protects your communication with a website so that nobody can see the activity between you and the website. The Superfish injection made that communication transparent, and that was done before the product even came out of the factory. This is not a problem that a software solution can fix. If you were sourcing a Lenovo laptop, for example, the laptop, upon arrival, would be a security breach, and a privacy breach in that sense. We should definitely take it a step further and regulate hardware as well, because a lot of the time that is what state-sponsored attacks target as an attack surface.
The Chair
That brings us nicely to the end of the time allotted for the Committee to ask questions. On behalf of the Committee, I thank our witness for her evidence.
Examination of Witness
Professor John Child gave evidence.
Dr Gardner
Q
Richard Starnes: True, but I would submit that under the Companies Act that liability is already there for all the directors; it just has not been used that way.
Emily Darlington
Q
Richard Starnes: You just stepped on one of my soapbox issues. I would like to see the code of practice become part of the annual Companies House registrations for every registered company. To me, this is an attestation that, “We understand cyber-security, we’ve had it put in front of us, and we have to address it in some way.”
One of the biggest problems, which Andy talked about earlier, is that we have all these wonderful things that the Government are doing with regard to cyber-security, down to the micro-level companies, but there are 5.5 million companies in the United Kingdom that are not enterprise-level companies, and the vast majority of them have 25 employees or fewer. How do we get to these people and say, “This is important. You need to look at this”? This is a societal issue. The code of practice and having it registered through Companies House are the way to do that. We need to start small and move big. Only 3% of businesses are involved in Cyber Essentials, which is just that: the essentials. It is the baseline, so we need to start there.
David Chadwick
Q
Richard Starnes: Throughout my career, I have been involved in cyber incidents from just about day one. One of the biggest problems that you run into in the first 72 hours, for example, is actually determining whether you have been breached. Just because it looks bad does not mean it is bad. More often than not, you have had indicators of compromise, and you have gone through the entire chain—which has taken you a day, or maybe two or three days, of very diligent work with very clever people—to determine that, no, you have not been breached; it was a false positive that was difficult to track down. Do you want to open the door to a regulator coming in and then finding out that it was a false positive?
You are also going to have a very significant problem with the number of alerts that you get with a 24-hour notification requirement, because there is going to be an air of caution, particularly with new legislation. Everybody and his brother is going to be saying, “We think we’ve got a problem.” Alternatively, if they do not, then you have a different issue.
Commons Chamber
Emily Darlington (Milton Keynes Central) (Lab)
I thank the Secretary of State for her absolutely clear message that what X is doing, through the use of Grok, is illegal. That is as much the platform’s responsibility as it is the user’s. I am afraid that there is less confidence in Ofcom’s ability to enforce the Online Safety Act as it stands, or in the improvements being made. Does she agree with the many people across the country who believe that we need to see real action from Ofcom by the end of this week, or we will judge Ofcom’s leadership as failing the British public?
My hon. Friend is a powerful champion on this issue. I am a feminist; I believe in deeds, not words. The deeds and the action will provide the proof that the very tough legislation already in place must be implemented—British rule of law. Ofcom needs to act, and swiftly.
Westminster Hall
Emily Darlington (Milton Keynes Central) (Lab)
It is a pleasure to serve under your chairship, Ms Butler. I thank the hon. Member for Dewsbury and Batley (Iqbal Mohamed) for securing this important debate.
It would be remiss of me, as the MP for Milton Keynes Central, not to acknowledge the opportunities of AI. One in three jobs in Milton Keynes is in tech, often in the edge technologies or edge AIs that are driving the economic growth we want. However, we will not see take-up across businesses unless we have the safest AI, so we must listen to the British Standards Institution, which is located in Milton Keynes and is working on standards for some of these things.
Nevertheless, I have many concerns. The Molly Rose Foundation has raised many issues around AI chatbots, not all of which are covered by current legislation. It has documented how Alexa instructed a 10-year-old to touch a live electrical wire, and how Snapchat’s My AI told a 13-year-old how to lose their virginity to a 31-year-old—luckily, it was an adult posing as a 13-year-old. We have seen other examples involving suicide, and a chatbot claiming that Hitler had the answers to climate change. Research has found that many children are unable to realise that chatbots are not human. AI algorithms also shadow-ban women and women’s health content, as others have mentioned.
The tech is there to make AI safe, but there is little incentive for companies to do so at the moment. The Online Safety Act goes some way, but not far enough. Our priorities must be to tackle the creativity and copyright issues; deepfakes and the damage they do, in particular, to young girls and women; and the misinformation and disinformation that are being spread and amplified by algorithms because they keep people online longer, making companies money. We must also protect democracy, children, minorities and women.
How do we do that? I hope the Minister is listening. For me, it is about regulation and standards—standards are just as important as regulation—and transparency. The Science, Innovation and Technology Committee has called for transparency on AI algorithms and AI chatbots, but we have yet to see real transparency. We must also have more diversity in tech—I welcome the Secretary of State’s initiatives on that—and, finally, given the world we are in, we must have a clear strategy for the part that sovereignty in AI plays in our security and our economic future.
Order. I would like to try to allow two minutes at the end for the Member in charge to wind up the debate. Will the Front Benchers take that into account, please?
Kanishka Narayan
My hon. Friend brings deep expertise from her past career. If she feels that there are particular absences in the legislation on equalities, I would be happy to take a look, though none has been pointed out to me to date.
The Online Safety Act 2023 requires platforms to manage harmful and illegal content risks, and offers significant protection against harms online, including those driven by AI services. We are supporting regulators to ensure that those laws are respected and enforced. The AI action plan commits to boosting AI capabilities through funding, strategic steers and increased public accountability.
There is a great deal of interest in the Government’s proposals for new cross-cutting AI regulation, not least from my right hon. Friend the Member for Oxford East (Anneliese Dodds), who made the case compellingly. The Government do not speculate on legislation, so I am not able to predict future parliamentary sessions, although we will keep Parliament updated on the timings of any consultation ahead of bringing forward any legislation.
Notwithstanding that, the Government are clearly not standing still on AI governance. The Technology Secretary confirmed in Parliament last week that the Government will look at what more can be done to manage the emergent risks of AI chatbots, raised by my hon. Friend the Member for York Outer (Mr Charters), my right hon. Friend the Member for Oxford East, my hon. Friend the Member for Milton Keynes Central and others.
Alongside the comments the Technology Secretary made, she urged Ofcom to use its existing powers to ensure AI chatbots in scope of the Act are safe for children. Further to the clarifications I have provided previously across the House, if hon. Members have a particular view on where there are exceptions or spaces in the Online Safety Act on AI chatbots that correlate with risk, we would welcome any contribution through the usual correspondence channels.
Kanishka Narayan
I have about two minutes, so I will continue the conversation with my hon. Friend outside.
We will act to ensure that AI companies are able to make their own products safe. For example, the Government are tackling the disgusting harm of child sexual exploitation and abuse with a new offence to criminalise AI models that have been optimised for that purpose. The AI Security Institute, which I was delighted to hear praised across the House, works with AI labs to make their products safer and has tested over 30 models at the frontier of development. It is uniquely the best in the world at developing partnerships, understanding security risks, and innovating safeguards, too. Findings from AISI testing are used to strengthen model safeguards in partnership with AI companies, improving safety in areas such as cyber-tasks and biological weapon development.
The UK Government do not act alone on security. In response to the points made by the hon. Members for Ceredigion Preseli (Ben Lake), for Harpenden and Berkhamsted, and for Runnymede and Weybridge, it is clear that we are working closely with allies to raise security standards, share scientific insights and shape responsible norms for frontier AI. We are leading discussions on AI at the G7, the OECD and the UN. We are strengthening our bilateral relationships on AI for growth and security, including AI collaboration as part of recent agreements with the US, Germany and Japan.
I will take the points raised by the hon. Members for Dewsbury and Batley, for Winchester (Dr Chambers) and for Strangford, and by my hon. Friend the Member for York Outer (Mr Charters) on health advice, and how we can ensure that the quality of NHS advice is privileged in wider AI chatbot engagement, as well as the points made by my hon. Friend the Member for Congleton and my right hon. Friend the Member for Oxford East on British Sign Language standards in AI, which are important points that I will look further at.
To conclude, the UK is realising the opportunities of transformative AI while ensuring that growth does not come at the cost of security and safety. We do this by stimulating AI safety assurance markets, empowering our regulators, ensuring our laws are fit for purpose, and driving change through AISI and diplomacy.
(3 months, 3 weeks ago)
Commons Chamber
We absolutely will not. If the hon. Gentleman would like to write to me with more detail about areas and groups of people in his constituency who are digitally excluded, I will make a commitment to doing everything possible to tackle that problem.
Emily Darlington (Milton Keynes Central) (Lab)
I think that many Members have fundamentally misunderstood the proposal. It is actually about putting power in the hands of the citizen, not the state. The state already holds this information; digital ID will allow citizens to access it. On fraud, £11.4 billion was lost in scams last year, and £1.8 billion per year is lost due to identity theft. Does the Secretary of State see a role for digital ID in cracking down on the growing problem of fraud and identity theft?
I absolutely do. The countries that have introduced digital ID have found that it helps to tackle fraud. People can lose forms of identity, which can then be used by other people. The scheme will help to tackle that problem as well as make services more effective and efficient.
(7 months, 2 weeks ago)
Commons Chamber
One really important part of the industrial strategy we published on Monday and the sector plans within it is that we identified a problem many people in the UK face, which is that they have a really good idea but cannot take it to market because they do not have access to finance, in particular to capital, unless they are in London—and sometimes unless they are a man. We want to change all that, which is why we have said categorically that we are giving the British Business Bank much more significant power to be able to invest in these sectors. That will mean we are a powerhouse in precisely the way the hon. Member wants.
Emily Darlington (Milton Keynes Central) (Lab)
From the development of vaccines to the discovery of the structure of DNA, British medical innovation has played a fundamental role in changing the lives of people globally and extending the UK’s global influence. Our industrial strategy and forthcoming life sciences sector plan will put the UK at the very centre of global efforts.
Emily Darlington
As the Minister will know, Gavi and the Global Fund not only provide global vaccine programmes and programmes to save lives from malaria and HIV, but provide us with biosecurity and jobs in the UK, not least over 500 research and development jobs and funding for the Institute of Tropical Medicine. What assessment has he made of whether the UK is to reduce our efforts in that regard?
Gavi, the Vaccine Alliance, is absolutely essential, not only for other countries in the world, where we have managed to save many lives by introducing vaccines, but for UK innovation. We are fully committed to Gavi. We will be producing our life sciences sector plan soon, and we want to celebrate the sector, which represents 6,800 businesses and £100 billion of turnover every year.
(7 months, 4 weeks ago)
Commons Chamber
Emily Darlington (Milton Keynes Central) (Lab)
I thank the hon. Member for his creativity in his speech. At the heart of the debate is whether creatives are asked before their material or style is taken, and whether they are remunerated for it. That is a commitment we have heard from the Minister and from the Secretary of State in his media appearances at the weekend. This problem does not date from this year; material has been taken over a considerable number of years. Why did the last Government not take any initiative to ensure that creatives receive their just rewards for their creativity?
What the last Government did not do is release a consultation that had a ministerial foreword to say that the position of copyright was uncertain. What they did not do was say their preferred option was opt-out, which spooked the creative industry and caused all these problems in the first place. It is this Government’s ham-fisted approach that caused so many of the problems that they are now trying and failing to fix. The Government have played a large part in creating this problem.
(8 months, 2 weeks ago)
Commons Chamber
In all sincerity, I am confused by the hon. Gentleman’s intervention. The Bill before us does not mention AI or copyright—it has nothing to do with those items. The Data (Use and Access) Bill is as I described at the beginning of my remarks. If there is a clause, sentence or paragraph of the legislation that is before us and for consideration that damages either the AI sector or the creative industries, then I would like him to stand up and read that out. What I am proposing is a comprehensive solution in legislation to both the opportunities and the challenges presented to the AI sector, which is a barrier for companies in that sector investing here, and to the current direction of travel that is posing an existential threat to the nature of the creative sector as we know it. That is what I am proposing, and I assure the hon. Gentleman that the Bill before us does not damage any of those interests in the way that he suggests.
Emily Darlington (Milton Keynes Central) (Lab)
I thank my right hon. Friend for the way in which he is comprehensively showing our commitment to the creative industries. Like him, I am a huge nerd when it comes to amazing new innovations in data and AI. I am hugely enthusiastic about them, but I also share his equally huge enthusiasm for the creative industries. I appreciate what he is saying about transparency—for me, that is the absolutely key point—but what is the backstop if the voluntary approach does not create the transparency that creators need to understand how their creations are being used and if they are being remunerated properly for that?
The reason I opened the consultation in the first place was to try to understand where the concerns are and where the tech companies can provide their suggested solutions, on the back of which we can come together as two Houses of Parliament and two separate sectors to find the way forward. If we cannot answer the question that my hon. Friend poses, how will the legislation, which I will propose as soon as I can, get through either House?
We need to bring both sectors together. We need to have workable, implementable solutions that have grip in the digital age as well as the confidence of both Houses to get the legislation forward. I ask Members across this House and in the other place to offer me a degree of trust that I will lead it through the difficult, challenging process—which has bedevilled not just the current situation, but the Government before—needed to deliver it for those who need it the most.
Victoria Collins
I thank my hon. Friend for that intervention. Indeed, I was going to say that every MP here will have local creatives. The Secretary of State talked about the artists represented in his office. In mine is Tabitha Mary, who does pictures around town in Harpenden and Berkhamsted. We have musicians such as Rock Chorus. In Hertfordshire we have film studios such as Elstree, Warner Bros Leavesden and Bovingdon, and I am sure that many of them will be following this debate. Their life’s work—their paintings, performances or films—is feeding into AI systems right now, and they have no way of knowing. They want to work hand in hand with technology and AI, but our creatives are calling for a fair playing field.
This is just the tip of the iceberg. I was recently approached by the British Deaf Association with concerns about AI companies requiring vast volumes of video footage of British Sign Language without proper consent or awareness from the original signers. I was curious about what an AI would say about the issue, so I asked it to write in the style of the hon. Member for Rhondda and Ogmore (Chris Bryant). I did not ask it for the Secretary of State. As the hon. Member for Rhondda and Ogmore is a Minister in the Department for Science, Innovation and Technology and the Department for Culture, Media and Sport, I thought it would be quite interesting.
I highlight that the AI said that the Minister was known for his bold, articulate style, often weaving sharp analysis with a touch of wit, but I will let others be the judge of whether the AI has encapsulated that. This is what it had to say:
“Ladies and gentlemen, we stand at a crossroads—one where technology and creativity are colliding, not in conflict, but in collaboration. For centuries, imagination has driven progress, shaping our world through art, literature, design, and innovation. And now, AI is in the mix, not to replace human ingenuity, but to challenge it, expand it, and sometimes even surprise us. The question isn’t whether AI is coming—it’s here. The real question is: how do we harness it to elevate human creativity, rather than diminish it?”
It asks a very good question. To elevate that human creativity, creatives are calling for transparency. Today we are at that crossroads, with an opportunity to vote for that transparency.
As the Bill nears completion, I urge the Government to accept this reasonable compromise. I accept that the tone and the movement today are welcome, and that work with creatives and tech is much-needed.
Emily Darlington
Does the hon. Member agree that what that quote proves is that AI cannot capture the wit and humour that my hon. Friend the Member for Rhondda and Ogmore (Chris Bryant) brings to this Chamber?
Emily Darlington
And fashion. In fact, AI is a poor copy of what my hon. Friend represents and brings.
Victoria Collins
I think the House has spoken on that. True leadership in AI means building on respect for creativity, including in the House of Commons, not exploitation. We can build an AI-powered future where technology and human ingenuity flourish together, but only if we start with transparency. We can be a world leader in setting a standard for creatives and technology to work together. I invite all colleagues from all parties to join us today in supporting amendment 49D, to set that direction and to stand up for transparency for our creators and for the principle that, in the age of AI, human creativity still matters.
(9 months, 2 weeks ago)
Westminster Hall
Emily Darlington (Milton Keynes Central) (Lab)
I thank my hon. Friend for securing this important debate—we can see how important it is from the huge number of people who have come to see the debate and who want to speak in it. Is he aware of the recent DACS survey of visual artists, most of whom earn less than the minimum wage? That survey showed that 84% of artists would agree to license their work for fair remuneration. That would require a technical solution embedded in the metadata and respected by AI companies and platforms. At the moment, anything uploaded to our social media platforms has that metadata scraped. Does he agree that, in looking for solutions, the Government need to make sure that we legislate with that in mind?
I agree, and as I am about to say, there is ample proof of the stripping away of that very metadata, which could be the identifying feature when it is being used and scraped. With AI models, rights holders cannot see what is being used. This is not a crisis of legislation; it is an absence of transparency, attribution and recompense for the very content and resource that those giant machines are being built with and from.
(1 year ago)
Commons Chamber
The right hon. Gentleman is a persistent advocate of the issue that he has raised, but let me gently say to him that if the current legal regime were so satisfactory, there would not be so many outstanding court cases concerning that precise issue; it is clearly struggling to keep up with the times in which we are living. We want to ensure that, yes, we do strengthen the rights of the people who work in the creative industries and all the great potential that that has for individual copyrighted material, and we want to strengthen that into the future, but also to get it right for the future. That is why we are thinking about the needs, demands and opportunities of the future, and making sure that the settlement for those creating digital, AI and creative industry products and services benefits them equally as we go forward, and that they have the assertion of the law.
Emily Darlington (Milton Keynes Central) (Lab)
I am equally excited by the opportunities that being a leader in AI can bring to the people of the UK. As my right hon. Friend will know, Milton Keynes has been a leader from its outset. We have Bletchley Park, the birthplace of machine learning and AI, but Milton Keynes businesses are leading as well, especially in arts, services and transport. The heart of our security services’ efforts is based there, as is our skills base between the South Central Institute of Technology, Cranfield University and the Open University. Will my right hon. Friend meet me, and other AI champions from Milton Keynes, to come up with actions to make this plan a reality for the people there?