Cyber Security and Resilience (Network and Information Systems) Bill

Emily Darlington Excerpts
Emily Darlington (Milton Keynes Central) (Lab)

I welcome the Bill and the cyber action plan for public services, which was published today. As we have heard from right hon. and hon. Members’ many great speeches today, this is so important to the UK economy and to the public.

Despite being one of the smaller countries in the world, we are still one of the biggest targets for cyber-attacks. In the past 12 months, there has been some good news: only four in 10 businesses and three in 10 charities have had cyber-security breaches—the figures are down on the previous year. However, there has been a huge increase in nationally significant cyber-incidents, which have more than doubled in the past year, including the malicious cyber-attacks on critical infrastructure by Russia and China.

These matters are important to companies based in Milton Keynes Central, where one in three jobs are in technology. Milton Keynes is a leader in the development of AI and tech services, including in legal services, financial services and autonomous vehicles. Those companies have experienced cyber-attacks, so the Bill is very welcome. The difficulty is that it misses a huge portion of the discussion, and Ministers have somewhat neglected to mention sovereign technology in their comments or in the strategy. I hope that they will do so in the wind-up.

One role of sovereign technology is to fight cyber-crime. There are many definitions of sovereign technology, so what does it actually mean? To me, most of the public and the industry, it means UK innovation and technology. It is developed in the UK and is UK-owned intellectual property. It means a company paying UK taxes. Most importantly, it means a UK company being accountable to the UK. The Government have talked a lot about their commitment to developing and securing sovereignty, but that needs to be extended to all critical technology and infrastructure. Not only is that important in cyber-security terms, but it has other advantages, too: it is good for the economy, creates innovation and sets the highest standards, and it thereby gets public support and confidence and achieves small business support for absorbing the innovation. It achieves growth by creating not only UK customers, but—ambitiously—worldwide customers.

The Government have done that quite well in the past. They have created safe and secure solutions. Crown Hosting Data Centres is a really good example of a joint venture between the Government and Ark Data Centres. Unfortunately, only 3% to 4% of Government servers actually use it, and we must ask why. What are we doing to promote safe and secure solutions in the UK that would help us to strengthen cyber-security and ensure that it is promoted across the public sector, and to ensure that those solutions gain support in the private sector? Instead of using Crown Hosting Data Centres, many are using data centres run by foreign firms, with security and standards developed outside the UK. Outages at Amazon Web Services in cloud hosting have cost businesses millions.

Let us look at other areas where the public rightly worry about cyber-attacks and cyber-security, such as NHS data. We have heard about the impact of cyber-crimes on the NHS and on lives, but they also affect public confidence. Palantir has a £330 million contract to bring together all NHS data. That is a fantastic initiative and really important, and the public support it because they do not want to have to repeat their health story to each and every doctor, nurse or other health professional that they meet. The difficulty is that using a foreign firm with some questionable alliances has led to an erosion of public trust and to a lack of trust among doctors, slowing the take-up of this important innovation in NHS services. That is partly because the co-founder of Palantir called our pride in the NHS “Stockholm syndrome”. Unfortunately, he misunderstands the very body to which he is selling services and is thereby eroding public trust. I know many UK firms that could have done just as good a job—and probably better, because trust among the public and doctors would have increased.

We hear that Palantir has just won a £240 million contract with the Ministry of Defence for

“data analytics capabilities supporting critical strategic, tactical and live operational decision making across classifications”.

Again, it is hugely important that we are using the latest technology to promote our MOD and that we are tying all that up. I do not think anybody in this House has concerns about the MOD making these kinds of investments; it is who we choose to partner with that drives the concern.

As I have already argued, the reality is that cyber-security has to be UK-focused. We have to protect our national interest and ensure that our partners put our national interest and cyber-security first and foremost. The views of organisations such as Palantir on the NHS, and its integration with US Immigration and Customs Enforcement—otherwise known as ICE—lead us to worry that it does not share UK values. That creates a strategic vulnerability. That is what the sector is saying to us, and we should listen to it. Cyber-security is not just about reporting; it is about the investments we make ahead of time. Imagine if those two contracts and their economic opportunities had been given to UK firms. There would be enhanced UK-based cyber-security and greater confidence in our most critical areas of health and the military.

Let me raise another example which, if The Daily Telegraph is correct, I am sure will raise significant public trust concerns. It has reported today that the Government are considering using Starlink for the emergency services network, replacing the existing radio set-up that is used by ambulances, police and the fire service in an emergency—our most critical infrastructure. This company is controlled by a man who has shown his willingness to turn off satellites in Ukraine at his own political whim.

Cameron Thomas (Tewkesbury) (LD)

The hon. Lady is making a really important point about Elon Musk’s Starlink system, but will she go a little further and recognise that not only has Elon Musk switched off Starlink in Ukraine at will, but he has done so on occasions that might have turned the tide of the war?

Emily Darlington

I thank the hon. Member for raising that point. It is important to note that Elon Musk turned off Starlink at very strategic points for the Ukrainian military when it was advancing on Russian-held territory. It is not just that he chose to turn it off; he chose to turn it off at a critical time for the Ukrainian military. I worry that somebody who chooses to do that, and who encourages violence among the UK public at a far-right rally, at which he said,

“Whether you choose violence or not, violence is coming to you. You either fight back or you die”,

is not an appropriate or safe partner for our emergency services.

I absolutely support the comments made by my right hon. Friend the Member for Oxford East (Anneliese Dodds) about transparency, and about some of the actions being taken by those who have been willing to stand up to these companies and demand transparency. While that is probably not the subject of today’s debate, I think we must take those actions as a warning for what is to come.

I welcome the Bill and the action plan, but to truly make the UK safe and secure from state-sponsored or criminal cyber-attacks, we need to ensure that there is a UK sovereign infrastructure, capacity and capability. The Government can lead the way through their own procurement practices by making sure we are partnering with UK sovereign firms. That is good for security, good for protecting us against cyber-attacks, and good for the economy and public trust.

Online Safety Act 2023: Repeal

Monday 15th December 2025


Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Emily Darlington (Milton Keynes Central) (Lab)

It is a pleasure to serve under your chairmanship, Mr Pritchard. I want to add some actual data to our debate today. We are inundated, often online or in our inboxes, with messages about repealing the Online Safety Act. These are well-funded campaigns. There is also a lot of material online coming from very particular sources, not necessarily within the UK. Actually, 70% of people in the UK support the Online Safety Act and a similar number support age verification. Much of that has to do with what our children are seeing online. Almost 80% of people aged 18 to 21 have seen sexual violence before age 18. That is a huge number of people whose initial sexual experiences or viewing of sex involves violence.

What does the Online Safety Act do? It puts porn back on the top shelf—it does not get rid of it. We are all of an age to remember when porn was on the top of the magazine rack in the corner shop. Now it is being fed to our children in their feeds. The issue is also the type and nature of porn that people are seeing online: 80% of online porn has some kind of strangulation in it. That has real-world consequences, as we have seen from the latest data on women’s health in terms of strokes. Strangulation is now the second leading cause of strokes among women in the UK. That is shocking, and it is why we needed the Online Safety Act to intervene on what was being fed to us.

In Milton Keynes, 30% of young people have been approached by strangers since the implementation of the Online Safety Act. They are most frequently approached on Roblox. We do not automatically identify gaming platforms as places where people are approached by strangers, but we know from police investigations that they approach young children on Roblox and move them to end-to-end encryption sites where they can ask them to share images.

In 2024, there were 7,263 online grooming offences—remember that those will just be the ones that are not in end-to-end encryption sites. There were 291,273 reports of child sexual abuse material identified last year—again, remember, that is not the material being shared on end-to-end encryption sites, because we have no idea what is actually being shared on those. Some 90% of that material is self-generated—that is, groomers asking children to take pornographic pictures of themselves and share them. Once a picture is shared with a groomer, it goes into networks and can get shared anywhere in the UK or the world. The UK is the biggest consumer of child sexual abuse images. The police reckon that 850,000 people in the UK are consuming child sexual abuse images.

John Slinger

I thank my hon. Friend for making an impassioned and powerful speech. Does she agree that outrage ought to be directed at us for not doing enough on these issues rather than for the way in which we have started to try to tackle them?

If the behaviours that my hon. Friend and other hon. Members have referred to happened in the real world—the so-called offline world—they would be clamped down on immediately and people would be arrested. Certain items cannot be published, put in newsagents or smuggled into school libraries, and people could not get away with the defence, “This is a matter of my civil liberty.” We should be far more robust with online companies for the frankly shoddy way in which they are carrying out their activities, which is endangering our children and doing immense damage to our political system and wider life in our country and beyond.

Emily Darlington

I completely agree and I am going to come to that.

I recently met the NSPCC, the Internet Watch Foundation and the police forces that deal with this issue, and they told me that there are easy technological fixes when someone uploads something to a site with end-to-end encryption. For those who do not know, we use such sites all the time—our WhatsApp groups, and Facebook Messenger, are end-to-end encryption sites. We are not talking about scary sites that we have not heard of, or Telegram, which we hear might be a bit iffy; these are sites that we all use every single day. Those organisations told me that, before someone uploads something and it becomes encrypted, their image or message is screened. It is screened for bugs to ensure that they are not sharing viruses, but equally it could be screened for child sexual abuse images. That would stop children even sharing these images in the first place, and it would stop the images’ collection and sharing with other paedophiles.

My hon. Friend the Member for Rugby (John Slinger) is absolutely right: 63% of British parents want the Government to go further and faster, and 50% feel that our implementation has been too slow. That is not surprising; it took seven years to get this piece of legislation through, and the reality is that, by that time, half of it was out of date, because technology moves faster than Parliament.

Lizzi Collinge

My hon. Friend has been talking about the dangers that children are exposed to. Does she believe that parents are equipped to talk to their children about these dangers? Is there more we can do to support parents to have frank conversations with their children about the risks of sharing images and talking to people online?

Emily Darlington

I completely agree. As parents, we all want to be able to have those conversations, but because of the way the algorithms work, we do not see what they see. We say, “Yes, you can download this game, because it has a 4+ rating.” Who knows what a 4+ rating actually means? It has nothing to do with the BBFC ratings that we all grew up with and understand really well. Somebody else has decided what is all right and made up the 4+ rating.

For example, Roblox looks as if it is child-ready, but many people might not understand that it is a platform on which anyone can develop a game. Those games can involve grooming children and sexual violence; they are not all about the silly dances that children do in the schoolyard. That platform is inhabited by adults just as much as by children.

Jim McMahon

My hon. Friend does well to draw attention to the gaming world. When most of us think about online threats, we think about social media and messaging, but there are interactive ways of communicating in almost every game in existence, and that can happen across the world.

In Oldham, we have had a number of section 60 stop-and-search orders in place, because of the number of schoolchildren who have been carrying knives and dangerous weapons. Largely, that has been whipped up not in the classroom, but online, overnight, when children are winding each other up and making threats to each other. That has real-life consequences: children have been injured and, unfortunately, killed as a result of carrying weapons in our community. Does my hon. Friend share my concern that this threat is multifaceted, and that the legislation probably should not be so prescriptive for particular platforms at a point in time, but should have founding principles that can be far more agile as new technology comes on stream?

Emily Darlington

My hon. Friend raises two really important points. First, if we try to create legislation to address what companies do today, it will be out of date by the time that it passes through the two Houses. What we do must be done on the basis of principles, and I think a very good starting principle is that what is illegal offline should be illegal online. That is a pretty clear principle. Offline legislation has been robustly challenged over hundreds of years and got us to where we are with our freedom of speech, freedom of expression and freedom to congregate. All those things have been robustly tested by both Houses.

John Slinger

On that critical point about the lack of equality between offline and online, does my hon. Friend agree that if I were to go out into the street and staple to somebody’s back an offensive but not illegal statement that could not be washed off and remained on their back for months, if not years, I would probably be subject to immediate arrest, yet online that happens routinely to our children—indeed, to anyone in society, including politicians? Is that not illustrative of the problem?

Emily Darlington

I agree; my hon. Friend makes a very important point about the slander that happens online, the lack of basis in reality and the lack of ability to address it. If somebody posts something about someone else that is untrue, platforms will not take it down; they will say, “It doesn’t breach our terms and conditions.” Somebody could post that I am actually purple and have pink eyes. I would say, “I don’t want you to say that,” and the platform would say, “But there’s nothing offensive about it.” I would say, “But it’s not me.” The thing is that this is happening in much more offensive ways.

My hon. Friend the Member for Oldham West, Chadderton and Royton (Jim McMahon) made the point that what happens online is then repeated offline. We have even seen deaths when children try to replicate the challenges that they see being set online. With AI-generated material, those challenges often are not real. It is the equivalent of somebody trying to repeat magic tricks and dying as a result, which is quite worrying.

The Online Safety Act is not perfect; it needs to go further. The petitioner has made a really important point. The lack of proper definition around small but non-harmful sites versus small but harmful sites is very unclear, and it is really important that the Act provides some clarity on that.

We do not have enough protections for democracy. The Science, Innovation and Technology Committee, of which I am a member, produced a really important report on misinformation and how it led to the riots two summers ago. Misinformation was used as a rallying cry to create unrest across our country of a sort that we had not seen in a very long time. The response from the social media companies was variable; it went from kind of “meh” to really awful. The platforms say, “We don’t police our content. We’re just a platform.” That is naive in the extreme. Quite frankly, they are happy to make money off us, so they should also know that they have to protect us—their customers—just as any other company does, as my hon. Friend the Member for Oldham West, Chadderton and Royton said.

The radicalisation that is happening online is actually shifting the Overton window; we are seeing a more divided country. There is a fantastic book called “Man Up”—it is very academic, but it shows the rise of misogyny leading to the rise of every other form of extremism and how that links back to the online world. If this was all about Islam, this House would be outraged, but because it starts with misogyny, it goes down with a fizzle, and too often people in this House say, “This is all about free speech.” We know that misogyny is the first step on a ladder of radicalisation that leads people to violence—whether into violence against women or further into antisemitism, anti-Islam, anti-anybody who is not the same colour, or anti-anybody who is perceived not to be English from Norman times.

The algorithms provoke violent and shocking content, but they also shadow-ban really important content, such as information on women’s health. Platforms are happy to shadow-ban terms such as “endometriosis” and “tampon”—and God forbid that a tampon commercial should feature red liquid, rather than blue liquid. That content gets shadow-banned and is regularly taken down and taken out of the algorithms, yet the platforms say they can do nothing about people threatening to rape and harm. That is not true; they can, and they choose not to. The public agree that algorithms must be part of the solution; 78% of British parents want to see action on algorithms. My hon. Friends are right that the Online Safety Act and Ofcom could do that, yet they have not done so—they have yet to create transparency in algorithms, which was the Select Committee’s No. 1 recommendation.

[Sir John Hayes in the Chair]

Finally, I want to talk about a few other areas in which we need to move very quickly: deepfakes and AI nudifying apps. We have already seen an example of how deepfakes are being used in British democracy: a deepfake was made of the hon. Member for Mid Norfolk (George Freeman) saying that he is moving from the Conservatives to Reform. It is a very convincing three-minute video. Facebook still refuses to take it down because it does not breach its terms. This should be a warning to us all about how individuals, state actors and non-state actors can impact our local democracy by creating deepfakes of any one of us that we cannot get taken down.

Tom Hayes

We heard today from the MI6 chief, who talked about how Russia is seeking to “export chaos” into western democracies and said that the UK is one of the most targeted. Does my hon. Friend agree that we need online safety, because it is our national security too, and that as we face the rising threat from Putin and the Kremlin, we need as a country to be secure in the air, at sea, on land and in the digital space?

Emily Darlington

I absolutely agree with my hon. Friend. They seek to promote chaos and the destruction of British values, and we need to fight that and protect those values.

The AI nudifying apps, which did not even exist when the Online Safety Act came in, need a very fast response. We know that deepfakes and AI nudifying apps are being used overwhelmingly against classmates and colleagues. Think about how it destroys a 13-year-old girl to have a fake nude photo of her passed around. The abuse that we politicians and many others receive from fake and anonymous accounts needs to be addressed. Seventy-one per cent of British people consider this to be a problem, and we need to take action. AI chatbots are another thing that was not foreseen in the development of the Online Safety Act, and therefore it is far behind on them, too.

The Online Safety Act is in no way perfect, but it is a good step forward. We must learn the lessons of its implementation to go further and faster, and listen to British parents across the country who want the Government’s help to protect our children online—and we as a Government must also protect our democracy online.

--- Later in debate ---
Ian Murray

I do not disagree with the hon. Lady. There are a whole host of issues around porn bots and AI-generated bots that have now also sprung up. We know that we are committed to the Online Safety Act and its review as it is being implemented. As technology moves on quickly, we have to keep pace with what the harms are and how we are able to deal with them. I thank the hon. Lady for raising those particular issues.

We will act on the evidence that comes forward. It is clear that if the evidence shows us that we have to act in various areas, including chatbots, we will do so. The Secretary of State announced plans to support a child safety summit in 2026, which will bring together tech companies, civil society and young people to shape how AI can benefit children and look at online harms and the movements on those.

Emily Darlington

I wanted to raise with the Minister that the Science, Innovation and Technology Committee will be undertaking an inquiry in the new year on brain development, addictive use and how that impacts various key points in children’s development. The Minister says that he will look at all evidence. Will he look at the evidence produced by that inquiry to ensure that its information and advice goes to parents across this country?

--- Later in debate ---
Emily Darlington

I appreciate what the Minister says—that these powers are in legislation—yet the process is still the social media platforms marking their own homework. We are in a vicious circle: Ofcom will not take action unless it has a complaint based on evidence, but the evidence is not achievable because the algorithm is not made available for scrutiny. How should Ofcom use those powers more clearly ahead of the elections to ensure that such abuse to our democracy does not occur?

Ian Murray

A whole host of legislation sits behind this, including through the Electoral Commission and the Online Safety Act, but it is important for us to find ways to ensure that we protect our democratic processes, whether that be from algorithmic serving of content or foreign state actors. It is in the public domain that, when the Iranian servers went dark during the conflict with the US, a third of pro-independence Facebook pages in Scotland went dark, because they were being served by foreign state actors. We have seen that from Russia and various other foreign actors. We have to be clear that the regulations in place need to be implemented and, if they are not, we need to find other ways to ensure that we protect our democracy. At a small tangent, our public sector broadcasters and media companies are a key part of that.

To stay with my hon. Friend the Member for Milton Keynes Central (Emily Darlington), she made an excellent contribution, with figures for what is happening. She asked about end-to-end encryption. We support responsible use of encryption, which is a vital part of our digital world, but the Online Safety Act does not ban any service design such as end-to-end encryption, nor does it require the creation of back doors. However, the implementation of end-to-end encryption in a way that intentionally blinds tech companies to content will have a disastrous impact on public safety, in particular for children, and we expect services to think carefully about their design choices and to make the services safe by design for children.

That leads me to online gaming platforms and Roblox, which my hon. Friend also mentioned. Ofcom has asked the main platforms, including Roblox, to share what they are doing and to make improvements where needed. Ofcom will take action if that is not advanced. A whole host of things are happening, and we need the Online Safety Act and the regulations underpinning it to take time to feed through. I hope that we will start to see significant improvements, as reflected on by my hon. Friend the Member for Sunderland Central.

My hon. Friend the Member for Milton Keynes Central mentioned deepfakes. That issue is important to our democracy as well. The Government are concerned about the proliferation of AI-enabled products and services that enable deepfake non-consensual images. In addition to criminalising the creation of non-consensual images, the Government are looking at further options, and we hope to provide an update on that shortly. It is key to protecting not only our wider public online but, fundamentally, those who seek public office.

The Government agree that a safer digital future needs to include small, personally owned and maintained websites. We recognise the importance that proportionate implementation of the Online Safety Act plays in supporting that aim. We can all agree that we need to protect children online, and we would not want low-risk services to have any unnecessary compliance burden. That is a balance that we have to strike to make it proportionate. The Government will conduct a post-implementation review of the Act and will consider the burdens on low-risk services as part of that review, as mentioned in the petition. We will also ensure that the Online Safety Act protects children and is nimble enough to deal with a very fast-moving tech world. I thank all hon. Members for providing a constructive debate and raising their issues. I look forward to engaging further in the months and years ahead.

Mandatory Digital ID

Tuesday 21st October 2025


Westminster Hall


Jo White (Bassetlaw) (Lab)

Thank you, Mr Turner. Wow, that is a big announcement!

Just over a month ago I visited Tallinn, the capital of Estonia, a country that has been using digital ID for 30 years and a country we can learn from—how it works, how it reaches the digitally excluded and how it protects people’s security. What struck me most was that everyone I spoke to said the same thing: with digital ID, they know exactly what information the Government hold on them, and most importantly, they know who has looked at it and why.

That level of transparency and personal control should be the gold standard, but here it often feels the opposite: social media giants and private companies know more about us than we realise—often more, I would say, than our nearest and dearest. We need to have absolute control.

Emily Darlington (Milton Keynes Central) (Lab)

It is interesting that my hon. Friend talks about the Estonian experience, as I often hear my constituents’ frustration that they do not know what the Government are doing with their data, and how they even have trouble accessing it. Does my hon. Friend think that a scheme like Estonia’s would help the citizen to be in charge?

Jo White

I totally agree with my hon. Friend.

From the moment we are born, the state begins to gather data: our birth is registered; the NHS stores our health records; we are issued with national insurance and NHS numbers; and His Majesty’s Revenue and Customs tracks us. By having a digital ID, we can see the information the state holds on us, who has been accessing it and why. We can even determine that other people cannot see our data. It is about us having control over our own data.

It is also about security: because the data is divided and split up, nobody can see data from one Department in another. It is about people having personal control, which is what people in my constituency are calling for.

--- Later in debate ---
Emily Darlington (Milton Keynes Central) (Lab)

It is a pleasure to serve under your chairmanship, Mr Turner. For reasons of timing, I will not repeat what my hon. Friend the Member for Bassetlaw (Jo White) said about the important change in the relationship between citizen and state that could come from digital ID—putting the citizen in charge rather than the state knowing too much about us without our knowing what they know.

However, there is another reason why we might want a free, digital, Government-backed ID: £11 billion is lost each year to fraud, and ID theft costs us about £2 billion a year. People need to prove who they are at each and every moment. For too many people, that means a passport or driving licence, which many cannot afford. An ID that allows us to prove who we are could be more secure. We will also need it to show that we can work—there has been a 40% increase in illegal working—and to prove our age, including for the big changes made by the Online Safety Act 2023.

Peter Swallow (Bracknell) (Lab)

My hon. Friend raises the Online Safety Act. Some of my constituents have raised concerns about identity checks to access material online. Would it not have been far easier to prove one’s age online safely and securely if we already had a digital ID, and would that not have helped us to introduce safer checks online?

Emily Darlington

My hon. Friend is absolutely right. All the complaints I have received are about people giving their information to third-party verifiers. If they had a free, digital, Government-backed ID, they could have proved their age to access any over-18 content. People are also concerned that those who should not be accessing the NHS are doing so. The reality is that if there were a Government-backed digital ID, it would be clear whether a person can access the NHS.

I have come up with a list that debunks what the hon. Member for Perth and Kinross-shire (Pete Wishart) said, and I am happy to pass it to him afterwards. I think we need to add a few scientific facts, but I do not have time.

Jo White

I would like to hear some of my hon. Friend’s list, please.

Emily Darlington

I am happy to go through it. First, it is not about centralising data. Rather, digital ID allows the citizen to access federated data. The data stays in the individual Departments; it does not stay on a card—this is not about a card. Digital ID adds a level of security to Government datasets. There is no travel or location data, and no access for external providers. It uses sovereign tech that allows citizens to know what the Government hold and who is accessing it. There is no new data that the Government do not already hold, and a single login with a digital ID is actually a better way for a person to prove who they are.

David Davis

Why do the Government’s cyber experts disagree?

Emily Darlington

I think the right hon. Member will find there is a split in the community because there is a lack of detail.

Emily Darlington

I agree, but there is a lack of detail. When we are at the beginning of the conversation and going out to consultation, which is exactly what we are doing, we have to ask the public what they want. Do they want either of the two scenarios that my hon. Friend the Member for Bassetlaw and I presented, or do they not want access to their Government data in a way that enables them to know what is happening, and so that they can prove who they are without having to pay for a passport or driver’s licence?

--- Later in debate ---
Iqbal Mohamed (Dewsbury and Batley) (Ind)

I thank the hon. Member for Perth and Kinross-shire (Pete Wishart) for securing this debate. This is one of the most controversial and divisive issues currently supported by the Government, who have form. I am here on behalf of my constituents, as nearly 100 have written to me opposing the scheme, and nearly 4,000 have signed the e-petition.

We have heard the risks and the issues around data privacy, surveillance culture, user profiling, exclusion, function creep and scope creep. Having worked in the IT industry for over 20 years, as well as in the cyber-security industry, I can say that there is no safe system at the moment. Relying on third-party software, owned by foreign states or companies—

Emily Darlington

Is the hon. Member aware of the Government’s statements that the system would be held internally and use sovereign tech?

Iqbal Mohamed

I am, but that will not solve the issue.