Westminster Hall
Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the official record
Emily Darlington (Milton Keynes Central) (Lab)
It is a pleasure to serve under your chairmanship, Mr Pritchard. I want to add some actual data to our debate today. We are inundated, often online or in our inboxes, with messages about repealing the Online Safety Act. These are well-funded campaigns. There is also a lot of material online coming from very particular sources, not necessarily within the UK. Actually, 70% of people in the UK support the Online Safety Act and a similar number support age verification. Much of that has to do with what our children are seeing online. Almost 80% of people aged 18 to 21 have seen sexual violence before age 18. That is a huge number of people whose initial sexual experiences or viewing of sex involves violence.
What does the Online Safety Act do? It puts porn back on the top shelf—it does not get rid of it. We are all of an age to remember when porn was on the top of the magazine rack in the corner shop. Now it is being fed to our children in their feeds. The issue is also the type and nature of porn that people are seeing online: 80% of online porn has some kind of strangulation in it. That has real-world consequences, as we have seen from the latest data on women’s health in terms of strokes. Strangulation is now the second leading cause of strokes among women in the UK. That is shocking, and it is why we needed the Online Safety Act to intervene on what was being fed to us.
In Milton Keynes, 30% of young people have been approached by strangers since the implementation of the Online Safety Act. They are most frequently approached on Roblox. We do not automatically identify gaming platforms as places where people are approached by strangers, but we know from police investigations that they approach young children on Roblox and move them to end-to-end encryption sites where they can ask them to share images.
In 2024, there were 7,263 online grooming offences—remember that those will just be the ones that are not in end-to-end encryption sites. There were 291,273 reports of child sexual abuse material identified last year—again, remember, that is not the material being shared on end-to-end encryption sites, because we have no idea what is actually being shared on those. Some 90% of that material is self-generated—that is, groomers asking children to take pornographic pictures of themselves and share them. Once a picture is shared with a groomer, it goes into networks and can get shared anywhere in the UK or the world. The UK is the biggest consumer of child sexual abuse images. The police reckon that 850,000 people in the UK are consuming child sexual abuse images.
John Slinger
I thank my hon. Friend for making an impassioned and powerful speech. Does she agree that outrage ought to be directed at us for not doing enough on these issues rather than for the way in which we have started to try to tackle them?
If the behaviours that my hon. Friend and other hon. Members have referred to happened in the real world—the so-called offline world—they would be clamped down on immediately and people would be arrested. Certain items cannot be published, be put in newsagents or be smuggled into school libraries and people could not get away with the defence, “This is a matter of my civil liberty.” We should be far more robust with online companies for the frankly shoddy way in which they are carrying out their activities, which is endangering our children and doing immense damage to our political system and wider life in our country and beyond.
Emily Darlington
I completely agree and I am going to come to that.
I recently met the NSPCC, the Internet Watch Foundation and the police forces that deal with this issue, and they told me that there are easy technological fixes when someone uploads something to a site with end-to-end encryption. For those who do not know, we use such sites all the time—our WhatsApp groups, and Facebook Messenger, are end-to-end encryption sites. We are not talking about scary sites that we have not heard of, or Telegram, which we hear might be a bit iffy; these are sites that we all use every single day. Those organisations told me that, before someone uploads something and it becomes encrypted, their image or message is screened. It is screened for bugs to ensure that they are not sharing viruses, but equally it could be screened for child sexual abuse images. That would stop children even sharing these images in the first place, and it would stop the images’ collection and sharing with other paedophiles.
My hon. Friend the Member for Rugby (John Slinger) is absolutely right: 63% of British parents want the Government to go further and faster, and 50% feel that our implementation has been too slow. That is not surprising; it took seven years to get this piece of legislation through, and the reality is that, by that time, half of it was out of date, because technology moves faster than Parliament.
Lizzi Collinge
My hon. Friend has been talking about the dangers that children are exposed to. Does she believe that parents are equipped to talk to their children about these dangers? Is there more we can do to support parents to have frank conversations with their children about the risks of sharing images and talking to people online?
Emily Darlington
I completely agree. As parents, we all want to be able to have those conversations, but because of the way the algorithms work, we do not see what they see. We say, “Yes, you can download this game, because it has a 4+ rating.” Who knows what a 4+ rating actually means? It has nothing to do with the BBFC ratings that we all grew up with and understand really well. Somebody else has decided what is all right and made up the 4+ rating.
For example, Roblox looks as if it is child-ready, but many people might not understand that it is a platform on which anyone can develop a game. Those games can involve grooming children and sexual violence; they are not all about the silly dances that children do in the schoolyard. That platform is inhabited by adults as much as by children.
Jim McMahon (Oldham West, Chadderton and Royton) (Lab)
My hon. Friend does well to draw attention to the gaming world. When most of us think about online threats, we think about social media and messaging, but there are interactive ways of communicating in almost every game in existence, and that can happen across the world.
In Oldham, we have had a number of section 60 stop-and-search orders in place, because of the number of schoolchildren who have been carrying knives and dangerous weapons. Largely, that has been whipped up not in the classroom, but online, overnight, when children are winding each other up and making threats to each other. That has real-life consequences: children have been injured and, unfortunately, killed as a result of carrying weapons in our community. Does my hon. Friend share my concern that this threat is multifaceted, and that the legislation probably should not be so prescriptive for particular platforms at a point in time, but should have founding principles that can be far more agile as new technology comes on stream?
Emily Darlington
My hon. Friend raises two really important points. First, if we try to create legislation to address what companies do today, it will be out of date by the time that it passes through the two Houses. What we do must be done on the basis of principles, and I think a very good starting principle is that what is illegal offline should be illegal online. That is a pretty clear principle. Offline legislation has been robustly challenged over hundreds of years and got us to where we are with our freedom of speech, freedom of expression and freedom to congregate. All those things have been robustly tested by both Houses.
John Slinger
On that critical point about the lack of equality between offline and online, does my hon. Friend agree that if I were to go out into the street and staple to somebody’s back an offensive but not illegal statement that was impermeable to being washed off and remained on their back for months, if not years, I would probably be subject to immediate arrest, yet online that happens routinely to our children—indeed, to anyone in society, including politicians? Is that not illustrative of the problem?
Emily Darlington
I agree; my hon. Friend makes a very important point about the slander that happens online, the lack of basis in reality and the lack of ability to address it. If somebody posts something about someone else that is untrue, platforms will not take it down; they will say, “It doesn’t breach our terms and conditions.” Somebody could post that I am actually purple and have pink eyes. I would say, “I don’t want you to say that,” and the platform would say, “But there’s nothing offensive about it.” I would say, “But it’s not me.” The thing is that this is happening in much more offensive ways.
My hon. Friend the Member for Oldham West, Chadderton and Royton (Jim McMahon) made the point that what happens online is then repeated offline. We have even seen deaths when children try to replicate the challenges that they see being set online. With AI-generated material, those challenges often are not real. It is the equivalent of somebody trying to repeat magic tricks and dying as a result, which is quite worrying.
The Online Safety Act is not perfect; it needs to go further. The petitioner has made a really important point. The Act lacks a proper definition to distinguish small but harmless sites from small but harmful ones, and it is really important that it provides some clarity on that.
We do not have enough protections for democracy. The Science, Innovation and Technology Committee, which I am a member of, produced a really important report on misinformation and how it led to the riots two summers ago. Misinformation was used as a rallying cry to create unrest across our country of a sort that we had not seen in a very long time. The response from the social media companies was variable; it went from kind of “meh” to really awful. The platforms say, “We don’t police our content. We’re just a platform.” That is naive in the extreme. Quite frankly, they are happy to make money off us, so they should also know that they have to protect us—their customers—just as any other company does, as my hon. Friend the Member for Oldham West, Chadderton and Royton said.
The radicalisation that is happening online is actually shifting the Overton window; we are seeing a more divided country. There is a fantastic book called “Man Up”—it is very academic, but it shows the rise of misogyny leading to the rise of every other form of extremism and how that links back to the online world. If this was all about Islam, this House would be outraged, but because it starts with misogyny, it goes down with a fizzle, and too often people in this House say, “This is all about free speech.” We know that misogyny is the first step on a ladder of radicalisation that leads people to violence—whether into violence against women or further into antisemitism, anti-Islam, anti-anybody who is not the same colour, or anti-anybody who is perceived not to be English from Norman times.
The algorithms provoke violent and shocking content, but they also shadow-ban really important content, such as information on women’s health. Platforms are happy to shadow-ban terms such as “endometriosis” and “tampon”—and God forbid that a tampon commercial should feature red liquid, rather than blue liquid. That content gets shadow-banned and is regularly taken down and taken out of the algorithms, yet the platforms say they can do nothing about people threatening to rape and harm. That is not true; they can, and they choose not to. The public agree that algorithms must be part of the solution; 78% of British parents want to see action on algorithms. My hon. Friends are right that the Online Safety Act and Ofcom could do that, yet they have not done so—they have yet to create transparency in algorithms, which was the Select Committee’s No. 1 recommendation.
[Sir John Hayes in the Chair]
Finally, I want to talk about a few other areas in which we need to move very quickly: deepfakes and AI nudifying apps. We have already seen an example of how deepfakes are being used in British democracy: a deepfake was made of the hon. Member for Mid Norfolk (George Freeman) saying that he is moving from the Conservatives to Reform. It is a very convincing three-minute video. Facebook still refuses to take it down because it does not breach its terms. This should be a warning to us all about how individuals, state actors and non-state actors can impact our local democracy by creating deepfakes of any one of us that we cannot get taken down.
Tom Hayes
We heard today from the MI6 chief, who talked about how Russia is seeking to “export chaos” into western democracies and said that the UK is one of the most targeted. Does my hon. Friend agree that we need online safety, because it is our national security too, and that as we face the rising threat from Putin and the Kremlin, we need as a country to be secure in the air, at sea, on land and in the digital space?
Emily Darlington
I absolutely agree with my hon. Friend. They seek to promote chaos and the destruction of British values, and we need to fight that and protect those values.
The AI nudifying apps, which did not even exist when the Online Safety Act came in, need a very fast response. We know that deepfakes and AI nudifying apps are being used overwhelmingly against classmates and colleagues. Think about how it destroys a 13-year-old girl to have a fake nude photo of her passed around. The abuse that we politicians and many others receive from fake and anonymous accounts needs to be addressed. Seventy-one per cent of British people consider this to be a problem, and we need to take action. AI chatbots are another thing that was not foreseen in the development of the Online Safety Act, and therefore it is far behind on them, too.
The Online Safety Act is in no way perfect, but it is a good step forward. We must learn the lessons of its implementation to go further and faster, and listen to British parents across the country who want the Government’s help to protect our children online—and we as a Government must also protect our democracy online.
I do not disagree with the hon. Lady. There are a whole host of issues around porn bots and AI-generated bots that have now also sprung up. We are committed to the Online Safety Act and to reviewing it as it is implemented. As technology moves on quickly, we have to keep pace with what the harms are and how we are able to deal with them. I thank the hon. Lady for raising those particular issues.
We will act on the evidence that comes forward. It is clear that if the evidence shows us that we have to act in various areas, including chatbots, we will do so. The Secretary of State announced plans to support a child safety summit in 2026, which will bring together tech companies, civil society and young people to shape how AI can benefit children and look at online harms and the movements on those.
Emily Darlington
I wanted to raise with the Minister that the Science, Innovation and Technology Committee will be undertaking an inquiry in the new year on brain development, addictive use and how that impacts various key points in children’s development. The Minister says that he will look at all evidence. Will he look at the evidence produced by that inquiry to ensure that its information and advice goes to parents across this country?
Emily Darlington
I appreciate what the Minister says—that these powers are in legislation—yet the process is still the social media platforms marking their own homework. We are in a vicious circle: Ofcom will not take action unless it has a complaint based on evidence, but the evidence is not achievable because the algorithm is not made available for scrutiny. How should Ofcom use those powers more clearly ahead of the elections to ensure that such abuse to our democracy does not occur?
A whole host of legislation sits behind this, including through the Electoral Commission and the Online Safety Act, but it is important for us to find ways to ensure that we protect our democratic processes, whether that be from algorithmic serving of content or foreign state actors. It is in the public domain that, when the Iranian servers went dark during the conflict with the US, a third of pro-independence Facebook pages in Scotland went dark, because they were being served by foreign state actors. We have seen that from Russia and various other foreign actors. We have to be clear that the regulations in place need to be implemented and, if they are not, we need to find other ways to ensure that we protect our democracy. At a small tangent, our public sector broadcasters and media companies are a key part of that.
To stay with my hon. Friend the Member for Milton Keynes Central (Emily Darlington), she made an excellent contribution, with figures for what is happening. She asked about end-to-end encryption. We support responsible use of encryption, which is a vital part of our digital world, but the Online Safety Act does not ban any service design such as end-to-end encryption, nor does it require the creation of back doors. However, the implementation of end-to-end encryption in a way that intentionally blinds tech companies to content will have a disastrous impact on public safety, in particular for children, and we expect services to think carefully about their design choices and to make the services safe by design for children.
That leads me to online gaming platforms and Roblox, which my hon. Friend also mentioned. Ofcom has asked the main platforms, including Roblox, to share what they are doing and to make improvements where needed. Ofcom will take action if that is not advanced. A whole host of things are happening, and we need the Online Safety Act and the regulations underpinning it to take time to feed through. I hope that we will start to see significant improvements, as reflected on by my hon. Friend the Member for Sunderland Central.
My hon. Friend the Member for Milton Keynes Central mentioned deepfakes. That issue is important to our democracy as well. The Government are concerned about the proliferation of AI-enabled products and services that enable deepfake non-consensual images. In addition to criminalising the creation of non-consensual images, the Government are looking at further options, and we hope to provide an update on that shortly. It is key to protecting not only our wider public online but, fundamentally, those who seek public office.
The Government agree that a safer digital future needs to include small, personally owned and maintained websites. We recognise the importance that proportionate implementation of the Online Safety Act plays in supporting that aim. We can all agree that we need to protect children online, and we would not want low-risk services to have any unnecessary compliance burden. That is a balance that we have to strike to make it proportionate. The Government will conduct a post-implementation review of the Act and will consider the burdens on low-risk services as part of that review, as mentioned in the petition. We will also ensure that the Online Safety Act protects children and is nimble enough to deal with a very fast-moving tech world. I thank all hon. Members for providing a constructive debate and raising their issues. I look forward to engaging further in the months and years ahead.
Jo White (Bassetlaw) (Lab)
Thank you, Mr Turner. Wow, that is a big announcement!
Just over a month ago I visited Tallinn, the capital of Estonia, a country that has been using digital ID for 30 years and a country we can learn from—how it works, how it reaches the digitally excluded and how it protects people’s security. What struck me most was that everyone I spoke to said the same thing: with digital ID, they know exactly what information the Government hold on them, and most importantly, they know who has looked at it and why.
That level of transparency and personal control should be the gold standard, but here it often feels the opposite: social media giants and private companies know more about us than we realise—often more, I would say, than our nearest and dearest. We need to have absolute control.
Emily Darlington (Milton Keynes Central) (Lab)
It is interesting that my hon. Friend talks about the Estonian experience, as I often hear my constituents’ frustration that they do not know what the Government are doing with their data, and how they even have trouble accessing it. Does my hon. Friend think that a scheme like Estonia’s would help the citizen to be in charge?
Jo White
I totally agree with my hon. Friend.
From the moment we are born, the state begins to gather data: our birth is registered; the NHS stores our health records; we are issued with national insurance and NHS numbers; and His Majesty’s Revenue and Customs tracks us. By having a digital ID, we can see the information the state holds on us, who has been accessing it and why. We can even determine that other people cannot see our data. It is about us having control over our own data.
It is also about security: because the data is divided and compartmentalised, nobody can see data from one Department to another. It is about people having personal control, which is what people in my constituency are calling for.
Emily Darlington (Milton Keynes Central) (Lab)
It is a pleasure to serve under your chairmanship, Mr Turner. For reasons of timing, I will not repeat what my hon. Friend the Member for Bassetlaw (Jo White) said about the important change in the relationship between citizen and state that could come from digital ID—putting the citizen in charge rather than the state knowing too much about us without our knowing what they know.
However, there is another reason why we might want a free, digital, Government-backed ID: £11 billion is lost each year to fraud, and ID theft costs us about £2 billion a year. People need to prove who they are at each and every moment. For too many people, that involves a passport or driver’s licence, which is not affordable for many. Having an ID that allows us to prove who we are could be more secure. We will also need it to show that we can work—there has been a 40% increase in illegal working—and to prove our age, including for the big changes made by the Online Safety Act 2023.
Peter Swallow (Bracknell) (Lab)
My hon. Friend raises the Online Safety Act. Some of my constituents have raised concerns about identity checks to access material online. Would it not have been far easier to prove one’s age online safely and securely if we already had a digital ID, and would that not have helped us to introduce safer checks online?
Emily Darlington
My hon. Friend is absolutely right. All the complaints I have received are about people giving their information to third-party verifiers. If they had a free, digital, Government-backed ID, they could have proved their age to access any over-18 content. People are also concerned that those who should not be accessing the NHS are doing so. The reality is that if there were a Government-backed digital ID, it would be clear whether a person can access the NHS.
I have come up with a list that debunks what the hon. Member for Perth and Kinross-shire (Pete Wishart) said, and I am happy to pass it to him afterwards. I think we need to add a few scientific facts, but I do not have time.
Emily Darlington
I am happy to go through it. First, it is not about centralising data. Rather, digital ID allows the citizen to access federated data. The data stays in the individual Departments; it does not stay on a card—this is not about a card. Digital ID adds a level of security to Government datasets. There is no travel or location data. There is no access to external providers. It uses sovereign tech that allows citizens to know what the Government hold and who is accessing it. There is no new data that the Government do not already hold, and a single login is actually better for a person to prove who they are with a digital ID.
Emily Darlington
I think the right hon. Member will find there is a split in the community because there is a lack of detail.
Emily Darlington
I agree, but there is a lack of detail. When we are at the beginning of the conversation and going out to consultation, which is exactly what we are doing, we have to ask the public what they want. Do they want either of the two scenarios that my hon. Friend the Member for Bassetlaw and I presented, or do they not want access to their Government data in a way that enables them to know what is happening, and so that they can prove who they are without having to pay for a passport or driver’s licence?
Iqbal Mohamed (Dewsbury and Batley) (Ind)
I thank the hon. Member for Perth and Kinross-shire (Pete Wishart) for securing this debate. This is one of the most controversial and divisive issues currently supported by the Government, who have form. I am here on behalf of my constituents, as nearly 100 have written to me opposing the scheme, and nearly 4,000 have signed the e-petition.
We have heard the risks and the issues around data privacy, surveillance culture, user profiling, exclusion, function creep and scope creep. Having worked in the IT industry for over 20 years, as well as in the cyber-security industry, I can say that there is no safe system at the moment. Relying on third-party software, owned by foreign states or companies—
Emily Darlington
Is the hon. Member aware of the Government’s statements that the system would be held internally and use sovereign tech?