Online Harm: Child Protection

Kirsty Blackman Excerpts
Tuesday 24th February 2026

Commons Chamber
Munira Wilson

I have given way to the hon. Member a couple of times. I am just about to finish.

Kirsty Blackman (Aberdeen North) (SNP)

I am confused about what the Liberal Democrats’ proposals are. The proposals laid out by the hon. Lady are not those introduced in the House of Lords. In the House of Lords, only user-to-user services were talked about, not addictive online gaming, for example. Are we discussing a Bill containing the proposals laid out in the House of Lords, or is the hon. Member putting forward new, ethereal proposals? I do not understand what this Bill is going to be. I was expecting to actually see it so that we could discuss it today.

Munira Wilson

The harms-based framework that we proposed in the other place would apply to chatbots and gaming as well. The point is that, as I have already laid out, we would come together and come forward with proposals that we can all agree on.

--- Later in debate ---
Kanishka Narayan

I will simply repeat the point I have made, which is that we are going to act by the summer. We have already sought permissive powers to ensure that the Government are able to act on the outcome of the consultation through rapid legislation. I hope the combination of those two commitments gives the hon. Member some assurance.

The engagement and consultation will take place alongside work with counterparts. We will be monitoring developments in Australia on its social media ban for under-16s to share learnings and best practice. We are steadfast in our belief that the right way to deliver the next steps to protect our children online is to be led by the evidence through our short, sharp three-month consultation.

Kirsty Blackman

The Minister has just said that the Government have already sought permissive powers. I understand that they are going to move an amendment in lieu to the Children’s Wellbeing and Schools Bill, but I am not aware that that amendment has been published yet, much less agreement sought from the House. When will that be published, so that we can see what those permissive powers are supposed to be?

Kanishka Narayan

I thank the hon. Member for that point, and commit to her that we are going to try to do that as soon as possible. She will be aware that the legislative process is already very tight, so I will come back to her and the House with the wording of the motion as soon as possible.

Last week, as I have mentioned, the Secretary of State confirmed that we will take new legal powers to allow us to act quickly on the outcomes of the consultation, delivering on our promises to parents. We will make sure that the wording is presented to the House at the earliest opportunity. We also recognise the importance of parliamentary scrutiny and the expertise that parliamentarians in both Houses provide, and have already committed that when regulations are brought forward, they will be debated on the Floor of the House and there will be a vote in both Houses, ensuring proper scrutiny. We are clear that the question is not whether we will act, but what type of action we will take. We will ensure that we do so effectively, in lockstep with our children and in the interests of British families.

--- Later in debate ---
Julia Lopez (Hornchurch and Upminster) (Con)

Today we are debating something very important: the protection of children from online harms is vital.

I commend the hon. Member for Twickenham (Munira Wilson) on what I thought was a very heartfelt speech, but I fear that her good intent has been rather thrown under the bus by her party leadership. Setting aside the importance of this subject, let us look at their method of bringing it forward—a point which has been raised rather expertly by Members from across the House. Today the Liberal Democrats are doing what they do best: slightly nutty stunts. With all the menace of Captain Mainwaring they are attempting to seize control of the Order Paper and effectively declare themselves not only Government for the day but, with their loosely defined online services Bill, rulers of the internet. It is a gimmick. It is the parliamentary equivalent of boinging into the Chamber on a giant bungee.

Though the hon. Member for Twickenham put a little bit of flesh on the bones in her speech, the motion itself simply requests the power to barge a blank-cheque Bill, for which we have no details, through this House, and in so doing let the Government Benches off the hook. It has all gone a bit Benny Hill. It is a great shame, because it is a distraction when the moment of truth on social media for children is imminently upon us. We know from the panicked recess briefings that the Prime Minister has been caught on the hop on an issue that is of deep concern to families, children, teachers and communities across the country.

Before too long the Children’s Wellbeing and Schools Bill will return to this House and Members will have the chance to vote on a credible proposition: an amendment tabled by the noble Lord Nash that no child under the age of 16 should have access to harmful social media.

Kirsty Blackman

If this is the Conservatives’ stance, why, when consideration of the Online Safety Bill lasted so long—it was even referred back into Committee, which had not happened to a Bill in 20 years—did the Conservatives not ban social media for under-16s through that Bill when they were in government?

Julia Lopez

This is a Conservative amendment in the Lords that has gained cross-party support, so it will be coming back to us. The hon. Member raises an important point about why this policy was not brought in under the Online Safety Act. That Act tried to do many, many things. In many ways, it took so long because it risked becoming a Christmas tree Bill, and many good causes were hung off it. That did cause challenges.

I think that as the debate has moved on we have realised that it is not just about illegal content that children are being exposed to and some of the things that the Online Safety Act was trying to change. There is an issue in general about children being in this space: there are addictive algorithms, and it is not just about illegal material but the fact that it is changing how children are thinking about interacting. Maybe we have to stand back as a society and say, “This is simply not the right place for children to be. We can create adult online spaces, but for children we think that there are other ways in which they should be interacting with the world.”

--- Later in debate ---
Julia Lopez

I can agree with that. My point is that this Government are trying to suggest that a consensus can be found in the absence of their having a policy position. They are talking about a consultation, but what on earth are they consulting on? Nobody has a clue. They have not been able to say anything about what they actually want to do, because the Prime Minister has no opinions, which is why he is in such deep trouble. Those on the Labour Benches can get out of their tree and get all uppity about it, but this—[Interruption.] No, the Prime Minister is being blown around like a paper bag on this issue, and everybody knows it. First of all, he said that his children did not want to ban social media; now he says that his children are the reason why he wishes to ban social media. He said there is going to be a consultation, but it has not materialised. What does this man actually think?

Kirsty Blackman

I am glad that the hon. Member has been very clear that her position is that she supports the Lords amendment that seeks to ban social media for children. Is she aware that it would not apply in Scotland? The territorial extent of the Children’s Wellbeing and Schools Bill, apart from one clause, does not include Scotland. I take it that her position is that she only wants a social media ban for children who do not live in Scotland.

Julia Lopez

I am sure the applicability of the legislation in Scotland is something that can be debated when the Bill comes before the House.

To give them credit, many Labour MPs understand the fact that there is an absence of any Government position, and they will not be taking their foot off the pedal. I suspect that many may have the guts to speak out today—although perhaps not. Those MPs recognised immediately that a consultation is a mechanism for a delay that goes beyond the summer and into another parliamentary year before the sniff of legislation. That holding position is now falling apart, as we have seen from the Minister here today. It is the threat of a very large group of Labour MPs backing the Conservatives’ Lords amendment that is pushing this Government into action—it is government by rebellion. We ask the Liberal Democrats not to let us be distracted from the moment of truth that is coming up, when we hope there will be cross-party support for the noble Lord Nash’s amendment.

For too long, the internet has been treated as a space that cannot be governed. It has functioned like a pioneer society, with extraordinary opportunity but minimal rules. However, pioneer societies improvise customs and eventually retrofit themselves with rules to sustain societies, often after hard-won experience and dispute. That is the process through which we are now going, and we are realising that, as the online society was built, we were not vigilant enough when it came to protecting childhood. We did not recognise that this new territory would bleed into the old world. [Interruption.] The Minister is shouting from the Front Bench that I am embarrassing myself. We as a Government brought forward the Online Safety Act, but there are gaps in it, and we have taken a clear position as the Opposition that we think children should not be on social media. He is looking very angry, but what is his view? Can he stand up and tell us what his personal view is? As the Minister with this responsibility, what does he think should be done, having launched his consultation with such earnestness? Come on, tell us! Would he like to tell us?

--- Later in debate ---
Julia Lopez

I am sure that the issue of the functionality list can be explored as time goes by.

It is important to point out that this is not a moral panic but a structural problem. Today the Leader of the Opposition gathered a panel of grieving parents who had lost their children, and in that context negative online activity was recognised to have real-world and utterly tragic consequences. The children had been drawn into dangerous challenges, coercive relationships, bullying and bribery, all of which created despair in those young minds.

That showed us plainly why the pioneer phase must now come to an end, at least where children are concerned. Pioneer societies do not remain lawless forever; eventually they are retrofitted with rules and boundaries, and protections for the vulnerable. It is striking that, after years of the problem building up, countries around the world are reaching the same conclusion with remarkable synchronicity—not because it is fashionable, because Governments are copying one another or because anyone thinks that this will be particularly easy to impose and enforce, but because the evidence has accumulated to a point at which denial is no longer credible. If social media were broadly harmless for children, this would not be happening, but Governments with very different political traditions are acknowledging the same reality: that when it comes to children, some control must be wrested back. I suspect that this trend will be reflected vividly in the Chamber today, with examples from across the nation of what is happening in the real world because of the laxity in the online world.

Kirsty Blackman (Aberdeen North) (SNP)

I asked the hon. Lady’s Government to ban suicide forums that encourage young people to harm themselves. I asked her Government to ban eating disorder forums that encourage eating disorders. Her Government refused to do that in the Online Safety Act 2023, despite our asking for it to happen. How can she stand there now and take the moral high ground when her Government refused to ban the worst, most egregious, most harmful platforms? The Conservatives do not have a moral high ground on this issue.

--- Later in debate ---
Emily Darlington

I absolutely agree. Young people, particularly those in their mid-teenage years, understand this issue in a way that sometimes we do not because, quite frankly, our online experience is completely different from theirs. If Members want to test that, they should open an app such as Pinterest and compare what is fed into their Pinterest boards with their child’s Pinterest boards. It is a completely different experience. If Members do not have children, they should ask a younger member of staff to open the same app on their two phones, and they will see a completely different world.

Kirsty Blackman

A local organisation in my constituency, CyberSafe Scotland, surveyed children about what they were being fed on TikTok. There is a road in my constituency called North Anderson Drive, and children on one side of North Anderson Drive were being fed different content to the children on the other side of it. It is not just an age thing; it is really specific, and we cannot understand what each individual person is seeing because it is different for everybody.

Emily Darlington

That is a very important point about how sophisticated the technology has become. When we ask companies to take action to stop outcomes, the technology exists to do that. We are not asking them to reinvent the wheel or come up with new technology. It already exists because they are even microtargeting two different sides of the road.

Having discussed this with experts, parents and—most importantly—young people, what do I think we need to consider? First, we need to fully and properly implement the Online Safety Act 2023. That must be done at speed, and it requires nothing from the House. It has been a request of the Secretary of State and the Minister, and I recommend that Ofcom gets on and does that as quickly as possible. We must make safe spaces for children online. How do we do that? Part of the answer is ensuring that content is related to ratings that we already understand as parents, such as those from the British Board of Film Classification. I have been asking YouTube what rating YouTube Kids has for about a year now. Is it rated U? Is it 12A? Is it 15? It cannot tell me, because it does not do things on that basis.

As a parent, I want to know the rating before allowing my children on an app, because parents have a role in this as well. All apps should be rated like video games. Roblox has a 5+ rating, which does not exist in video game ratings. We see ratings such as 4+ or 9+, but those are made up. At the parents’ forum that I held after the survey, one parent said that she walked in on her nine-year-old playing “guns versus knives”—on an app that is rated 5+. The ratings on apps mean nothing, yet we have video game ratings that we as parents understand, so why are they not used? Should in-app purchases ever be allowed for young children? What is the age at which in-app purchases should be allowed in a game?

We must consider the time limits for the different stages of brain development. We have guides on fruit and vegetables that recommend five a day to parents. We all know that. Schools use the same language, we use the same language, yet we have nothing to support parents in deciding how long a child should be online at different stages of brain development. I hope that the evidence that the Science, Innovation and Technology Committee collects will help inform that.

We need to change addictive and radicalising platform algorithms. To protect children from child sexual abuse images, we need to talk to those behind iOS and Android to stop the creation of self-generated child sexual abuse images—some 70% to 80% of child sexual abuse images are self-generated—and we need to stop end-to-end encryption sites from sharing them. We have technology that can do that. We should always keep the ability to ban in our pockets, but any ban should be for particular apps. We should not ban our children and young people from having an online experience that is good.

--- Later in debate ---
Kirsty Blackman (Aberdeen North) (SNP)

This makes me more frustrated than just about anything else in this place: the levels of ignorance, stupidity and hypocrisy from so many people in here, specifically about children’s access to social media. I fully intended to support the Lib Dems’ position, but the longer their spokesperson, the hon. Member for Twickenham (Munira Wilson), spoke, the less I wanted to do so.

I do not believe that the Government’s position on this is 100% right. I am glad that they are having a consultation, but I do not like the way that they are amending the Children’s Wellbeing and Schools Bill, which is a devolved Bill, to change the territorial extent to bring that into scope. A Bill that we have not scrutinised, because it is a devolved Bill, will now have a reserved section in it. At the moment, the Bill does not apply to Scotland, apart from one clause; now, it will apply to Scotland, because it will include this. As we have not had the opportunity to scrutinise it, we have not been involved in that process. I do not think it is right that the proposed amendment should come forward in this way, although I appreciate why the Government are doing it. That is why I am asking for the amendment to be shared with us as soon as possible so that we can see it, because we have not had a chance to look at the Bill as it has gone along.

The Lib Dems have said that they have made their position clear. I have so far been able to find three amendments and new clauses to the Online Safety Act put forward by Lib Dems during its passage through the House. One of them was put forward by the Lib Dem spokesperson, who asked for an independent evaluation within 12 months of whether more platforms should be subject to child safety duties—this is the same party that is currently accusing the Government of kicking the can down the road, despite asking for a 12-month independent evaluation. There is hardly anything in the Lib Dems’ previous positions that helps me to understand their current position.

The Tory party’s position is totally incoherent, too. The Tories refused my amendment on reducing habit-forming and algorithmic features. They also refused my amendments on livestreaming.

By the way, before the Minister’s “Dear colleague” letter, livestreaming had been mentioned 53 times across the two Houses. A third of those mentions were me talking about how livestreaming for children should be banned. Before today, Roblox had been mentioned 32 times across both Houses—15 of those mentions were me saying that Roblox is not a safe platform for children.

I am massively in favour of improving the online world for children. I think social media should be about looking at videos of cats. I love videos of cats—they are absolutely brilliant. That is what it should be for. I also think it is a great place for children to interact with one another.

Like some others in the Chamber, I have been making the case that there are dangers on social media that can be easily tackled by changing the Online Safety Act. We could have got rid of those algorithmic features for children, for example. We could have got rid of livestreaming for children through the amendment I tabled. We could have got rid of children’s access to private messaging features with people they do not know, through another amendment that I tabled to the Online Safety Act.

I do not like the way the Government are doing this, though. They are proposing an amendment to the Children’s Wellbeing and Schools Bill, and then we will have secondary legislation that will, possibly, amend the Online Safety Act—I am not 100% clear on how it is going to go. I appreciate that there needs to be a consultation.

Before the 2024 Parliament, there were about three people in this entire place who had any grip of what the online world might have been like for children. One of them was the hon. Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah), who talked about some of these things. I asked the Minister at the time whether Fortnite would be included in the scope of the Online Safety Act, and they said, “If there’s text chat.” Text chat in Fortnite—it is an online game! There is not enough expertise in this place. Much as I hugely appreciate the people who work on writing Bills and the work of some of the experts at Ofcom, they are not experiencing the online world that children are experiencing. That is why we need to listen to ensure that any changes that are made tackle the most harmful behaviours, places and functionalities on the internet.

I appreciate that the Government are trying to take action on this now. However, one of the few things that has made me cry in frustration in this place was one of the first things this Government did when they came in, when they brought in secondary legislation to categorise platforms and refused to include the small, high-risk platforms that had been added in the House of Lords. They said they were categorising as category 1 only platforms like Facebook, which meet a certain threshold. I was so frustrated by that choice by the Government.

There needs to be more listening and learning about where the actual dangers are, and taking action on them. Please, do that in consultation with those of us who do understand this. Please, listen to experts on this.

--- Later in debate ---
Victoria Collins (Harpenden and Berkhamsted) (LD)

I have been quite shocked at some of the procedural discussion, for several reasons. First, we are acting as if this has just come up, but even in the House of Commons under this mandate, as my hon. Friend the Member for South Devon (Caroline Voaden) mentioned, the safer phones Bill was put forward in 2024. As Liberal Democrats, we put forward amendments to change the age of data consent and to ban addictive algorithms. There have also been calls to act on doomscroll caps, and we have highlighted the harms of AI chatbots. Yet we are at a point—I absolutely respect what the hon. Member for Aberdeen North (Kirsty Blackman) was saying on this—where a consultation was proposed by the Government over a month ago, but we still do not know the details. There are things going through the House of Lords that, again, we do not know the details of. At the very least, the Liberal Democrats are trying to give the space for that and say, “Yes, we need to start putting forward that legislation.” If there is another chance to debate that, what is the harm in this motion, given that this is such a crucial issue?

Secondly, it is not as if this is an issue that turned up yesterday. As the hon. Member for Newcastle upon Tyne Central and West (Dame Chi Onwurah) talked about, these harms have been happening for years—over 22 years for Facebook. I will go on to say more about that in a moment. Other countries around the world are showing leadership on this and saying that we have to act now. My point is that at the very least, a consultation could have been launched earlier. This is not something new in this Parliament. We are saying that action needs to be taken.

Most importantly, the parents, children and experts watching this debate want to see us taking this issue seriously. Children and young people are at the heart of this. I think back to the first time I met some of the sixth-form students at Ashlyns school in Berkhamsted. I will never forget sitting around that table with one sixth-former—let’s call him James. He told me about his fears for the mental health of his friends. He warned about the self-harm that he was seeing among his peers, which his teachers were not even aware of, and he talked about the role of social media. A few weeks later, I was pulled to one side at St George’s school in Harpenden, where some young women shared with me their concerns about the growing misogyny lived out by young men, which started on social media.

Since then, I have carried out a “Safer Screens” tour meeting young people. Students have talked about brain rot and seeing the extreme content that the algorithm continues to push on them, even when they try to block it—the hon. Member for North West Cambridgeshire (Sam Carling) talked about that. One student said, “It is as addictive as a drug”, and they see the harms of it every day.

This is the tipping point, and I am surprised that many Members think that it is not. This is that moment. Parents, teachers, experts and even young people are crying out for action, and have been for a long time, to tackle the social media giants that have no care for their mental health. As I said, this tipping point has been years in the making. Facebook was launched 22 years ago. Indeed, a Netflix documentary from six years ago started to highlight the warnings from people who worked in tech about social media. One expert said that it is

“using your own psychology against you.”

Having worked in tech myself, I have read the books and received the training on how these social media giants get us hooked—it is built in.

Awareness is growing. I thank Smartphone Free Childhood, Health Professionals for Safer Screens, the Molly Rose Foundation, the Internet Watch Foundation and the Online Safety Act Network, along with projects such as Digital Nutrition—the hon. Member for Milton Keynes Central (Emily Darlington) and others have made the analogy of an online diet—that have worked to ask what the guidance should be. Those are just a few of the organisations I could name that have worked tirelessly to ensure these voices are heard.

I also thank pupils in my constituency from Roundwood Park, St George’s, Sir John Lawes, Berkhamsted and Ashlyns schools, and students who have openly shared their experiences, hopes and concerns about the online world. Their concerns are not just about content; they are also about addiction. Let me be clear: as my hon. Friend the Member for Mid Dunbartonshire (Susan Murray) mentioned, the core of this issue is that this is the attention economy, so our children are the product. Their attention, focus and time are being sold to line the pockets of tech billionaires. Governments around the world are finally taking action. This is a seatbelt moment where we need to say, “Enough is enough.”

The hon. Member for Stoke-on-Trent Central (Gareth Snell) talked about trying to get this right. I respect that, but I often think that if we were able to walk down the street and see a 3D version of what young people are seeing in their online world, action would have been taken much sooner. My hon. Friend the Member for Eastleigh (Liz Jarvis) talked about holding tech companies to account. We need to start unpacking what children are seeing and finally take action.

The Online Safety Act has done great work, but it does not go far enough. It sets out illegal harms and a code for inappropriate content for children and over-18s, but not a framework of legal harms or age-appropriate content. The social media age of 13 is based on data processing that is managed by the Information Commissioner’s Office and has nothing to do with what is age-appropriate in that context. Dr Kaitlyn Regehr, the author of “Smartphone Nation”, talks about how the Act is reactive, not proactive, and leaves it up to the user to report problems rather than putting the burden of safety on tech giants.

We must ensure that we build on the OSA and learn the lessons from Australia. The hon. Member for Milton Keynes Central talked about this. In Australia, a wide definition of social media has left it to a small group to decide what is appropriate. That has meant that YouTube has been banned for under-16s, but YouTube Kids has not, with no real framework for why apart from the fact that they deem YouTube Kids safer. WhatsApp has not been banned, which is possibly the right thing, but legislators are left to play whack-a-mole as new social media apps pop up. There is no framework for harm from AI.

Kirsty Blackman

Will the hon. Member give way?

Victoria Collins

Very briefly; I want to leave the Minister time.

Kirsty Blackman

Australia just bans children from holding accounts; it does not ban them from using any of the platforms. They can still use YouTube; they just cannot have an account.

Victoria Collins

Absolutely. YouTube is everywhere. It is embedded in almost every website that has videos.

The hon. Member for Aberdeen North (Kirsty Blackman) asked about AI chatbots. Under the proposals we put forward in the Lords, AI chatbots count as user-to-user services. We have highlighted for a long time that the potential harms from AI chatbots are not covered. Ofcom has clarified that AI chatbots are user-to-user services, but the harms, such as AI psychosis, which my hon. Friend the Member for Winchester (Dr Chambers) alluded to, are not covered. That is why the harms-based approach we are putting forward is so important.

As my hon. Friend the Member for Twickenham (Munira Wilson) said when she opened the debate, the Liberal Democrats have been leading the work on online safety in this Parliament. We were the first party to push a vote on banning addictive algorithms. We have called for health warnings and a doomscroll cap. Today, we are calling for a vote on the age for social media and online harms. We are calling for a ban on harmful social media based on a film-style age rating. That harms-based approach holds tech companies to account, sets a pioneering approach to online standards and prepares for the future of AI chatbots and games like Roblox, which has already arrived.

In the offline world, anyone buying a toy for young children at this point would expect age ratings so that they know it is appropriate and safe, and films have had age ratings for over 100 years, yet we have not had that in the online world. The harms-based approach is backed by 42 charities and experts who work to protect children, stop violence against women and girls and make the internet a safer place.

We are also calling for a reset, because enough is enough. That includes a minimum age of 16 for social media and real accountability for tech companies with film-style age ratings. We need to make sure that we get the best out of the internet for young people and protect them from harms.

For me, it comes back to James, his friends and the young women and children I have spoken to around my constituency. We do not have time to waste—that is why we are pushing for these Bills. We are calling for action, and I call on MPs across the House to put children before politics, exactly as we did in the Lords. The amendment in the Lords could mean a blanket ban. We were uncomfortable with that approach—we much prefer ours—but we knew that the future of children came first. We must help the next generation to get the best of the online world—including those young people who have spoken out and shared their concerns and horror stories—and protect them from the worst of it.

Science and Discovery Centres

Kirsty Blackman Excerpts
Wednesday 14th January 2026


Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not form part of the official record

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

Thank you for keeping us all in line today, Mrs Harris—although I think this is a fairly good-tempered debate, because we are all in favour of our local science centres. I congratulate the hon. Member for Montgomeryshire and Glyndŵr (Steve Witherden) on introducing the debate. It was really cool to hear about the CAT; it is one of those science centres that has been around a bit longer than the others, but it sounds as though it is doing absolutely amazing work.

That is one of the things I wanted to touch on: although this is a network of science centres, they are all different. They all work for the benefit of their local communities, looking at the innovation, technology and science that make the most sense for those communities, rather than at the Government’s priorities, because that is how it should be. They should be capturing the imagination of people in the local area, and they can do that only if what they offer is relevant and if they are able to keep moving with the times and capturing that imagination.

In Aberdeen we have the Aberdeen Science Centre, which was in my constituency until the boundaries changed. It was refurbished in 2020 and is in a stunning building—it was an old tram shed, so it looks really cool—but it was first opened in 1990 on a different site, and next month it will be 36 years old. I think I first visited the Aberdeen Science Centre, which was originally called Satrosphere, before I even went to school. It has always been part of the fabric of our city. Everybody goes there as a schoolchild; it is a place that everybody goes along to and visits, and that everybody knows about.

When the centre moved to the new premises, it suggested getting rid of one of the exhibits, a sheep: visitors press the button, the sheep eats the food, the food goes through the sheep and then something comes out the other end. There was uproar from the parents of the children who currently go to the centre, saying, “How dare you get rid of this exhibit that we loved when we were children?” The comment that the hon. Member for Montgomeryshire and Glyndŵr made—that a visit as a child can have a lifelong impact—absolutely resonates. Everyone who has had those science centres in their life for as long as I have will remember those visits from when they were a child.

I sometimes find it difficult in debates when we talk about the economic impact of these things or the innovation they drive. We could also just talk about the fact that they are joyful places to be. We do not have to justify an art gallery on the basis of its economic impact; we can justify these science centres on the fact that they create curiosity and joy in children—and adults. I love going to science centres; it is very difficult to get me out of Aberdeen. I come to London for work, but I do not like leaving my city; it is the best place on earth. However, I say to my hon. Friend the Member for Dundee Central (Chris Law) that the Dundee Science Centre is one of the few places that I would trail to with my children when they were little, because it is absolutely excellent. It had diggers that they could play with, and my little boy, who was tractor-obsessed, completely loved going to visit.

As the hon. Member for Winchester (Dr Chambers) was saying, we are in a time when people willingly deny facts and we are told to be fed up with experts. Having that hands-on experience of science—actually talking about how the earth moves, the way that climate change is changing our society and creating extreme weather events, or the industries in local areas and the science that fuels them—gets the next generation excited about those things. It gets them thinking about how those things work in a way that the school classroom cannot always manage. Sometimes it does—sometimes we are lucky enough to have an inspiring teacher who can make us think and consider the future; there are many of them out there—but going and getting hands-on in a science centre is something really special.

Lastly, on the differences between centres: our science centre in Aberdeen has in recent times covered climate change and has had a link with NASA, when a spacecraft was made in the science centre. It is currently running a Demystifying AI programme, and there are some ridiculous photos of me in the science centre trying out virtual reality, because I always get super excited by it. Given the importance of these centres to us and all our constituents, the changes they make in people’s lives and their lifelong impact, it is reasonable for us to ask two things: please look at funding, and please choose a Government Department. It does not cost the Government anything to do that. Just choose one—and champion these centres.

Social Media: Non-consensual Sexual Deepfakes

Kirsty Blackman Excerpts
Monday 12th January 2026


Commons Chamber
Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- View Speech - Hansard - -

Will the Secretary of State consider adding AI responses to prompts by users to the definition of user-generated content, so that they are included in the scope of the Online Safety Act? Will she or one of her Ministers meet me to discuss my concerns about the risks posed to children by their being able to livestream?

Liz Kendall Portrait Liz Kendall
- View Speech - Hansard - - - Excerpts

I will definitely meet anyone who has evidence about that and what we need to do.

Pride Month

Kirsty Blackman Excerpts
Monday 23rd June 2025


Commons Chamber
Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- View Speech - Hansard - -

Happy Pride to everyone in Aberdeen, in Scotland and in every part of these islands. The world is a pretty scary place right now in a huge number of ways. As the Minister said, LGBT people are normal just like the rest of us, and they are similarly feeling scared about the state of the world, not least in the wake of the Supreme Court decision and the impact that is having on trans people. The Minister said:

“Our hard-won freedoms are never won in perpetuity”,

and it is the case that trans people’s rights—their right to a private life and their human rights—have been rolled back as a result of this reinterpretation of the Equality Act 2010. People are less able to live their lives with the freedom they should have, and the Government need to do something about that.

People keep using the word “clarity” about the EHRC guidance, but there is no way that it provides that. It requires trans people regularly to out themselves. They may still have protection on the basis of gender reassignment, but trans women no longer have protection as women as a result of this reinterpretation of the Equality Act, and that is not the way it should be. All I can say to my trans constituents and people across the United Kingdom is that I am sorry; we need to get this sorted and we need to keep fighting.

On the decision taken by a number of Pride organisations, we have no entitlement to be there. There is no entitlement for political parties to take part in Pride. For all that we have done great things 50 years ago, 25 years ago, five years ago or two years ago—for example, the changes providing recourse for LGBT veterans—that does not mean that we should not be held to account for our lack of action, for failing to protect trans people properly or for the increase in hate crimes that we are seeing. It is absolutely right that Pride organisations should be able to use their voice to say to every one of us in this House, “You are not doing well enough. You need to do better. We need you to do more to protect the community.” If that is how they choose to use their voice, they should absolutely go for it.

I took part in our Pride in Aberdeen, as I have done on many occasions, including the first one 25 years ago. I marched with the crowd, as I always do in the Pride parade—not with Out for Independence, but with all the people I represent. It is the case that every one of us in this place needs to do better. We need to improve lives for our trans constituents. We need to fight this rolling back of rights, because people are terrified, and they are right to be pretty scared right now.

--- Later in debate ---
Samantha Niblett Portrait Samantha Niblett (South Derbyshire) (Lab)
- View Speech - Hansard - - - Excerpts

I had not intended to say what I am about to say, before the bit that I did intend to say, but, inspired by the Minister’s comment that coming out matters, I thought I would use this very public forum to say that I am a bisexual woman. Some people know; some do not. I do not wear it like a badge any more than I would expect a heterosexual person to walk around saying, “Hey, guess what, I’m attracted to men”—or women, depending on what gender they are, or otherwise.

The reason that I feel compelled to mention that publicly, before I get on to the good bit of my speech—please, somebody, intervene on me—is that I held back on showing my support for the LGBTQ+ community on my Facebook page for fear of retribution ahead of the local elections, when a certain party got into power at Derbyshire county council. Trying to appease that kind of support did not win any votes, so after that happened, I doubled down on what I believe in and who I am. I posted in support of the Day Against Transphobia, Biphobia and Homophobia, and I said, “If you dare make a negative comment, or anything alluding to one, such as, ‘What is a woman?’, you will be blocked from my page, because there’s falling on the right side of history and there’s falling on the wrong side of history, and you are wrong.”

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I congratulate the hon. Member on taking this opportunity to say what she has said. It is not easy to say something like that in a Chamber like this. Having done something similar not that long ago, I absolutely respect her, and I join her in celebrating Pride month.

Chris Bryant Portrait Chris Bryant
- Hansard - - - Excerpts

And have an extra minute.

Online Safety Act: Implementation

Kirsty Blackman Excerpts
Wednesday 26th February 2025


Westminster Hall


Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

I thank you for chairing this debate, Mr Stringer, and I congratulate the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright) on bringing this debate to Westminster Hall. It is a subject we have talked about many times.

I want to make a number of points. The first is about safety by design. Page 1 of the Act states that the internet should be “safe by design”, yet everything that has happened since in the Act’s implementation, from the point of view of both Ofcom and the Government in respect of some of the secondary legislation, has not been about safety by design. It has been about regulating specific content, for example, and that is not where we should be. Much as I was happy that the Online Safety Act was passed, and I was worried about the perfect being the enemy of the good and all that, I am beginning to believe that the EU’s Digital Services Act will do a much better job of regulating, not least because the Government are failing to take enough action on this issue.

I am concerned that Ofcom, in collaboration with the Government, has managed to get us to a situation that makes nobody happy. It is not helpful for some of the tech companies. For example, category 1 is based solely on user numbers, which means that suicide forums, eating disorder platforms, doxing platforms and livestreaming platforms where self-generated child sexual abuse material is created are subject to exactly the same rules as a hill walking forum that gets three posts a week. In terms of proportionality, Ofcom is also failing the smallest platforms that are not risky, by requiring them to come to a three-day seminar on how to comply, when they might be run by a handful of volunteers spending a couple of hours a week looking after the forum and moderating every post. It will be very difficult for them to prove that children do not use their platforms, so there is no proportionality at either end of the spectrum.

In terms of where we are with the review, this is a very different Parliament from the one that began the conversations in the Joint Committee on the Draft Online Safety Bill. It felt like hardly anybody in these rooms knew anything about the online world or had any understanding of it. It is totally different now. There are so many MPs here who, for example, have an employment history of working hard to make improvements in this area. As the right hon. and learned Member said, we now have so much expertise in these rooms that we could act to ensure that the legislation worked properly. Rather than us constantly having to call these debates, the Government could rely on some of our expertise. They would not have to take on every one of a Joint Committee’s recommendations, for example, but they could rely on some of the expertise and the links that we have made over the years that we have been embedded in this area to help them make good decisions and ensure some level of safety by design.

Like so many Members in this place, I am concerned that the Act will not do what it is supposed to do. For me, the key thing was always keeping children safe online, whether that is about the commitments regularly given by the Government, which I wholeheartedly believe they wanted to fulfil, about hash matching to identify grooming behaviours, or about the doxing forums or suicide forums—those dark places of the internet—which will be subject to exactly the same rules as a hill walking forum. They are just going to fill in a risk assessment and say, “No children use our platform. There’s no risk on our platform, so it’s all good.” The Government had an opportunity to categorise them and they chose not to. I urge them to change their mind.

Draft Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025

Kirsty Blackman Excerpts
Tuesday 4th February 2025


General Committees
Martin Wrigley Portrait Martin Wrigley (Newton Abbot) (LD)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairship, Sir Christopher. I am disappointed in this statutory instrument. I recognise the Minister’s acknowledgment of the small sites, high-harm issue, but the issue is far more important and we are missing an opportunity here. Can the Minister set out why the regulations as drafted do not follow the will of Parliament, accepted by the previous Government and written into the Act, that thresholds for categorisation can be based on risk or size? That was a long-argued point that went through many iterations.

The then Minister accepted the amendment that was put forward and said:

“many in the House have steadfastly campaigned on the issue of small but risky platforms.” —[Official Report, 12 September 2023; Vol. 737, c. 806.]

He confirmed that the legislation would now give the Secretary of State the discretion to decide whether to set a threshold based on the number of users or the functionalities offered, or both factors, with the change ensuring that the framework was as flexible as possible in responding to the risk landscape. That has been thrown away in this new legislation. The Minister just said that we must do everything in our power, and yet the Government are throwing out a crucial change made to the Act to actually give them more power. They are getting rid of a power by changing this.

The amendment was to ensure that small sites dedicated to harm—such as sites providing information on suicide or self-harm, or sites set up to target abuse and hatred at minority groups, as we saw in the riots in the summer—were subject to the fullest range of duties. When Ofcom published its advice, however, it disregarded this flexibility and advised that regulations should be laid bringing only the large platforms into category 1.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

Is the hon. Member as concerned as I am that the Government seem to be ignoring the will of Parliament in their decision? Is he worried that young people particularly will suffer as a result?

Martin Wrigley Portrait Martin Wrigley
- Hansard - - - Excerpts

Absolutely—I am. The Secretary of State’s decision to proceed with this narrow interpretation of the Online Safety Act provisions, and the failure to use the power they have to reject Ofcom’s imperfect advice, will allow small, risky platforms to continue to operate without the most stringent regulatory restrictions available. That leaves significant numbers of vulnerable users—women and individuals from minority groups—at risk of serious harm from targeted activity on these platforms.

I will put a few more questions to the Minister. How do His Majesty’s Government intend to assess whether Ofcom’s regulatory approach to small but high-harm sites is proving effective, and have any details been provided on Ofcom’s schedule of research about such sites? What assessment have the Government made of the different harms occurring on small, high-harm platforms? Have they broken this down by type of harm, and will they make such information available? Have the Government received legal advice about the use of service disruption orders for small but high-harm sites? Do the Government expect Ofcom to take enforcement action against small but high-harm sites, and have they made an assessment of the likely timescales for enforcement action? Will the Government set out criteria against which they expect Ofcom to keep its approach to small but high-harm sites under continual review, as set out in their draft statement of strategic priorities for online safety?

Was the Minister aware of the previous Government’s commitment that Select Committees in both Houses would be given the opportunity to scrutinise draft Online Safety Act statutory instruments before they were laid? If she was, why did that not happen in this case? Will she put on record her assurances that Online Safety Act statutory instruments will in future be shared with the relevant Committees before they are laid?

For all those reasons, I will vote against the motion.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I appreciate the opportunity to speak in this Committee, Sir Christopher. Like at least one other Member in the room, I lived the Online Safety Bill for a significant number of months—in fact, it seemed to drag on for years.

As the Minister said, the Online Safety Act is long overdue. We have needed this legislation for 30 years, since I was a kid using the internet in the early ’90s. There has always been the risk of harm on online platforms, and there have always been places where people can be radicalised and can see misogynistic content or content that children should never be able to see. In this case, legislation has moved significantly slower than society—I completely agree with the Minister about that—but that is not a reason for accepting the statutory instrument or agreeing with the proposed threshold conditions.

On the threshold conditions, I am unclear as to why the Government have chosen 34 million and 7 million for the average monthly active users. Is it 34 million because Reddit happens to have 35 million average UK users—is that why they have taken that decision? I absolutely believe that Reddit should be in scope of category 1, and I am pretty sure that Reddit believes it should be in scope of category 1 and have those additional duties. Reddit is one of the places where the functionalities and content recommendation services mean that people, no matter what age they are, can see incredibly harmful content. They can also see content that is incredibly funny—a number of brilliant places on Reddit allow people to look at pictures of cats, which is my favourite way to use the internet—but there are dark places in Reddit forums, where people can end up going down rabbit holes. I therefore agree that platforms such as Reddit should be in scope of category 1.

The Minister spoke about schedule 11 and the changes that were made during the passage of the Act. The Minister is absolutely right. Paragraph 1(5) of that schedule states:

“In making regulations under sub-paragraph (1), the Secretary of State must take into account the likely impact of the number of users of the user-to-user part of the service, and its functionalities, on how easily, quickly and widely regulated user-generated content is disseminated by means of the service.”

However, that does not undo the fact that we as legislators made a change to an earlier provision in that schedule. We fought for that incredibly hard and at every opportunity—in the Bill Committee, on the Floor of the House, in the recommitted Committee and in the House of Lords. At every stage, we voted for that change to be made, and significant numbers of outside organisations cared deeply about it. We wanted small high-risk platforms to be included. The provision that was added meant that the Secretary of State must make regulations relating to

“any other characteristics of that part of the service or factors relating to that part of the service that the Secretary of State considers relevant.”

That was what the Government were willing to give us. It was not the original amendment that I moved in Bill Committee, which was specifically about small high-risk platforms, but it was enough to cover what we wanted.

What functionalities could and should be brought into scope? I believe that any service that allows users to livestream should be in the scope of category 1. We know that livestreaming is where the biggest increase in self-generated child sexual abuse material is occurring. We know that livestreaming is incredibly dangerous, as people who are desperate to get access to child sexual abuse material can convince vulnerable young people and children to livestream. There is no delay in which that content can be looked at and checked before it goes up, yet the Government do not believe that every service that allows six-year-olds to livestream should be within the scope of category 1. The Government do not believe that those services should be subject to those additional safety duties, despite the fact that section 1 of the Online Safety Act 2023 says platforms should be “safe by design”. This is not creating platforms that are safe by design.

The regulations do not exclude young people from the ability to stream explicit videos to anyone, because they include only services with over 34 million users, or over 7 million when it comes to content recommendation. I agree that services above those thresholds are problematic, but there are other really problematic services, causing life-changing—or in some cases, life-ending—harm to children, young people and vulnerable adults, that will not be in the scope of category 1.

Generally, I am not a big fan of a lot of the things that the UK Government have done; I have been on my feet in the Chamber arguing against a significant number of them. This is one of the things that makes me most angry, because the Government, by putting forward this secondary legislation, are legislating in opposition to the will and intention of the Houses of Parliament. I know that we cannot bind a future Government or House, but this is not what was intended, agreed and voted on, nor the basis on which Royal Assent was given; that basis was the assurance from Government Ministers that they would look at those functionalities and at small but high-risk platforms.

Given what Ofcom has put out in guidance and information on what it is doing about small but high-risk platforms, why are we not using everything that is available? Why are the Government not willing to use everything available to them to bring those very high-risk platforms into the scope of category 1?

The changes that category 1 services would be required to make include additional duties; for a start, they are under more scrutiny—which is to be expected—and they are put on a specific list of category 1 services, which will be published. That list includes platforms such as 4chan, which some people may never have heard of. Responsible parents will see that list and say, “Hold on a second. Why is 4chan on there? I don’t want my children going on there. It is clearly not a ginormous platform, so it must be on there because it is a high-risk service.” Parents will look at that list and talk to their children about those platforms. Being on the category 1 list at all, never mind the additional duties, would have a positive impact. Putting suicide forums on that list of category 1 services would have a positive impact on the behaviour of parents, children, and the teachers who teach those young people how to access the internet safely.

I guarantee that a significant number of teachers and people who are involved with young people have never heard of 4chan, but putting it on that list would give them an additional tool to approach young people and talk about the ways in which they use the internet.

Danny Chambers Portrait Dr Danny Chambers (Winchester) (LD)
- Hansard - - - Excerpts

I thank the hon. Lady for speaking so passionately on this matter. As the Liberal Democrat mental health spokesperson, I am increasingly coming across cases where it is not just adults asking children to livestream, but children doing so peer-to-peer, not realising that it is illegal. As the hon. Lady touched on, the mental health impact is huge and lifelong. Someone can have a digital footprint that they can never get rid of, and children who are uninformed and uneducated about the impacts of their decisions could be affected decades into the future.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I completely agree. That is an additional reason why livestreaming is one of my biggest concerns. That functionality should have been included as a matter of course. Any of the organisations that deal with young people and the removal of child sexual abuse material online, such as the Internet Watch Foundation, will tell you that livestreaming is a huge concern. The hon. Member is 100% correct.

That is the way I talk to my children about online safety: once something is put online—once it is on the internet—it cannot ever be taken back. It is there forever, no matter what anyone does about it, and young people may not have the capacity to understand that. If systems were safe by design, young people simply would not have access to livestreaming at all; they would not have access to that functionality, so there would be that moment of thinking before they do something. They would not be able to do peer-to-peer livestreaming that can then be shared among the entire school and the entire world.

We know from research that a significant amount of child sexual abuse material is impossible to take down. Young people may put their own images online, or somebody else may share them without their consent. Organisations such as the Internet Watch Foundation do everything they can to try to take down that content, but it is like playing whack-a-mole; it keeps coming up again and again. Once somebody has fallen into that trap, the content cannot be taken back. If we were being safe by design, we would ensure, as far as the Government, we or Ofcom possibly could, that no young person would be able to access that functionality. As I said, it should have been included.

I appreciate what the Government said about content recommendation and the algorithms that are used to ensure that people stay on platforms for a significant length of time. I do not know how many Members have spent much time on TikTok, but people can start watching videos of cats and still be there an hour and a half later. The algorithms are there to try to keep us on the platform. They are there because, actually, the platforms make money from our seeing the advertisements. They want us to see exciting content. Part of the issue with the content recommendation referenced in the conditions is that platforms are serving more and more exciting and extreme content to try to keep us there for longer, so we end up with people being radicalised on these platforms—possibly not intentionally by the platforms, but because their algorithm serves more and more extreme content.

I agree that that content should have the lower threshold in terms of the number of users. I am not sure about the threshold numbers themselves, but I think the Government have the differentiation correct, particularly on the addictive nature of algorithmic content. However, they are failing on incredibly high-risk content. The additional duties for category 1 services involve a number of different things: illegal content risk assessments, duties relating to terms of service, children’s risk assessments, adult empowerment duties and record-keeping duties. As I said, the fact that those category 1 platforms will be on a list is powerful in itself, but adding those additional duties is really important.

Let us say that somebody is undertaking a risky business—piercing, for example. Even though not many people get piercings in the grand scheme of things, the Government require piercing organisations to jump through additional hoops because they are involved in dangerous things that carry a risk of infection and other associated risks. They are required to meet hygiene regulations, register with environmental health and have checks of their records to ensure that they know who is being provided with piercings, because it is a risky thing. The Government are putting additional duties on them because they recognise that piercing is risky and potentially harmful.

However, the Government are choosing not to put additional duties on incredibly high-risk platforms. They are choosing not to do that. They have been given the right to do that. Parliament has made its will very clear: “We want the Government to take action over those small high-risk platforms.” I do not care how many hoops 4chan has to jump through. Give it as many hoops as possible; it is an incredibly harmful site, and there are many others out there—hon. Members mentioned suicide forums, for example. Make them jump through every single hoop. If we cannot ban them outright—which would be my preferred option—make them keep records, make them have adult-empowerment duties, and put them on a list of organisations that we, the Government or Ofcom reckon are harmful.

If, due to the failures of this Act, a young person commits suicide on a platform that has not been categorised properly, there is then a reduction in the protections, and in the information that the platform has to provide to the family about the deceased child, because it is not categorised as category 1 or 2B. We could end up in a situation where a young person dies as a result of being radicalised on a forum—because the Government decided it should not be in scope—but that platform does not even have to provide the deceased child’s family with access to that online usage. That is shocking, right? If the Government are not willing to take the proper action required, at least bring these platforms into the scope of the actions and requirements related to deceased children.

I appreciate that I have taken a significant length of time—although not nearly as long as the Online Safety Act has taken to pass, I hasten to say—but I am absolutely serious about the fact that I am really, really angry about this. This is endangering children. This is endangering young people. This is turning the Online Safety Act back into what some people suggested it should be at the beginning, an anti-Facebook and anti-Twitter Act, or a regulation of Facebook and Twitter—or X—Act, rather than something that genuinely creates what it says in section 1 of the Act: an online world that is “safe by design”.

This is not creating an online world that is safe by design; this is opening young people and vulnerable adults up to far more risks than it should. The Government are wilfully making this choice, and we are giving them the opportunity to undo this and to choose to make the right decision—the decision that Parliament has asked them to make—to include functionalities such as livestreaming, and to include those high-risk platforms that we know radicalise people and put them at a higher risk of death.

--- Later in debate ---
Feryal Clark Portrait Feryal Clark
- Hansard - - - Excerpts

I thank all Members for their very powerful contributions to the debate. This instrument will bring us one step closer to a safer online world for our citizens. It is clearer than ever that it is desperately needed: transparency, accountability and user empowerment matter now more than ever.

The Opposition spokesperson, the hon. Member for Huntingdon, asked whether we agree on the need for companies not to wait for the duties in the Act to be implemented, but to ensure that safety is baked in from the start. I absolutely agree, and he will be aware that the Secretary of State has made that point on many occasions. He also raised the issue of proportionality. I confirm that many of the duties on categorised services are subject to the principle of proportionality, which requires Ofcom to consider measures that are technically feasible for providers of a certain size or capacity, and in some cases duties are based on the assessment of the risk of harm presented by the service.

For example, in determining what is proportionate for the user empowerment duties on content for category 1 services, the findings of the most recent user empowerment assessments are relevant. They include the incidence of relevant content on the service in addition to the size and capacity of the provider. Where a code of practice is relevant to a duty, Ofcom must have regard to the principles on proportionality, and what is proportionate for one kind of service might not be for another.

The hon. Member for Huntingdon is absolutely right that the pornography review has been completed. The Government are reviewing that at the moment and will publish it in due course.

In response to the hon. Members for Newton Abbot and for Aberdeen North (Kirsty Blackman) and to the right hon. and learned Member for Kenilworth and Southam (Sir Jeremy Wright), when the Online Safety Act was introduced, category 1 thresholds were due to be assessed based on the level of risk and harm for adults—as the Members read out very clearly. That was removed during the passage of the Bill by the previous Government.

As things stand, although Baroness Morgan’s successful amendment made it possible for threshold conditions to be based solely on functionalities, it did not change the basis of Ofcom’s research, which for category 1 is easy, quick and wide dissemination of content. The Secretary of State had to consider that. I will repeat that for all Members to hear again: the Secretary of State has to act within the powers given to him in schedule 11 when setting out the threshold and conditions. The powers do not allow for thresholds to be determined by another body, as per the amendment.

Although the hon. Member for Aberdeen North very powerfully read out the Act, it very clearly does not do what she is asking it to do. We absolutely agree that small but risky sites need to be covered, but as it stands, the Secretary of State does not have the powers to include them.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Will the Minister give way?

Feryal Clark Portrait Feryal Clark
- Hansard - - - Excerpts

Sorry, I have lots of points to cover. If I have not covered the hon. Member’s concerns in my response, she is more than welcome to intervene later.

These small but risky services are of significant concern to the Government, and they will still have to protect against illegal content and, where relevant, content that is harmful to children. Ofcom also has a dedicated taskforce to go after them. I hope that answers the hon. Member’s question.

The hon. Member for Newton Abbot also raised the review of Ofcom’s approach. The regulator has already trialled an approach of targeting small but risky services through its regulation of video-sharing platforms. Indeed, a number of those services improved their policies and content moderation in response. All the adult platforms under the VSP regime, large and small, have implemented age verification through this route to ensure that under-18s cannot access pornography on their services. In instances where services fail to make necessary changes, they will face formal enforcement action from Ofcom. Ofcom has a proven track record and the Government have every faith in its ability to take action against non-compliant services.

The hon. Member also raised issues around how Ofcom will enforce action against small but risky services. Ofcom will have robust enforcement powers available to use against companies that fail to fulfil their duties and it will be able to issue enforcement decisions. Action can include fines of up to £18 million or 10% of qualifying worldwide revenue in the relevant year, whichever is higher, and Ofcom can direct companies to take specific steps to comply with its regulation.

--- Later in debate ---
Feryal Clark Portrait Feryal Clark
- Hansard - - - Excerpts

I am going to make some progress. On livestreaming, Ofcom considered that functionality, but concluded that the key functionalities that spread content easily, quickly and widely are content recommender systems and forwarding or resharing user-generated content.

Services accessed by children must still be safe by design, regardless of whether they are categorised. Small but risky services will also still be required to comply with illegal content duties. The hon. Member for Aberdeen North should be well aware of that as she raised concerns on that issue.

On child safety, there were questions about how online safety protects children from harmful content. The Act requires all services in scope to proactively remove and prevent users from being exposed to priority illegal content, such as illegal suicide content and child sexual exploitation and abuse material. That is already within the remit.

In addition, companies that are likely to be accessed by children will need to take steps to protect children from harmful content and behaviour on their services, including content that is legal but none the less presents a risk of harm to children. The Act designates content that promotes suicide or self-harm as in the category of primary priority content that is harmful to children. Parents and children will also be able to report pro-suicide or pro-self-harm content to the platform and the reporting mechanism will need to be easy to navigate for child users. On 8 May, Ofcom published its draft children’s safety codes of conduct, in which it proposed measures that companies should employ to protect children from suicide and self-harm content, as well as other content.

Finally, on why category 1 is not based on risk, such as the risk of hate speech, when the Act was introduced, category 1 thresholds were due to be assessed on the level of risk of harm to adults from priority content disseminated by means of that service. As I said earlier, that was removed during the Act’s passage by the then Government and replaced with consideration of the likely functionalities and how easily, quickly and widely user-generated content is disseminated, which is a significant change. Although the Government understand that that approach has its critics, who argue that the risk of harm is the most significant factor, that is the position under the Act.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

The Minister is making the case that the Secretary of State’s hands are tied by the Act—that it requires stuff in relation to the number of users. Can she tell us in which part of the Act it says that, because it does not say that? If she can tell us where it is in the Act, I am quite willing to sit down and shut up about this point, but it is not in the Act.

Feryal Clark Portrait Feryal Clark
- Hansard - - - Excerpts

The legislation allows the Secretary of State to deviate from Ofcom’s advice and to publish a statement explaining why. However, the core consideration for category 1 under schedule 11 is—I repeat for the third time—how easily, quickly and widely regulated user-generated content is disseminated by means of a service. As a result, for category 1, Ofcom concluded that the content is disseminated with increased breadth as the number of users increases.

The decision to proceed with the threshold combination recommended by Ofcom, rather than discounting user-number thresholds, reflects that any threshold condition created by the Government should consider the factors as set out in the Act, including easy, quick and wide dissemination for category 1, and the evidence base. That is what the Act says. As a result, the Government decided to not proceed with an approach that deviated from Ofcom’s recommendation, particularly considering the risk of unintended consequences.

I am more than happy to write to the hon. Member for Aberdeen North with the full details. I understand that she feels very passionately about this point, but the Act is the Act. Although I am grateful for her contribution, I have to follow what the Act says, based on the legal advice that I get.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

Thank you, Sir Christopher—I appreciate that prod. I did look at Standing Orders this morning, but could not find that bit, so that is incredibly helpful.

On what the Minister said about schedule 11 and the notes that she has been passed from her team on that point, I appreciate her commitment to share the Government’s legal advice. That will be incredibly helpful; it would have been helpful to have it in advance of this Committee.

In schedule 11, it says:

“In making regulations under sub-paragraph (1), the Secretary of State must take into account the likely impact of the number of users of the user-to-user part of the service, and its functionalities, on how easily, quickly and widely regulated user-generated content is disseminated by means of the service.”

Perhaps I cannot read English, or perhaps the Minister, her legal advisers and the team at DSIT read it in a different way from me, but the Secretary of State having to take something into account and the Secretary of State being bound by something are two different things—they are not the same. It does not say that the Secretary of State must regulate only on the specific number of users.

In fact, schedule 11 says earlier that the Secretary of State

“must make regulations specifying conditions…for the user-to-user part of regulated user-to-user services relating to each of the following”,

which are the

“number of users…functionalities of that part of the service, and…any other characteristics of that part of the service or factors”.

The Secretary of State must therefore make regulations in relation to any other characteristics of that part of the service or factors

“relating to that part of the service that the Secretary of State considers relevant.”

He must do that, whereas he need only take into account the number of users. The Government, however, have decided that the duty to take something into account outweighs what he actually must do. They have decided to do that despite Parliament being pretty clear in the language it has used.

I am not terribly happy with the Online Safety Act. It is a lot better than the situation we have currently, but it is far from perfect. As the Minister said, I argued in favour of keeping the stuff about legal but harmful content for adults. I argued against the then Government’s position on that, but the Act is the Act that we have.

The Minister’s point does not make sense. The Secretary of State has to take into account the number of users and how quickly things are disseminated, but he must make regulations about functionalities or factors that he considers relevant. Therefore, it seems that he does not consider suicide forums and livestreaming to be relevant; if he did, he would surely be bound by the “must” and would have to make regulations about them. It is frustrating that the Act does not do what it is supposed to do and does not protect young people from livestreaming. The Minister said that it protects people from seeing that illegal content, but it does not prevent them from creating it.

The Government could make regulations so that every platform that has a livestreaming functionality, or even every platform that has child users on it—there is a lot in the Act about the proportion of children who use a service—is automatically included in category 1 because they consider them to be high risk.

Jeremy Wright Portrait Sir Jeremy Wright
- Hansard - - - Excerpts

It would not be right for either of us to ask the Minister to disclose legal advice—that clearly would not be appropriate—but I am grateful for the Minister’s offer to share a slightly more expansive description of why the Government have come to the conclusion that they have.

On the hon. Lady’s point about what the Act actually says, we have both quoted paragraph 1(5) of schedule 11, which deals with whether the language that has found its way into the ministerial statement is the be-all and end-all of the Minister’s conclusions. We both think it is not. If it is the case, as I think the Minister is arguing, that the ability to disseminate “easily, quickly and widely” is essentially a synonym for the scale of the service and the number of its users, what does the hon. Lady think of the amendment that Baroness Morgan made in the other place to paragraph 1(4), which says that when the regulations we are considering specify

“the way or ways in which the relevant conditions are met”,

for category 1 threshold conditions

“at least one specified condition about number of users or functionality must be met”?

The crucial word that was added is “or”. If the number of users were required to establish what the hon. Lady has described, the word “or” would be inappropriate.

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I absolutely agree, and that is a helpful clarification.

If the Government have decided that it is too difficult to regulate high-risk platforms as category 1, and that they do not matter enough because they do not have enough of an impact, they should stand up and tell us that. Rather than saying that their hands have been tied by the Act—they manifestly have not—they need to take ownership of their actions. If they have decided that such platforms are not important enough or that they cannot be bothered having a fight with Ofcom about that, they should be honest and say, “This is the position we have decided to take.” Instead, they are standing up and saying, “Our hands have been tied,” but that is just not correct: their hands have not been tied by the Act.

I appreciate that the Minister will get in touch with me about the legal advice, but it will be too late. This statutory instrument will have been through the process by that time, and people will have been put at risk as a result of the Government’s failure. They have the power to take action in relation to functionalities and factors, and in relation to suicide forums, livestreaming and the creation of child sexual abuse material, and they are choosing not to.

If the Government have decided that it is too difficult to do that, that those platforms are not risky enough and that not enough people are being harmed by them, they need to hold their hands up and say, “We’ve decided that this is the position we are going to take.” They must not hide behind the legislation, which does not say what they are telling us it says. They should just be honest about the fact that they have decided that they cannot be bothered to take action. They cannot be bothered to have a fight with Ofcom because it is not important enough. Hiding behind the legislation is incredibly cowardly—it does not say that.

AstraZeneca

Kirsty Blackman Excerpts
Monday 3rd February 2025


Commons Chamber

Urgent Questions are proposed each morning by backbench MPs, and up to two may be selected each day by the Speaker. Chosen Urgent Questions are announced 30 minutes before Parliament sits each day.

Each Urgent Question requires a Government Minister to give a response on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record

Chris Bryant Portrait Chris Bryant
- View Speech - Hansard - - - Excerpts

Not only have we set aside £520 million precisely to be able to invest in the life sciences industry with an innovation fund, we are very keen to work with specific businesses to understand how they can make more secure, long-term investment. The single most important thing for most people making an investment in the UK is whether they believe there is political, fiscal and financial stability in the UK. That is what we are absolutely determined to deliver. My hon. Friend makes a very good point about those who are immunosuppressed for all sorts of different reasons, whether because of their medication or a condition. I will take that point back to the Department.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- View Speech - Hansard - -

The Chancellor said that economic growth is the most important thing and this was an opportunity to get some of that economic growth. This was an opportunity to get something over the line and the UK Government failed to deliver it. How can the House and the public trust anything the UK Government say? How can they say that this is the founding mission if they then fail to deliver for a region that could really do with that economic growth?

Chris Bryant Portrait Chris Bryant
- View Speech - Hansard - - - Excerpts

The thing is that spending taxpayers’ money has to be proven to be good value for money. That is why, whenever we are making an investment such as this, we have to make sure it delivers more return on investment than £1 for £1. When AstraZeneca made the decision to cut the R&D part of its budget from £150 million to £90 million, it made sense for the UK Government to look again at the amount of money we could legitimately put in on behalf of the taxpayer. If the hon. Lady had been in my place, I think she would have made exactly the same decision.

Online Safety: Children and Young People

Kirsty Blackman Excerpts
Tuesday 26th November 2024


Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.


Lola McEvoy Portrait Lola McEvoy
- Hansard - - - Excerpts

I thank my hon. Friend for raising the really important—indeed, deeply concerning—issue of the rise of anti-women hate, with the perpetrators marketing themselves as successful men.

What we are seeing is that boys look at such videos and do not agree with everything that is said, but little nuggets make sense to them. For me, it is about the relentless bombardment: if someone sees one video like that, they might think, “Oh right,” and not look at it properly, but they are relentlessly targeted by the same messaging over and over again.

That is true not just for misogynistic hate speech, but for body image material. Girls and boys are seeing unrealistic expectations of body image, which are often completely fake and contain fake messaging, but which make them reflect on their own bodies in a negative way, when they may not have had those thoughts before.

I want to drive home that being 14 years old is tough. I am really old now compared with being 14, but I can truly say to anybody who is aged 14 watching this: “It gets better!” It is hard to be a 14-year-old: they are exploring their body and exploring new challenges. Their hormones are going wild and their peers are going through exactly the same thing. It is tough, and school is tough. It is natural for children and young people to question their identity, their role in the world, their sexuality, or whatever it is they might be exploring—that is normal—but I am concerned that that bombardment of unhealthy, unregulated and toxic messaging at a crucial time, when teenagers’ brains are developing, is frankly leading to a crisis.

I return to an earlier point about whether the parts of apps or platforms that children are using are actually safe for them to use. There are different parts of apps that we all use—we may not all be tech-savvy, but we do use them—but when we drill into them and take a minute to ask, “Is this safe for children?”, the answer for me is, “No.”

There are features such as the live location functionality, which comes up a lot: when someone is using a maps app, for instance, it asks for their live location so they can see how to get from A to B. That is totally fine, but there are certain social media apps that children use that have their live location on permanently. They can toggle it to turn it off, but when I asked children in Darlington why they did not turn it off, they said there is a peer pressure to keep it on—it is seen as really uncool to turn it off. It is also about being able to see whether someone has read a message or not.

I then said to those children, “Okay, but those apps are safe because you only accept people you know,” and they said, “Oh no, I’ve got thousands and thousands of people on that app, and it takes me ages to remove each person, because I can’t remember if I know them, so I don’t do it.” They just leave their location on for thousands of people, many of whom may be void accounts, and they do not even know if they are active any more. The point is that we would not allow our children to go into a space where their location was shown to lots of strangers all the time. Those children who I spoke to also said that the live location feature on some of these apps is leading to in-person bullying and attacks. That is absolutely horrifying.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

On that point, is the hon. Member aware that if someone toggles their location off on Snapchat, for example, it constantly—in fact, every time the app is opened—says, “You’re on ghost mode. Do you want to turn your location back on?” So every single time someone opens the app, it tries to convince them to turn their location back on.

Lola McEvoy Portrait Lola McEvoy
- Hansard - - - Excerpts

I thank the hon. Member for raising that issue, because there are lots of different nudge notifications. We can understand why, because it is an unregulated space and the app is trying to get as much data as possible—if we are not paying for the service, we are the service. We all know that as adults, but the young people and children who we are talking about today do not know that their data is what makes them attractive to that app.

--- Later in debate ---
Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

I could talk for hours on this subject, Mr Dowd, but, do not worry, I will not. There are a number of things that I would like to say. Not many Members present sat through the majority of the Online Safety Bill Committee as it went through Parliament, but I was in every one of those meetings, listening to various views and debating online safety.

I will touch on one issue that the hon. Member for Darlington (Lola McEvoy) raised in her excellent and important speech. I agree with almost everything she said. Not many people in Parliament have her level of passion or knowledge about the subject, so I appreciate her bringing forward the debate.

On the issue of features, I totally agree with the hon. Member and I moved an amendment to that effect during the Bill’s progress. There should be restrictions on the features that children should be able to access. She was talking about safety by design, so that children do not have to see content that they cannot unsee, do not have to experience the issues that they cannot un-experience, cannot be contacted by external people who they do not know, and cannot livestream. We have seen an increase in the amount of self-generated child sexual abuse material and livestreaming is a massive proportion of that.

Yesterday, a local organisation in Aberdeen called CyberSafe Scotland launched a report on its work in 10 of our primary schools with 1,300 children aged between 10 and 12—primary school children, not secondary school children. Some 300 of those children wrote what is called a “name it”, where they named a problem that they had seen online. Last night, we were able to read some of the issues that they had raised. Pervasive misogyny is everywhere online, and it is normalised. It is not just in some of the videos that they see and it is not just about the Andrew Tates of this world—it is absolutely everywhere. A couple of years ago there was a trend in online videos of young men asking girls to behave like slaves, and that was all over the place.

Children are seeing a different online world from the one that we experience because they have different algorithms and have different things pushed at them. They are playing Roblox and Fortnite, but most of us are not playing those games. I am still concerned that the Online Safety Act does not adequately cover all of the online gaming world, which is where children are spending a significant proportion of their time online.

A huge amount more needs to be done to ensure that children are safe online. There is not enough in place about reviewing the online safety legislation, which Members on both sides of the House pushed for to ensure that the legislation is kept as up to date as possible. The online world changes very rapidly: the scams that were happening nine months ago are totally different from those happening today. I am still concerned that the Act focuses too much on the regulation of Facebook, for example, rather than the regulation of the online world that our children actually experience. CyberSafe Scotland intentionally centred the views and rights of young people in its work, which meant that the programmes that it delivered in schools were much more appropriate and children were much better able to listen and react to them.

The last thing that I will mention is Girlguiding and its girls’ attitude survey. It is published on an annual basis and shows a huge increase in the number of girls who feel unsafe. That is because of the online world they are experiencing. We have a huge amount of responsibility here, and I appreciate the hon. Member for Darlington bringing the debate forward today.

Peter Dowd Portrait Peter Dowd (in the Chair)
- Hansard - - - Excerpts

I will keep this to an informal four-minute limit. Regrettably, if Members speak beyond that, I will have to introduce a formal figure.

AI Seoul Summit

Kirsty Blackman Excerpts
Thursday 23rd May 2024


Commons Chamber
Saqib Bhatti Portrait Saqib Bhatti
- View Speech - Hansard - - - Excerpts

I completely agree with my right hon. Friend. We recognise the risks and opportunities that AI presents. That is why we have tried to balance safety and innovation. I refer him to the Online Safety Act 2023, which is a technology agnostic piece of legislation. AI is covered by a range of spheres where the Act looks at illegal harms, so to speak. He is right to say that this is about helping humanity to move forward. It is absolutely right that we should be conscious of the risks, but I am also keen to support our start-ups, our innovative companies and our exciting tech economy to do what they do best and move society forward. That is why we have taken this pro-safety, pro-innovation approach; I repeat that safety in this field is an enabler of growth.

Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- View Speech - Hansard - -

I would like to thank Sir Roger Gale, who has just left the Chair. He has been excellent in the Chair and I have very much enjoyed his company as well as his chairing.

I thank the Government for advance sight of the statement. My constituents and people across these islands are concerned about the increasing use of AI, not least because of the lack of regulation in place around it. I have specific questions in relation to the declarations and what is potentially coming down the line with regulation.

Who will own the data that is gathered? Who has responsibility for ensuring its safety? What is the Minister doing to ensure that regard is given to copyright and that intellectual property is protected for those people who have spent their time and energy and massive talents in creating information, research and artwork? What are the impacts of the use of AI on climate change? For example, it has been made clear that using this technology has an impact on the climate because of the vast amounts of electricity that it uses. Are the Government considering that?

Will the Minister ensure that in any regulations that come forward there is a specific mention of AI harms for women and girls, particularly when it comes to deepfakes, and that they and other groups protected by the Equality Act 2010 are explicitly mentioned in any regulations or laws that come forward around AI? Lastly, we waited 30 years for an Online Safety Act. It took a very long time for us to get to the point of having regulation for online safety. Can the Minister make a commitment today that we will not have to wait so long for regulations, rather than declarations, in relation to AI?

Saqib Bhatti Portrait Saqib Bhatti
- View Speech - Hansard - - - Excerpts

The hon. Lady makes some interesting points. The thing about AI is not just the large language models, but the speed and power of the computer systems and the processing power behind them. She talks about climate change and other significant issues we face as humanity; that power to compute will be hugely important in predicting how climate change evolves and weather systems change. I am confident that AI will play a huge part in that.

AI does not recognise borders. That is why the international collaboration and these summits are so important. In Bletchley we had 28 countries, plus the European Union, sign the declaration. We had really good attendance at the Seoul summit as well, with some really world-leading declarations that will absolutely be important.

I refer the hon. Lady to my earlier comments around copyright. I recognise the issue is important because it is core to building trust in AI, and we will look at that. She will understand that I will not be making a commitment at the Dispatch Box today, for a number of reasons, but I am confident that we will get there. That is why our approach in the White Paper response has been well received by the tech industry and the AI sector.

The hon. Lady started with a point about how constituents across the United Kingdom are worried about AI. That is why we all have to show leadership and reassure people that we are making advances on AI and doing it safely. That is why our AI Safety Institute was so important, and why the network of AI safety institutes that we have helped to advise on and worked with other countries on will be so important. In different countries there will be nuances regarding large language models and different things that they will be approaching—and sheer capability will be a huge factor.

Smartphones and Social Media: Children

Kirsty Blackman Excerpts
Tuesday 14th May 2024

(1 year, 9 months ago)

Westminster Hall


Kirsty Blackman Portrait Kirsty Blackman (Aberdeen North) (SNP)
- Hansard - -

Thank you for chairing this debate, Sir George. I congratulate the hon. Member for Penistone and Stocksbridge (Miriam Cates) on securing it. I want to talk about a number of things: safety online by design, the safety of devices by design, and parental and child education. Just to confuse everyone, I will do that in reverse order, starting off with parental and child education.

Ofcom has a media literacy strategy consultation on the go just now, as well as the consultation on the strategy around protecting children. Both are incredibly important. We have a massive parental knowledge gap. In about 15 or 20 years, this will not be nearly so much of a problem, because parents then will have grown up online. I am from the first generation of parents who grew up online. My kids are 10 and 13. I got the internet at home when I was six or seven, although not in the way that my kids did. Hardly anybody in this House grew up on the internet, and hardly any of the parents of my children’s peers grew up online.

I know very well the dangers there are online, and I am able to talk to my children about them. I have the privilege, the ability and the time to ensure that I know everything about everything they are doing online—whether that means knowing the ins and outs of how Fortnite works, or how any of the games they are playing online work, I am lucky enough to be able to do that. Some parents have neither the time nor the energy nor the capacity to do that.

Miriam Cates Portrait Miriam Cates
- Hansard - - - Excerpts

I commend the hon. Lady for her knowledge and dedication, but is it not the case that even parents as diligent as her find that teenagers can bypass these controls? Even if our children do not have access to a device, they can easily be shown the most harmful of material on the school bus. Is this not actually about child development, and whether a child has the brain development to be able to use these devices safely, rather than just about education?

Kirsty Blackman Portrait Kirsty Blackman
- Hansard - -

I wanted to talk about education among a number of other things. Children can absolutely be shown things on the bus, and stuff like that; children and young people will do what they can to subvert their parents’ authority in all sorts of ways, not just when it comes to mobile phones. Part of the point I was making is that I have the privilege of being able to take those actions, while parents who are working two jobs and are really busy dealing with many other children, for example, may not have the time to do so. We cannot put it all on to parental education, but we cannot put it all on to the education of the children, either. We know that however much information we give children or young people—particularly teenagers—they are still going to make really stupid decisions a lot of the time. I know I made plenty of stupid decisions as a teenager, and I am fairly sure that my children will do exactly the same.

I grew up using message boards, which have now been taken over by Reddit, and MSN Messenger, while kids now use Facebook Messenger or WhatsApp. I grew up using internet relay chat—IRC—and Yahoo! Chat, which have now been taken over by Discord, and playing Counter-Strike, which has now been subsumed by Fortnite. I used Myspace and Bebo, while kids now use things like Instagram. These things have been around for a very long time. We have needed an online safety Act for more than 20 years. When I was using these things in the ’90s, I was subject to the same issues that my children and other children are seeing today. Just because it was not so widespread does not mean it was not happening, because it absolutely was.

The issue with the Online Safety Act is that it came far too late—I am glad that we have it, but it should have been here 20 years ago. It also does not go far enough; it does not cover all the things that we need it to cover. During the passage of the Act, we talked at length about things like livestreaming, and how children should not be allowed to livestream under any circumstances. We could have just banned children from livestreaming and said that all platforms should not allow children to livestream because of the massive increase in the number of self-generated child sexual abuse images, but the Government chose not to do that.

We have to have safety by design in these apps. We have to ensure that Ofcom is given the powers—which, even with the Online Safety Act, it does not have—to stop platforms allowing these things to happen and effectively ban children from accessing them. Effective age assurance would address some of the problems that the hon. Member for Penistone and Stocksbridge raises. Of course, children will absolutely still try to go around these things, but having that age assurance and age gating, as far as we possibly can—for example, the stuff that Ofcom is doing around pornographic content—will mean that children will not be able to access that content. I do not see that there should be any way for any child to access pornographic content once the Online Safety Act fully comes in, and once Ofcom has the full powers and ability to do that.

The other issue with the Online Safety Act is that it is too slow. There are a lot of consultation procedures and lead-in times. It should have come in far quicker, and then we would have had this protection earlier for our children and young people.

We need to have the safety of devices by design. I am slightly concerned about the number of children who are not lucky enough to get a brand-new phone; the right hon. Member for Chelmsford (Vicky Ford) talked about passing on a phone to a child. Covering that is essential if we are to have safety of devices by design. Online app stores are not covered as effectively as they should be, particularly when it comes to age ratings. I spoke to representatives of an online dating app, who said that they want their app to be 18-plus, but one of the stores has rated it as 16-plus; they keep asking the store to change it, and the store keeps refusing. It is totally ridiculous that we are in that situation. The regulation of app stores is really important, especially when parents will use the app store’s age rating; they will assume that the rating put forward by the app store is roughly correct. We need to make changes in that respect and we need to come down on the app stores, because they are so incredibly powerful. That is a real moment when parents, if they have parental controls, have that ability to make the changes.

In relation to safety online by design, I have already spoken about live streaming. When it comes to gaming, it is entirely possible for children to play online games without using chat functions. Lots of online games do not actually have any chat function at all. Children can play Minecraft without any chat; they cannot play Roblox without having effective access to chat. Parents need to understand the difference between Minecraft and Roblox—and not allow anyone to play Roblox, because it is awful.

There are decisions that need to be taken in relation to safety online by design. If people have effective age verification and an effective understanding of the audience for each of these apps and online settings, they can ensure that the rules are in place. I am not convinced yet that Ofcom has enough powers to say what is and what is not safe for children online. I am not convinced that even with the Online Safety Act, there is the flexibility for it to say, “Right—if you have done your child access assessment and you think that your app is likely to be used by children, you cannot have live streaming on the app.” I am not convinced that it has enough teeth to be able to take that action. It does when it comes to illegal content, but when it comes to things that are harmful for children but legal for adults, there is not quite enough strength for the regulator.

I will keep doing what I have been doing in this House, which is saying that the online world can be brilliant—it can be great. Kids can have a brilliant time playing online. They can speak to their friends; particularly if children are isolated or lonely, there are places where they can find fellowship and other people who are feeling the same way. That can be a positive thing. The hon. Member for Penistone and Stocksbridge has laid out where often the online world is negative, but it can be positive too. There are so many benefits in terms of schoolwork, clubs, accessing friends, and calendars. Cameras are great, too. My children sometimes use the bird app on their phones to work out which birds are singing. It is brilliant that they can do things like that online.

There are so many benefits, but we have a responsibility, just as we do when our children are playing in the park, to ensure that they are safe. We have a responsibility as legislators to ensure that the regulators have enough teeth to make sure that the online world is safe, so that children can get the benefits of the online world and of using smartphones but are not subject to the extremely negative outcomes. My hon. Friend the Member for Stirling (Alyn Smith) mentioned his constituent and the awful loss experienced by their family. Children should never, ever have to face that situation online, and we have a responsibility to regulate to ensure that they never have to.