Online Safety Bill Debate
Commons Chamber
I call the SNP spokesperson, Kirsty Blackman.
I congratulate the hon. Member for Gosport (Dame Caroline Dinenage) on one of the best speeches we have heard on this Bill, and we have heard quite a lot of them. It was excellent and very thoughtful. I will speak to a number of amendments. I will not cover the Labour amendments in any detail because, as ever, the Labour Front Benchers did an excellent job of that. The right hon. Member for Barking (Dame Margaret Hodge) covered the amendment on liability nicely, and raised the issue of hate, particularly hate directed towards the Jewish community. I thank her for consistently raising it; it is important to hear her voice, and others, on this issue.
Amendment 43, tabled by me and my hon. Friend the Member for Ochil and South Perthshire (John Nicolson), concerns a default toggle for material that we all agree is unsafe or harmful. The Labour party has said that it agrees with the amendment, and the SNP believes that the safest option should be the default. We should start from the position that if anybody wants to see eating disorder content, or racist or incredibly harmful content that does not meet the bar of illegality, they should have to opt in to receive it. They should not see it by default; they should have to make an active choice to see such content.
Freedom of speech is written into the Bill. People can say whatever they want as long as it is below that bar of illegality, but we should not have to read it. We should not have to read abuse that is pointed toward minority groups. We should start from the position of having the safest option on. We are trying to improve the permissive approach that the Government have arrived at, and this simple change is not controversial. It would require users to flip a switch if they want to opt in to some of the worst and most dangerous content available online, including pro-suicide, pro-anorexia or pro-bulimia content, rather than leaving that switch on by default.
If the Government want the terms and conditions to be the place where things are excluded or included, I think platforms should have to say, “We are happy to have pro-bulimia or pro-anorexia content.” They should have to make that clear and explicit in their terms of service, rather than having to say, “We do not allow x, y and z.” They should have to be clear, up front and honest with people, because then people would know what they are signing up to when they sign up to a website.
Amendment 44 is on habit-forming features, and we have not spoken enough about the habit-forming nature of social media in particular. Sites such as TikTok, Instagram and Facebook are designed to encourage people to spend as much time on them as possible, because that is how they make their money. We know that 42% of respondents to a survey by YoungMinds reported displaying signs of addiction-like behaviour when questioned about their social media habits. Young people are worried about that, and they do not necessarily have the tools to avoid it. We therefore tabled amendment 44 to take that into account and to require platforms to consider this important issue.
New clause 3, on child user empowerment, was mentioned earlier. There is a bizarre loophole in the Bill: it requires user empowerment toggles for adults but not for children. It is really odd not to require them for children, when we know that children will be able to see some of this content and to access features that are much more inherently dangerous to them than to adults. That is why we tabled amendments on private messaging features and live streaming features.
Live streaming is an area in which self-generated child sexual abuse has shot through the roof. With child user empowerment, children would have to opt in to such features, and they would have empowerment tools allowing them to say, “No, I don’t want to be involved in live streaming,” and allowing their parents to say, “No, I don’t want my child to be able to live stream when they sign up to Instagram. I don’t want them to be able to share live photos and to speak to people they don’t know.” Amendment 46, on private messaging features, would allow children to say, “No, I don’t want to get any private messages from anyone I don’t know.” That is not written into terms of service, or into the Bill, as a potentially harmful thing, but children should be able to exclude themselves from having such conversations.
We have been talking about the relationship between real life and the online world. If a child is playing in a play park and some stranger comes up and talks to them, the child is perfectly within their rights to say, “No, I’m not speaking to strangers. My parents have told me that, and it is a good idea not to speak to strangers,” but they cannot do that in the online world. We are asking for that to be taken into account and for platforms to allow private messaging and live streaming features to be switched off for certain groups of people. If they were switched off for children under 13, that would make Roblox, for example, a far safer place than it currently is.
I turn to amendment 84, on conversion therapy. I am glad that the amendment was tabled and that the UK Government are moving to bring forward the conversion therapy ban. As far as I am aware (I have been in the Chamber all day), we have not yet seen that legislation, but I am told that it is coming. I pay tribute to all those who have worked really hard to get us to the position where the Government have agreed to bring forward a Bill; they are to be commended on that. I am sorry that it has taken this long, but I am glad that we are in that position, and the amendment was massively helpful in getting us there.
Lastly, I turn to amendment 50, on the risk of harm. One of the biggest remaining issues with the Bill is the categorisation of platforms, which is done on the basis of their size and the risk of their features. The size of the platform, meaning the number of users on it, is the key factor, but that fails to take into account very small and incredibly harmful platforms. The amendment would give Ofcom the power to categorise incredibly harmful platforms, such as incel forums and Kiwi Farms, which was set up entirely to dox trans people and put their lives at risk, as category 1 platforms, and to require them to meet all the rules and risk assessments that apply to that category.
We should be asking those platforms to answer for what they are doing, no matter how few members they have or how small their user base. One person being radicalised on such a platform is one person too many. Amendment 50 is not an extreme amendment saying that we should ban all those platforms, although we probably should. It would ask Ofcom to have a higher bar for them and require them to do more.
I cannot believe that we are here again and that the Bill has taken so long to get to this point. I agree that the Bill is far from perfect, but it is better than nothing. The SNP will therefore not be voting against its Third Reading, because it is marginally better than the situation that we have right now.
I want to say in passing that I support amendments 52 and 53, which stand in the name of my hon. Friend the Member for Stroud (Siobhan Baillie) and others. She will explain them fully, so I do not need to, but they seem to be sensible clarifications that I hope the Government will consider favourably.
On that specific point, does the hon. Lady realise that the empowerment duties in respect of verified and non-verified users apply only to adult users? Children will not have the option to toggle off unverified users, because the user empowerment duties do not allow that to happen.
The evidence we have received is that it is parents who need the powers. I want to normalise the ability to turn off anonymised accounts, and I think we will see children do that very naturally. We should also try to persuade their parents to take those stances and to have those conversations in the home. I will obviously need to take up the matter with the hon. Lady and think carefully about it as the Bill proceeds through the other place.
We know that parents are very scared about what their children see online. I welcome what the Minister is trying to do with the Bill, and I welcome the openness to changing it. These days, we are all called rebels whenever we do anything to improve legislation, but the reality is that that is our job. We are sending this legislation to the other House in a better shape.
It has taken a while to get to this point; there have been hours and hours of scrutiny and so much time has been spent by campaigners and external organisations. I have received more correspondence on this Bill from people who really know what they are talking about than on any other I have worked on during my time in the House. I specifically thank the NSPCC and the Mental Health Foundation, which have provided me with a lot of information and advice about the amendments that we have tabled.
The internet is wonderful, exciting and incredibly useful, but it is also harmful, damaging and scary. The Bill is about trying to make it less harmful, damaging and scary while allowing people to still experience the wonderful, exciting and useful parts of it. The SNP will not vote against the Bill on Third Reading, but it would be remiss of me not to mention the issues that we still have with it.
I am concerned that the Government keep saying “the Children’s Commissioner” when there are a number of Children’s Commissioners, and it is the Children’s Commissioner for England who has been added as a consultee, not the others. That is the decision they have made, but they need to be clear when they talk about it.
On protecting children, I am still concerned that in places the Bill is a little too social media-centric and does not necessarily take into account some of the ways in which children generally interact with the internet, such as talking to their friends on Fortnite, talking to people they do not know on Fortnite, and talking to people on Roblox. Things that are not social media, and that work differently, are not covered as well as I would like. I am also concerned that children have less ability to opt out of risky features altogether, such as by switching off private messaging and live streaming, than they have to switch off particular types of content.
Lastly, on the changes that have been made, I do not know what people want to say that they felt they could not say under the previous version of the Bill. I do not know why the Government feel it is okay to say, “We are concerned about ‘legal but harmful’ because we want people to be able to promote eating disorder content or self-harm content.” I am sure they do not; I am sure no Minister wants people to be able to promote that. So why have they made this change? Not one person has been able to tell me what they believe they could not say under the previous iteration of the Bill. Ministers can say “free speech” as much as they like, but it means nothing if they cannot provide a single example of something that somebody should be able to say but could not say under the previous iteration.
I am glad that we have a Bill and I am glad to hear that a future Labour Government might change the legislation to make it better. I hope this will provide a safer environment for children online, and I hope we can get the Bill implemented as soon as it is through the Lords.