Online Safety Bill Debate
Commons Chamber
Kirsty Blackman (Aberdeen North) (SNP)
I congratulate the hon. Member for Gosport (Dame Caroline Dinenage) on what was one of the best speeches we have heard on this Bill—and we have heard quite a lot of them. It was excellent and very thoughtful. I will speak to a number of amendments. I will not cover the Labour amendments in any detail because, as ever, the Labour Front Benchers did an excellent job of that. The right hon. Member for Barking (Dame Margaret Hodge) nicely covered the amendment on liability, and brought up the issue of hate, particularly when pointed towards the Jewish community. I thank her for consistently raising that. It is important to hear her voice and the voices of others on this issue.
Amendment 43, tabled by me and my hon. Friend the Member for Ochil and South Perthshire (John Nicolson), concerns a default toggle for material that we all agree is unsafe or harmful. The Labour party has said that it agrees with the amendment, and the SNP believes that the safest option should be the default. We should start from the position that if anybody wants to see eating disorder content, or racist or incredibly harmful content that does not meet the bar of illegality, they should have to opt in to receive it. They should not see it by default; they should have to make that choice to see such content.
Freedom of speech is written into the Bill. People can say whatever they want as long as it is below that bar of illegality, but we should not have to read it. We should not have to read abuse that is pointed toward minority groups. We should start from the position of having the safest option on. We are trying to improve the permissive approach that the Government have arrived at, and this simple change is not controversial. It would require users to flip a switch if they want to opt in to some of the worst and most dangerous content available online, including pro-suicide, pro-anorexia or pro-bulimia content, rather than leaving that switch on by default.
If the Government want the terms and conditions to be the place where things are excluded or included, I think platforms should have to say, “We are happy to have pro-bulimia or pro-anorexia content.” They should have to make that clear and explicit in their terms of service, rather than having to say, “We do not allow x, y and z.” They should have to be clear, up front and honest with people, because then people would know what they are signing up to when they sign up to a website.
Amendment 44 concerns habit-forming features, and we have not spoken enough about the habit-forming nature of social media in particular. Sites such as TikTok, Instagram and Facebook are set up to encourage people to spend time on them; they make their money by keeping people on the platform for as long as possible. We know that 42% of respondents to a survey by YoungMinds reported displaying signs of addiction-like behaviour when questioned about their social media habits. Young people are worried about that, and they do not necessarily have the tools to avoid it. We therefore tabled amendment 44 to take that into account, and to require platforms to consider this important issue.
New clause 3, on child user empowerment, was mentioned earlier. There is a bizarre loophole in the Bill requiring user empowerment toggles for adults but not for children. It is really odd not to require them for children when we know that they will be able to see some of this content and access features that are much more inherently dangerous to them than to adults. That is why we tabled amendments on private messaging features and live streaming features.
Live streaming is a place where self-generated child sexual abuse has shot through the roof. With child user empowerment, children would have to opt in, and they would have empowerment tools to allow them opportunities to say, “No, I don’t want to be involved in live streaming,” or to allow their parents to say, “No, I don’t want my child to be able to do live streaming when they sign up to Instagram. I don’t want them able to share live photos and to speak to people they don’t know.” Amendment 46, on private messaging features, would allow children to say, “No, I don’t want to get any private messages from anyone I don’t know.” That is not written into terms of service or in the Bill as a potentially harmful thing, but children should be able to exclude themselves from having such conversations.
We have been talking about the relationship between real life and the online world. If a child is playing in a play park and some stranger comes up and talks to them, the child is perfectly within their rights to say, “No, I’m not speaking to strangers. My parents have told me that, and it is a good idea not to speak to strangers,” but they cannot do that in the online world. We are asking for that to be taken into account and for platforms to allow private messaging and live streaming features to be switched off for certain groups of people. If they were switched off for children under 13, that would make Roblox, for example, a far safer place than it currently is.
I turn to amendment 84, on conversion therapy. I am glad that the amendment was tabled and that there are moves by the UK Government to bring forward the conversion therapy ban. As far as I am aware—I have been in the Chamber all day—we have not yet seen that legislation, but I am told that it will be coming. I pay tribute to all those who have worked really hard to get us to the position where the Government have agreed to bring forward a Bill. They are to be commended on that. I am sorry that it has taken this long, but I am glad that we are in that position. The amendment was massively helpful in that regard.
Lastly, I turn to amendment 50, on the risk of harm. One of the biggest remaining issues with the Bill is the categorisation of platforms, which is done on the basis of their size and the risk of their features. The size of the platform—the number of users on it—is the key factor, but that fails to take into account very small and incredibly harmful platforms. The amendment would give Ofcom the power to categorise platforms that are incredibly harmful—incel forums, for example, and Kiwi Farms, set up entirely to dox trans people and put their lives at risk—as category 1 platforms, and to require them to meet all the rules and risk assessments that apply to those platforms.
We should be asking those platforms to answer for what they are doing, no matter how few members they have or how small their user base. One person being radicalised on such a platform is one person too many. Amendment 50 is not an extreme amendment saying that we should ban all those platforms, although we probably should. It would ask Ofcom to have a higher bar for them and require them to do more.
I cannot believe that we are here again and that the Bill has taken so long to get to this point. I agree that the Bill is far from perfect, but it is better than nothing. The SNP will therefore not be voting against its Third Reading, because it is marginally better than the situation that we have right now.
Jeremy Wright (Kenilworth and Southam) (Con)
I want to say in passing that I support amendments 52 and 53, which stand in the name of my hon. Friend the Member for Stroud (Siobhan Baillie) and others. She will explain them fully, so I do not need to, but they seem to be sensible clarifications that I hope the Government will consider favourably.
Sir William Cash (Stone) (Con)
I thought I might mention to my right hon. and learned Friend that the written ministerial statement, which is now available to the public, makes it clear that useful and constructive discussions have taken place. Much of what he is saying is not necessarily applicable to the state of affairs we are now faced with.
Jeremy Wright
I am grateful to my hon. Friend, and I will come on to the written statement. I accept what he says. I think we are heading in the right direction, but since new clause 2 is before us at the moment, it seemed to me that I ought to address it, I hope in a helpful way.
There is nothing in the language of new clause 2 as it stands that requires a breach of the duties to be serious or even more than minimal. We should be more discriminating than that.
The second difficulty with new clause 2, which I hope the Government will pick up when they look at it again, concerns successfully prosecuting the sorts of offences we may create. The more substantive and fundamental child safety duties in clause 11, which are to
“mitigate and manage the risks of harm”
and to prevent children encountering harmful content, are expressed in terms of the use of “proportionate measures” or “proportionate systems and processes”. The word “proportionate” is important and describes the need for balanced judgments to be made, including by taking into account freedom of expression and privacy, as required by clause 11 itself. Aside from the challenges of obtaining evidence of what individual managers did or did not know, say or do, it could be very difficult for a prosecutor to assess those balanced judgments and to demonstrate to a criminal court, to the required standard of proof, that they were deliberately or negligently wrong.
The consequence of that difficulty could be either that it becomes apparent that the cases are very hard to prosecute, and therefore criminal liability is not the deterrent we hoped for, or that wide criminal liability causes the sort of risk aversion and excessive take-down of material that I know worries my hon. Friend the Member for Stone (Sir William Cash) and others who support new clause 2. We therefore need to calibrate criminal liability appropriately.
It is also worth saying that if we are to pursue an extension of criminal liability, I am not sure that I see the logic of limiting that further criminal liability only to breaches of the child safety duties; I can envisage some breaches of safety duties in relation to illegal content that may also be deserving of such liability.
That leads me on to consider, as has been said, exactly how we might extend criminal liability differently. I appreciate that the Government will now be doing just that. Perhaps they can consider doing so in relation to serious or persistent breaches of the safety duties, rather than in relation to all breaches of safety duties.
Alternatively, or additionally, they could look at individual criminal liability for a failure to comply with a confirmed notice of contravention from Ofcom. I welcome the direction of travel set out in the written ministerial statement, which suggests that that is where the Government may go. As the statement says, the recent Irish legislation that has been prayed in aid does something very similar, and it is an approach with several advantages: it is easier to prove, we will know whether Ofcom has issued a notice requiring action to remedy a deficient approach to the safety duties, and we will know whether Ofcom believes that it has not been responded to adequately.
As we design a new system of regulation in this new era of regulation, we should want open conversations to take place between the regulator and the regulated as to how best to counter harms. Anything that discourages platforms and their directors from doing so may make the system we are designing work less well in promoting safety online. The approach that I think the Government will now consider is unlikely to do that.
Let me say one final thing. As my hon. Friend the Member for Gosport (Dame Caroline Dinenage) said, I have been involved in the progress of this Bill almost from the start, and I am delighted to see present my right hon. Friend the Member for Maidenhead (Mrs May), at whose instruction I started doing it. It has been tortuous progress, no doubt—to some extent that was inevitable because of the difficulty of the Bill and the territory in which we seek to legislate—but the hon. Member for Aberdeen North (Kirsty Blackman), who speaks for the SNP and for whom I have a good deal of respect, was probably a little grudging in suggesting that as it stands the Bill does only slightly better than the status quo. It does a lot more than that.
If we send the Bill to the other place this evening, as I hope we do, and if the other place considers it again with some thoroughness and seeks to improve it further, as I know it will, we will make the internet not a safe place—I do not believe that is achievable—but a significantly safer place. If we can do that, it will be the most important thing that most of us in this place have ever done.