
Thu 6th Jul 2023
Online Safety Bill
Lords Chamber

Report stage: Part 2


Debate between Lord Bishop of Oxford and Lord Allan of Hallam
The Lord Bishop of Oxford

My Lords, I too welcome these amendments and thank the Minister and the Government for tabling them. The Bill will be significantly strengthened by Amendment 172 and the related amendments, which put the harms, so clearly described, on the face of the Bill. I identify with the comments of others that we also need to look at functionality, and I hope we will do that in the coming days.

I also support Amendment 174, to which I added my name. Others have covered proposed new subsection (9B) very well; I add my voice to those encouraging the Minister to give it more careful consideration. I will also speak briefly to proposed new subsection (9A), on misinformation and disinformation content. With respect to those who have spoken against it and argued that these are political terms, I argue that they are fundamentally ethical terms. For me, the principle of ethics in the online world is not the invention of new ethics but finding ways to acknowledge and support online the ethics we acknowledge in the offline world.

Truth is a fundamental ethic. Truth builds trust. It made it into the Ten Commandments:

“You shall not bear false witness against your neighbour”.


It is that ethic that would be translated across in proposed new subsection (9A). One of the lenses through which I have viewed the Bill throughout is the lens of my eight grandchildren, the oldest of whom is eight years old and who is already using the internet. Proposed new subsection (9A) is important to him because, at eight years old, he has very limited ways of checking out what he reads online—fewer even than a teenager. He stands to be fundamentally misled in a variety of ways if there is no regulation of misinformation and disinformation.

Also, the internet, as we need to keep reminding ourselves in all these debates, is a source of great potential good and benefit, but only if children grow up able to trust what they read there. If they can trust the web’s content, they will be able to expand their horizons, see things from the perspective of others and delve into huge realms of knowledge that are otherwise inaccessible. But if children grow up necessarily imbued with cynicism about everything they read online, those benefits will not accrue to them.

Misinformation and disinformation content is therefore harmful to the potential of children across the United Kingdom and elsewhere. We need to guard against it in the Bill.

Lord Allan of Hallam (LD)

My Lords, Amendment 172 is exceptionally helpful in putting the priority harms for children on the face of the Bill. It is something that we have asked for; I know the pre-legislative scrutiny committee asked for it, and it is good to see it there. I want to comment to make sure that we all, and people out there, have a shared understanding of what this means.

My understanding is that “primary priority” is, in effect, a red light—platforms must not expose children to that content if they are under 18—while “priority” is rather an amber light: on further review, for some children it will be a red light and for other children it will be a green light, and they can see that content. I am commenting partly having had the experience of explaining all this to my domestic focus group of teenagers, who said, “Really? Are you going to get rid of all this stuff for us?” I said, “No, actually, it is quite different”. It is important to make that distinction clear in our debate because otherwise there is a risk that the Bill comes into disrepute. I look at something like depicting harm to fictional characters. If one has seen the “Twilight” movies, the werewolves do not come off too well, and “Lord of the Rings” is like an orc kill-fest.

As regards the point made by the noble Baroness, Lady Harding, about going to the cinema, we allow older teenagers to go to the cinema and see that kind of thing, and, post the Online Safety Bill, they will still be able to access it. When we look at something like fictional characters, the Bill is there to deal with a harm that is real and acknowledged: people pushing quite vile stuff, whereby characters have been taken out of fiction and a gory image has been created, twisted and pushed to a younger child. That is what we want online providers to do—prevent an 11 year-old from seeing that—not stop a 16 year-old enjoying the slaughter of werewolves. We need to be clear that that is what we are doing with the priority harms; we are not going further than people think we are.

There are also some interesting challenges around humour and evolving trends. This area will be hard for platforms to deal with. I raised the issue of the Tide pod challenge in Committee. If noble Lords are not familiar, it is the idea that one eats the tablets, the detergent things, that one puts into washing machines. It happened some time ago. It was a real harm, and that is reflected here in the “do not ingest” provisions. That makes sense but, again talking to my focus group, the Tide pod challenge has evolved and, for older teenagers, it is now a joke about someone being stupid. It has become a meme. One could genuinely say that it is no longer the harmful thing that it was. Quite often one sees something on the internet that starts out harmful—because kids are eating Tide pods and getting sick—and then over time it becomes a humorous meme; at that point, it has ceased to be harmful. I read it as that filter always being applied. We are not saying, “Always remove every reference to Tide pods” but “At a time when there is evidence that it is causing harm, remove it”. If at a later stage it ceases to be harmful, it may well move into a category where platforms can permit it. It is a genuine concern.

To our freedom of expression colleagues, I say that we do not want mainstream platforms to be so repressive of ordinary banter by teenagers that they leave those regulated mainstream platforms because they cannot speak any more, even when the speech is not harmful, and go somewhere else that is unregulated—one of those platforms that took Ofcom’s letter, screwed it up and threw it in the bin. We do not want that to be an effect of the Bill. Implementation has to be very sensitive to common trends and, importantly, as I know the noble Baroness, Lady Kidron, agrees, has to treat 15, 16 and 17 year-olds very differently from 10, 11 or 12 year-olds. That will be hard.

The other area that jumped out was encouraging harm through challenges and stunts. That immediately brought “Jackass” to mind, or the Welsh version, “Dirty Sanchez”, which I am sure is a show that everyone in the House watched avidly. It is available on TV and, talking about equality, one can go online and watch it. It is people doing ridiculous, dangerous things; it is enjoyed by teenagers and is legal and acceptable. My working assumption has to be that we are expecting platforms to distinguish a new dangerous stunt such as the choking game—such things really exist—from a ridiculous “Jackass” or “Dirty Sanchez” stunt, which has existed for years and is accessible elsewhere.

The point that I am making in the round is that it is great to have these priority harms in the Bill, but it is going to be very difficult to implement them in a meaningful way, one whereby we catch the genuinely harmful stuff without over-restricting. But that is the task that we have set Ofcom and the platforms. The more we can make clear to people out there what we are expecting to happen, the better. We are not expecting a blanket ban on all ridiculous teenage humour or activity; we are expecting a nuanced response. That is really helpful as we go through the debate.