Online Safety Bill Debate
Lords Chamber

I completely understand that; I was making the point that there will be disagreements of judgment. In that instance, it was resolved by a court, but here we are talking about a situation in which I am not sure how the judgment would be made.
In these amendments, there are lists of particular harms—a variety are named, including self-harm—and I wanted to offer some counterexamples of what I consider to be harms. I have been inundated by algorithmic adverts for “Naked Education” on Channel 4, perhaps because of the algorithms I am on. I think that the programme is irresponsible; I say that having watched it, rather than just having read a headline. Channel 4 presents this programme, which features naked adults and children, as educational, saying that it introduces children to the naked body. I think it is harmful for children and that it should not be on television, but it is advertised on social media—I have seen quite a lot of it.
The greatest example of self-harm we encounter at present is when gender dysphoric teenagers—as well as some younger than teenagers; they are predominantly young women—are affirmed by adults, as a kind of social contagion, into taking body-changing and body-damaging hormones and undergoing self-mutilation, whether by breast binding or double mastectomies, which is advertised and praised by adults. That is incredibly harmful for young people, and it is reflected online a lot, because much of this is discussed, advertised or promoted online.
This is related to the earlier contributions, because I am asking: should those be added to the list of obvious harms? Although not many noble Lords are in the House now, if there were many more here, they would object to what I am saying by stating, “That is not harmful at all. What is harmful is what you’re saying, Baroness Fox, because you’re causing psychological harm to all those young people by being transphobic”. I raise these matters because we assume there is a consensus on what constitutes harmful material online for young people, but it is not that straightforward.
The amendment states that the Bill should target any platform that posts “links to, or … encourages child users to seek” out “dangerous or illegal activity”. I understand “illegal activity”, but as for “dangerous” activities, I assume we do not mean extreme sports, mountain climbing and so on, which are dangerous—that example comes to mind probably because I have spent too much time with young people who spend their whole time watching such things. I worry about the unintended consequences of activities being banned or misinterpreted in that way.
To respond briefly to the noble Baroness, I shall give a specific example of how Amendment 93 would help. Let us go back to the coroner’s courtroom where the parents of Molly Russell were trying to get the coroner to understand what had happened to their daughter. The legal team from Meta was there, with combined salaries probably in seven figures, and the argument was about the detail of the content. At one point, I recall Ian Russell saying that one of the Meta lawyers said, “We are topic agnostic”. I put it to the noble Baroness that, had the provisions in Amendment 93 been in place, Meta would have been at fault under “Content harms” in proposed new paragraphs 3(c) and (d); under “Contact harms” in proposed new paragraph 4(b); under “Conduct harms” in proposed new paragraph 5(b); and under “Commercial harms” in proposed new paragraphs 6(a) and (b). That would have made things a great deal simpler.