Online Safety Bill Debate
Lords Chamber

My Lords, as so often, it is a pleasure to follow the noble Baronesses, Lady Harding and Lady Kidron, and to support this group of amendments, especially those to which I put my name. I thank the Minister and the Secretary of State for the many amendments they are introducing, including in the last group, on which I was not able to speak for similar reasons to other noble Lords. I especially note Amendment 1, which makes safety by design the object of the Bill and which implicitly endorses the amendments we are speaking to this afternoon, each of which is consistent with that object of safety by design running through the Bill.
As others have said, this is an immensely complex Bill, and anything which introduces clarity for the technology companies and the users is to be welcomed. I particularly welcome the list in Amendment 281F, which the noble Baroness, Lady Kidron, has already read aloud and which spells out very clearly the harm which results from functionality as well as content. It is imperative to have that in the Bill.
In Committee, I referred to the inequality of harms between the user of a service and the forces arrayed against them. You may like to imagine a child of eight, 12 or 15 using one of the many apps we are discussing this afternoon. Now imagine the five As as forces arrayed against them; they are all about functionality, not content. We must consider: the genius of the advertising industry, which is designed on a commercial basis for sales and profit; the fact that processes, applications and smartphones mean that there is 24/7 access to those who use these services and no escape from them; the creation of addictions by various means of rewarding particular features, which have little to do with content and everything to do with design and function; the creative use of algorithms, which will often be invisible and undetectable to adult users and certainly invisible to children; and the generation of still more harms through artificial intelligence, deepfakes and all the harms resulting from functionality. Advertising, access, addiction, algorithms and artificial intelligence are multiplying harms in a range of ways, which we have heard discussed so movingly today.
The sheer quantity of harm means the socialisation and normalisation of, and the creation of, online environments which are themselves toxic and which would be completely unacceptable offline. I very much hope, alongside others, that the Government will give way on these amendments and build the naming of functionality and harm into the Bill.
My Lords, I will speak, in part, to two amendments which bear my name and to which my noble friend Lady Kidron referred: Amendments 46 and 90, on the importance of dissemination and not just content.
Rather than repeating that argument in the abstract, a more effective way of making the same point is to personalise it, by trying to give your Lordships an understanding of the experience taking place, day in, day out, for many young people. I address this not only to the Minister and the Bill team but, quite deliberately, to the Office of the Parliamentary Counsel. I know full well that the Bill has been many years in gestation and that, because the online world, technology and now AI are moving so fast, it is almost impossible for the Bill and its architecture to keep pace with them. But that is not a good reason for not listening to and accepting the force of the argument which my noble friend Lady Kidron and many others have put forward.
Last week, on the first day on Report, when we were speaking to a group of amendments, I spoke to your Lordships about a particular functionality called dark patterns: a variety of features built into the design of these platforms to drive more and more volume and usage.
The individual whose journey I will be describing is called Milly. Milly is online and accepts an automatic suggestion on a search bar. Let us say it is about weight loss. She starts to watch videos that she would not otherwise have found. The videos she is watching are on something called infinite scroll, so one just follows another that follows another, potentially ad infinitum. To start with, she sees video after video of people sharing tips about dieting and showing how happy they are after losing weight. As she scrolls and interacts, the women she sees mysteriously seem to get thinner and thinner. The platform’s content dispersal strategy—if indeed it has one, because not all do—that tempers the power of the algorithm has not yet kicked in. The Bill does not address this because, individually, not a single one of the videos Milly has been watching violates the definition of primary priority content. What the platform is doing is coding an algorithm to meet a child’s apparent desire to view increasingly thin women.
The videos that Milly sees are captioned with a variety of hashtags such as #thinspo, #thighgap and #extremeweightloss. If she clicks on those, she will find more extreme videos and will start to click on the accounts that have posted the content. Suddenly, she is exposed to the lives of people who are presenting disordered eating not just as normal but as aspirational. Developmentally, Milly is at an age where she does not have the critical thinking skills to evaluate what she is seeing. She has entered a world that she is too young to understand and would never have found were it not for the design of the platform. Throughout her journey thus far, she has yet to see a single video that meets the threshold of primary priority harm content. This world is the result of cumulative design harms.
She follows some of the accounts, which prompts the platform to recommend similar accounts. Many of the accounts recommended to her are even more extreme. They are managed by people who have active eating disorders but see what is known as their pro-ana status—that is, pro-anorexia—as a lifestyle choice rather than a mental health issue. These accounts are very savvy about the platform’s community guidelines, so the videos and the language they use are coded specifically to avoid detection.
Every aspect of the way Milly is interacting with the platform has now been polluted. It is not just the videos she sees. It is the autocomplete suggestions she gets on searches. It is the algorithmically determined account recommendations. It is the design strategies that make it impossible for her to stop scrolling. It is the notifications she receives encouraging her back to the platform to watch yet another weight-loss video or follow yet another account. It is the filters and effects she is offered before she posts. It is the number of likes her videos get. It goes on and on, and the Bill as it stands will fail Milly. This is why I am talking directly to the Minister and the Office of the Parliamentary Counsel, because they need to sort this out.
Earlier this afternoon, before we began this debate, I was talking to Dr Kaitlyn Regehr, an associate professor in digital humanities at UCL. We were talking about incels—involuntary celibates—and the strange world they live in, and she made a comment that struck me, which I wrote down word for word. She said:
“One off-day seeds the algorithm. The algorithm will focus on that and amplify that one off-day”—
that one moment when we click on something and it suddenly takes us into a world, and in a direction, that we had no idea existed and, more importantly, that because of the way these platforms are designed we feel we have no control over. We really must do something about this.
My Lords, I rise to support the amendments in the names of the intrepid noble Baroness, Lady Kidron, the noble Lord, Lord Stevenson, the noble Baroness, Lady Harding, and the right reverend Prelate the Bishop of Oxford. They fit hand in hand with the amendments debated in the previous group. Sadly, I was unable to take part in that debate because of a technical ruling, but I thank the Minister for his kind words and other noble Lords for what they have said. My heart is broken, because that debate included age verification, for which I have campaigned for the past 12 years. I want to thank the Government for finally accepting that children need to be protected from harmful online content; pornography is one example, and it is the gateway to many other harms.
My Lords, the final issue I raised in Committee is dealt with in this group: so-called proportionality. I tabled amendments in Committee to ensure that, under Part 3, no website or social media service with pornographic content could argue that it should be exempt from implementing age verification under Clause 11 because to do so would be disproportionate based on its size and capacity. I am pleased today to be a co-signatory to Amendment 39, tabled by the noble Lord, Lord Bethell, to do just that.
The noble Lord, Lord Russell, and the noble Baroness, Lady Kidron, have also tabled amendments which raise similar points. I am disappointed that despite all the amendments tabled by the Minister, the issue of proportionality has not been addressed; maybe he will give us some good news on that this evening. It feels like the job is not quite finished and leaves an unnecessary and unhelpful loophole.
I will not repeat in depth all the arguments I made in Committee, but I will briefly recap. We all know that in the offline world we expect consistent regulation, regardless of size, when it comes to protecting children. We do not allow a small corner shop to act differently from a large supermarket on the sale of alcohol or cigarettes. In a similar online scenario, we do not expect small and large gambling websites to regulate children’s access to gambling differently.
We know that the impact of pornographic content on children is the same whether it is accessed on a large pornographic website or a small social media platform. We know from the experience of France and Germany that pornographic websites will do all they can to evade age verification. As the noble Lord, Lord Stevenson, said on the eighth day of Committee, whether pornography
“comes through a Part 3 or Part 5 service, or accidentally through a blog or some other piece of information, it has to be stopped. We do not want our children to receive it. That must be at the heart of what we are about, and not just something we think about as we go along”.—[Official Report, 23/5/23; col. 821.]
By not shutting off the proportionality argument, the Government are allowing different-sized online services to act differently on pornography and all the other primary priority content, as I raised in Committee. At that stage, the noble Baroness, Lady Kidron, said,
“we do not need to take a proportionate approach to pornography”.—[Official Report, 2/5/23; col. 1481.]
Amendment 39 would ensure that pornographic content is treated as a separate case with no loopholes for implementing age verification based on size and capacity. I urge the Minister to reflect on how best we can close this potential loophole, and I look forward to his concluding remarks.
My Lords, I will briefly address Amendments 43 and 87 in my name. I thank the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Knight, for adding their names to these amendments. They are complementary to the others in this group, on which the noble Lord, Lord Bethell, and the noble Baroness, Lady Ritchie, have spoken.
In Committee the Minister argued that it would be unfair to place the same child safety duties across all platforms. He said:
“This provision recognises that what it is proportionate to require of providers at either end of that scale will be different”.—[Official Report, 2/5/23; col. 1443.]
Think back to the previous group of amendments we debated. We talked about functionality and the way in which algorithms drive these systems. They drive you in all directions: to a large platform with every bell and whistle you might anticipate, because it complies with the legislation, but also, willy-nilly and without any conscious thought, because that is how the system is designed, to a much smaller site. If we do not amend the legislation as it stands, those algorithms will take you to smaller sites that are not required to meet the same level of safety duties, particularly towards children. I think we all fail to understand the logic behind that argument.