My Lords, as often, it is a pleasure to follow the noble Baronesses, Lady Harding and Lady Kidron, and to support this group of amendments, especially those to which I put my name. I thank the Minister and the Secretary of State for the many amendments they are introducing, including in the last group, on which I was not able to speak for similar reasons to other noble Lords. I especially note Amendment 1, which makes safety by design the object of the Bill and implicitly underpins the amendments that we are speaking to this afternoon, each of which is consistent with that object of safety by design running through the Bill.
As others have said, this is an immensely complex Bill, and anything which introduces clarity for the technology companies and the users is to be welcomed. I particularly welcome the list in Amendment 281F, which the noble Baroness, Lady Kidron, has already read aloud and which spells out very clearly the harm which results from functionality as well as content. It is imperative to have that in the Bill.
In Committee, I referred to the inequality of harms between the user of a service and the forces arrayed against them. You may like to imagine a child of eight, 12 or 15 using one of the many apps we are discussing this afternoon. Now imagine the five As as forces arrayed against them; they are all about functionality, not content. We must consider: the genius of the advertising industry, which is designed on a commercial basis for sales and profit; the fact that processes, applications and smartphones mean that there is 24/7 access to those who use these services and no escape from them; the creation of addictions by various means of rewarding particular features, which have little to do with content and everything to do with design and function; the creative use of algorithms, which will often be invisible and undetectable to adult users and certainly invisible to children; and the generation of more harms through artificial intelligence, deepfakes and all the harms resulting from functionality. Advertising, access, addiction, algorithms and artificial intelligence are multiplying harms in a range of ways, which we have heard discussed so movingly today.
The sheer quantity of harm drives the socialisation and normalisation of online environments which are themselves toxic and which would be completely unacceptable offline. I very much hope, alongside others, that the Government will give way on these amendments and build the naming of functionality and harm into the Bill.
My Lords, I will speak, in part, to the two amendments that bear my name and to which my noble friend Lady Kidron referred: Amendments 46 and 90, on the importance of dissemination and not just content.
A more effective way of making the same point is to personalise it, by trying to give your Lordships an understanding of the experience taking place, day in, day out, for many young people. I address this not only to the Minister and the Bill team but, quite deliberately, to the Office of the Parliamentary Counsel. I know full well that the Bill has been many years in gestation and, because the online world, technology and now AI are moving so fast, it is almost impossible for the Bill and its architecture to keep pace with them. But that is not a good reason for not listening to and accepting the force of the argument which my noble friend Lady Kidron and many others have put forward.
Last week, on the first day of Report, when we were speaking to a group of amendments, I spoke to your Lordships about a particular functionality called dark patterns: a variety of features built into the design of these platforms to drive ever greater volume and usage.
The individual whose journey I will be describing is called Milly. Milly is online and she accepts an automatic suggestion that appears on a search bar. Let us say it is about weight loss. She starts to watch videos that she would not otherwise have found. The videos she is watching are on something called infinite scroll, so one just follows another that follows another, potentially ad infinitum. To start off, she is seeing video after video of people sharing tips about dieting and showing how happy they are after losing weight. As she scrolls and interacts, the women she sees mysteriously seem to get thinner and thinner. The platform’s content dispersal strategy—if indeed it has one, because not all do—that tempers the power of the algorithm has not yet kicked in. The Bill does not address this because, individually, not a single one of the videos Milly has been watching violates the definition of primary priority content. What the platform is doing is coding an algorithm to meet a child’s apparent desire to view ever thinner women.
The videos that Milly sees are captioned with a variety of hashtags such as #thinspo, #thighgap and #extremeweightloss. If she clicks on those, she will find more extreme videos and will start to click on the accounts that have posted the content. Suddenly, she is exposed to the lives of people who are presenting disordered eating not just as normal but as aspirational. Developmentally, Milly is at an age where she does not have the critical thinking skills to evaluate what she is seeing. She has entered a world that she is too young to understand and would never have found were it not for the design of the platform. Throughout her journey thus far, she has yet to see a single video that meets the threshold of primary priority harm content. This world is the result of cumulative design harms.
She follows some of the accounts, which prompts the platform to recommend similar accounts. Many of the accounts recommended to her are even more extreme. They are managed by people who have active eating disorders but see what is known as their pro-ana status—that is, pro-anorexia—as a lifestyle choice rather than a mental health issue. These accounts are very savvy about the platform’s community guidelines, so the videos and the language they use are coded specifically to avoid detection.
Every aspect of the way Milly is interacting with the platform has now been polluted. It is not just the videos she sees. It is the autocomplete suggestions she gets on searches. It is the algorithmically determined account recommendations. It is the design strategies that make it impossible for her to stop scrolling. It is the notifications she receives encouraging her back to the platform to watch yet another weight-loss video or follow yet another account. It is the filters and effects she is offered before she posts. It is the number of likes her videos get. It goes on and on, and the Bill as it stands will fail Milly. This is why I am talking directly to the Minister and the Office of the Parliamentary Counsel, because they need to sort this out.
Earlier this afternoon, before we began this debate, I was talking to Dr Kaitlyn Regehr, an associate professor in digital humanities at UCL. We were talking about incels—involuntary celibates—and the strange world they live in, and she made a comment that struck me, which I wrote down word for word. She said:
“One off-day seeds the algorithm. The algorithm will focus on that and amplify that one off-day”—
that one moment when we click on something and it suddenly takes us into a world, and in a direction, that we had no idea existed and, more importantly, that we feel we have no control over because of the way these systems are designed. We really must do something about this.