Thu 6th Jul 2023
Online Safety Bill
Lords Chamber

Report stage: Part 2

Debate between Baroness Kidron and Lord Russell of Liverpool
Lord Russell of Liverpool (CB)

My Lords, I will speak to my Amendments 281 to 281B. I thank the noble Baronesses, Lady Harding and Lady Kidron, and the noble Lord, Lord Knight, for adding their names to them. I will deal first with Amendments 281 and 281B, then move to 281A.

On Amendments 281 and 281B, the Minister will recall that in Committee we had a discussion around how functionality is defined in the Bill and that a great deal of the child risk assessments and safety duties must have regard to functionality, as defined in Clause 208. However, as it is currently written, this clause appears to separate out functionalities of user-to-user services and search services. These two amendments are designed to adjust that slightly, to future-proof the Bill.

Why is this necessary? First, it reflects that, in future, many of the functionalities we currently see on user-to-user services are likely to appear on search services, and possibly vice versa. Therefore, we need to take account of how the world is likely to move. Secondly, this is already happening, and it poses a risk to children. Research by the 5Rights Foundation has found that "predictive search", counted in the Bill as a search service functionality, is present on social media websites, leading one child using a search bar to be presented in nanoseconds with prompts associated with eating disorders. In Committee, the Minister noted that the functionalities listed in this clause are non-exhaustive. At the very least, it would be helpful to make that clear in the language of the Bill.

Amendment 281A would add to the Bill specific functionalities which we know are addictive or harmful to children. We have a great deal of research and evidence demonstrating how persuasive certain design strategies are with children. These are features designed solely to keep users on the platform, at any cost, as much as possible and for as long as possible. The more that children are on the platform, the more harm they are likely to suffer. Given that the purpose of this Bill is for services to be safe by design, as set out usefully in Amendment 1, please can we make sure that where we know, and we do know, that risk exists, we are doing our utmost to tackle it?

The features listed in this amendment are known as "dark patterns", and they are known as "dark patterns" for a very good reason. They are persuasive and pervasive design features, deliberately baked into the design of digital services and products to capture and hold, in this case, children's attention, and to create habitual, even compulsive behaviours. The damage this does to children is proven and palpable. For example, one of the features mentioned is infinite scroll, which is now ubiquitous on most major social media platforms. The inventor of infinite scroll, a certain Aza Raskin, who probably thought it was a brilliant idea at the time, has said publicly that he now deeply regrets ever introducing it, because of the effect it is having on children.

One of the young people who spoke to the researchers at 5Rights said of the struggle they have daily with the infinite scroll feature:

“Scrolling forever gives me a sick feeling in my stomach. I’m so aware of how little control I have and the feeling of needing to be online is overwhelming and consuming”.

Features designed to keep users online at any cost, which may be fine for adults but is not fine for children, are taking a real toll. Managing public and frequent interactions online, which these features encourage, creates the most enormous pressures for young people, and with that comes anxiety, low self-esteem and mental health challenges. This is only increasing and, unless we are very specific about these features, the harms will continue.

We have the evidence. We know what poses harm and risk to children. Please can we make sure that this is reflected accurately in the Bill?

Baroness Kidron (CB)

My Lords, I rise briefly to support many of the amendments in this group. I will start with Amendments 281, 281A and 281B in the name of my noble friend Lord Russell, to which I have added my name. The noble Lord set out the case very well. I will not reiterate what he said, but it is simply the case that the features and functionalities of regulated companies should not be separated by search and user-to-user but should apply across any regulated company that has that feature. There is no need to worry about a company that does not have one of the features on the list; it is a much more dangerous thing for a feature to be absent from the list than it is to have a single list and hold companies responsible for their features.

Only this morning, Meta released Threads as its challenger to Twitter. In the last month, Snapchat added generative AI to its offering. Instagram now does video, and TikTok does shopping. All these companies are moving into a place where they would like to be the one that does everything. That is their commercial endgame, and that is where the Bill should set its sights.

Separating out functionality and, as the noble Lord, Lord Russell, said, failing to add what we already know, puts the Bill in danger of looking very old before the ink is dry. I believe it unnecessarily curtails Ofcom in being able to approach the companies for what they are doing, rather than for what the Bill thought they might be doing at this point. So, if the Minister is not in a position to agree to the amendment, I urge him at least to take it away and have a look at it, because it is a technical rather than an ideological matter. It would be wonderful to fix it.