Baroness Kidron (CB)

I want to address the point that these amendments cannot be seen in isolation. Noble Lords will remember that we had a long and good debate about what constituted harms to children. There was a big argument, and the Minister made some warm noises about putting harms to children in the Bill. There is some alignment among many people in the Chamber that we, and Parliament, would like to determine what harm is, and I very much share the noble Baroness’s concern about pointing out what that is.

On the issue of the system versus the content, I am not sure that this is the exact moment, but the idea of unintended consequences keeps getting thrown up when we talk about trying to point the finger at what creates harm. There are unintended consequences now, except that neither Ofcom, the Secretary of State nor Parliament has any say in what those unintended consequences are; only the tech sector does. As someone who has been bungee jumping, I am deeply grateful that there are very strict rules under which that is allowed to happen.

The Lord Bishop of Gloucester

My Lords, I support the amendments in this group that, with regard to safety by design, will address functionality and harms—whatever exactly we mean by that—as well as child safety duties and codes of practice. The noble Lord, Lord Russell, and the noble Baronesses, Lady Harding and Lady Kidron, have laid things out very clearly, and I wish the noble Baroness, Lady Kidron, a happy birthday.

I also support Amendment 261 in the name of my right reverend friend the Bishop of Oxford and supported by the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville. This amendment would allow the Secretary of State to consider safety by design, and not just content, when reviewing the regime.

As we have heard, a number of the amendments would amend the safety duties to children to consider all harms, not just harmful content, and we have begun to have a very interesting debate on that. We know that service features create and amplify harms to children. These harms are not limited to spreading harmful content; features in and of themselves may cause harm—for example, beautifying filters, which can create unrealistic body ideals and pressure on children to look a certain way. In all of this, I want us to listen much more to the voices of children and young people—they understand this issue.

Last week, as part of my ongoing campaign on body image, including how social media can promote body image anxiety, I met a group of young people from two Gloucestershire secondary schools. They were very good at saying what the positives are, but noble Lords will also be very familiar with many of the negative issues that were on their minds, which I will not repeat here. While they were very much alive to harmful content and the messages it gives them, they were keen to talk about the need to address algorithms and filters that they say feed them strong messages and skew the content they see, which might not look harmful but, because of design, accentuates their exposure to issues and themes about which they are already anxious. Suffice to say that underpinning most of what they said to me was a sense of powerlessness and anxiety when navigating the online world that is part of their daily lives.

The current definition of content does not include design features. Building in a safety by design principle from the outset would reduce harms in a systematic way, and the amendments in this group would address that need.

Baroness Fraser of Craigmaddie (Con)

My Lords, I support this group of amendments. Last week, I was lucky—that is not necessarily the right word—to participate in a briefing organised by the noble Lord, Lord Russell of Liverpool, with the 5Rights Foundation on its recent research, to which the noble Lord referred. As the mother of a 13-year-old boy, I came away wondering why on earth you would not want to ensure safety by design for children.

I am aware from my work with disabled children, as Ofcom knows from its own research, that children—or indeed anyone with a long-term health condition or a disability—are far more likely to encounter and suffer harm online. As I say, I struggle to see why you would not want to have safety by design.

This issue must be seen in the round. In that briefing we were taken through how quickly you could get from searching for something such as “slime” to extremely graphic pornographic content. As your Lordships can imagine, I went straight back to my 13-year-old son and said, “Do you know about slime, and where have you seen it?” He said, “Yes, Mum, I’ve watched it on YouTube”. That echoes the point made by the noble Baroness, Lady Kidron—to whom I add my birthday wishes—that these issues have to be seen in the round, because you do not just consume content; you can search on YouTube, shop on Google, search on Amazon and all the rest of it. I support this group of amendments.