Baroness Benjamin (LD)

My Lords, I speak in support of these amendments with hope in my heart. I thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, for leading the charge with such vigour, passion and determination: I am with them all the way.

The Government have said that the purpose of the Bill is to protect children, and it rests on our shoulders to make sure it delivers on this mission. Last week, on the first day in Committee, the Minister said:

“Through their duties of care, all platforms will be required proactively to identify and manage risk factors associated with their services in order to ensure that users do not encounter illegal content and that children are protected from harmful content. To achieve this, they will need to design their services to reduce the risk of harmful content or activity occurring and take swift action if it does.”—[Official Report, 19/4/23; cols. 274-75.]

This is excellent and I thank the Government for saying it. But the full range of harms and risk to children will not be mitigated by services if they do not know what they are expected to risk-assess for and if they must wait for secondary legislation for this guidance.

The comprehensive range of harms children face every day is not reflected in the Bill. This includes sexual content that does not meet the threshold of pornography. This was highlighted recently in an investigation into TikTok by the Telegraph, which found that a 13 year-old boy was recommended a video about the top 10 porn-making countries, and that a 13 year-old girl was shown a livestream of a pornography actor in her underwear answering questions from viewers. This content is being marketed to children without a user even seeking out pornographic content, but this would still be allowed under the Bill.

Furthermore, high-risk challenges, such as the Benadryl and blackout challenges, which encourage dangerous behaviour on TikTok, are not dealt with in the Bill. Some features, such as the ability of children to share their location, are not dealt with either. I declare an interest as vice-president of Barnardo’s, which has highlighted how these features can be exploited by organised criminal gangs that sexually exploit children to keep tabs on them and trap them in a cycle of exploitation.

It cannot be right that the user-empowerment duties in the Bill include a list of harmful content that services must enable adults to toggle off, yet the Government refuse to produce this list for children. Instead, we have to wait for secondary legislation to outline harms to children, causing further delay to the enforcement of services’ safety duties. Perhaps the Minister can explain why this is.

The four Cs framework of harm, as set out in these amendments, is a robust framework that will ensure service risk assessments consider the full range of harms children face. I will repeat it once again: childhood lasts a lifetime, so we cannot fail children any longer. Protections are needed now, not in years to come. We have waited far too long for this. Protections need to be fast-tracked and must be included in the Bill. That is why I fully support these amendments.

Lord Knight of Weymouth (Lab)

My Lords, in keeping with the Stevenson-Knight double act, I am leaving it to my noble friend to wind up the debate. I will come in at this point with a couple of questions and allow the Minister to have a bit of time to reflect on them. In doing so, I reinforce my support for Amendment 295 in the name of the noble Lord, Lord Russell, which refers to volume and frequency also being risk factors.

When I compare Amendment 20 with Clause 10(6), which refers to children’s risk assessments and what factors should be taken into account in terms of the risk profile, I see some commonality and then some further things which Amendment 20, tabled by the noble Baroness, Lady Kidron, adds. In my opinion, it adds value. I am interested in how the Minister sees the Bill, as it stands currently, covering some issues that I will briefly set out. It would be helpful for the Committee to understand whether there are ways in which the Bill already deals with some of the issues so wonderfully raised by the noble Baroness, and to flush those out.

I do not see proposed new subsection (b)(iii),

“risks which can build up over time”,

mentioned in the Bill, nor explicit mention of proposed new subsection (b)(iv),

“the ways in which level of risks can change when experienced in combination with others”,

which I think is critical in terms of the way the systems work. Furthermore, proposed new subsection (b)(vii),

“the different ways in which the service is used including but not limited to via virtual and augmented reality technologies”,

starts to anticipate some other potential harms that may be coming very rapidly towards us and our children. Again, I do not quite see it included. I see “the design of functionalities”, “the business model” and “the revenue model”. There is a lot about content in the original wording of the Bill, which is less so here, and, clearly, I do not see anything in respect of the UN Convention on the Rights of the Child, which has been debated in separate amendments anyway. I wanted to give the Minister some opportunity on that.

Lord Bethell (Con)

My Lords, I restate my commitment to Amendments 20, 93 and 123, which are in my name and those of the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Oxford and the noble Lord, Lord Stevenson, as well as to the noble Baroness’s Amendment 74. It is a great honour to follow the noble Lord, Lord Knight. He put extremely well some key points about where there are gaps in the existing Bill. I will build on his remarks to explain why we have brought forward these amendments to plug those gaps.

In doing so, I wish to say that it has been a privilege to work with the right reverend Prelate, the noble Baroness and the noble Lord, Lord Stevenson. We are not from the same political geographies, but that collaboration demonstrates the breadth of the political concern, and the strength of feeling across the Committee, about these important gaps when it comes to harms—gaps that, if not addressed, will put children at great risk. In this matter we are very strongly united. We have been through a lot together, and I believe this unlikely coalition demonstrates how powerful the feelings are.

It has been said before that children are spending an increasing amount of their lives online. However, the sharpness of that inflection point in the last few years has been understated, as has how much further it has to go. Mobile phone penetration among 10 year-olds is already around 75%—it is getting younger, and it is getting broader.

In fact, the digital world is totally inescapable in the life of a child, whether that is for a young child who is four to six years old or an older child who is 16 or 17. It is increasingly where they receive their education—I do not think that is necessarily a good thing, but that is arguable—it is where they establish and maintain their personal relationships and it is a key forum for their self-expression.

For anyone who suspects otherwise, I wish to make it clear that I firmly believe in innovation and progress, and I regard the benefits of the digital world as really positive. I would never wish to prevent children accessing the benefits of the internet, the space it creates for learning and building community, and the opportunities it opens for them. However, environments matter. The digital world is not some noble wilderness free from original sin or a perfect, frictionless marketplace where the best, nicest, and most beautiful ideas triumph. It is a highly curated experience defined by the algorithms and service agreements of the internet companies. That is why we need rules to ensure that it is a safe space for children.

I started working on my first internet business in 1995, nearly 30 years ago. I was running the Ministry of Sound, and we immediately realised that the internet was an amazing way of getting through to young people. Our target audiences were either clubbers aged over 18 or the younger brothers and sisters of clubbers who bought our merchandise. The internet gave us an opportunity to get past all the normal barriers—past parents and regulation to reach a wonderful new market. I built a good business and it worked out well for me, but those were the days before GDPR and what we understand from the internet. I know from my experience that we need to ensure that children are protected and shielded from the harms that bombard them, because there are strong incentives—mainly financial but also other, malign incentives—for bad actors to use the internet to get through to children.

Unfortunately, as the noble Baroness, Lady Kidron, pointed out, the Bill as it stands does not achieve that aim. Take, for example, contact harms, such as grooming and child sexual abuse. In February 2020, Bark, a US-based organisation that helps families manage and protect their children’s digital lives, launched an 11 year-old persona online who it called Bailey. Bailey’s online persona clearly shows that she is an ordinary 11 year-old, posting content that is ordinary for an 11 year-old. Within 30 seconds of her persona being launched online she received a like from a man whose profile picture was a penis. Within two minutes, multiple messages were received from men, and within five minutes a video call. Shortly afterwards, she received requests from men to meet up. I remind your Lordships that Bailey was 11 years old. These are not trivial content harms; these are attempts to contact a minor using the internet as a medium.

--- Later in debate ---
Baroness Harding of Winscombe (Con)

Where are the commercial harms? I cannot totally get my head around my noble friend’s definition of content. I can sort of understand how it extends to conduct and contact, but it does not sound as though it could extend to the algorithm itself that is driving the addictive behaviour that most of us are most worried about.

Lord Knight of Weymouth (Lab)

In that vein, will the noble Lord clarify whether that definition of content does not include paid-for content?

Lord Parkinson of Whitley Bay (Con)

I was about to list the four Cs briefly in order, which will bring me on to commercial or contract risk. Perhaps I may do that and return to those points.

I know that there have been concerns about whether the specific risks highlighted in the new schedule will be addressed by the Bill. In terms of the four Cs category of content risks, there are specific duties for providers to protect children from illegal content, such as content that intentionally assists suicide, as well as content that is harmful to children, such as pornography. Regarding conduct risks, the child safety duties cover harmful conduct or activity such as online bullying or abuse and, under the illegal content safety duties, offences relating to harassment, stalking and inciting violence.

With regard to commercial or contract risks, providers specifically have to assess the risks to children from the design and operation of their service, including their business model and governance under the illegal content and child safety duties. In relation to contact risks, as part of the child safety risk assessment, providers will need specifically to assess contact risks of functionalities that enable adults to search for and contact other users, including children, in a way that was set out by my noble friend Lord Bethell. This will protect children from harms such as harassment and abuse, and, under the illegal content safety duties, all forms of child sexual exploitation and abuse, including grooming.