Thursday 11th July 2019

Lords Chamber
Baroness Kidron (CB)

I thank the right reverend Prelate for tabling today’s debate and draw the attention of the House to my interests as set out in the register. I very much welcome the Church of England’s social media guidelines. They have great force in their simplicity and generosity of spirit, and clearly outline our responsibilities to conduct our online interactions respectfully and honestly. I will focus my contribution on how they might be applied to the social media companies themselves.

For example, the first guideline is:

“Be safe. The safety of children, young people and vulnerable adults must be maintained”.


Far from taking reasonable steps to maintain the safety of children or to support their emotional and social development, social media companies refuse even to recognise the global consensus that a child is a person under the age of 18, as codified by the Convention on the Rights of the Child. Tick a box and a child of 13 can gain access to an environment that routinely exposes them to adult risks and deprives them of the rights that we have fought for decades to establish. Furthermore, minimum age limits are routinely bypassed and poorly enforced, a fact freely admitted by both Snap and Facebook when they appeared before Parliament in recent months. This leaves children of all ages unprotected through many of their most vulnerable years. For children to be safe online, social media companies first have to provide a safe environment.

A similar scenario unfolds when you consider the guideline:

“Be honest. Don’t mislead people about who you are”.


The spread of misinformation and disinformation polarises debate, impacts on elections, drives the rise in intolerance and fuels spurious health claims and conspiracy theories. This is an area of considerable attention for legislators around the globe but, while much is said about those who create the misinformation, it is important to note that the platforms are not neutral bystanders. In an attention economy where clicks mean money, and where the longer someone stays online the greater your opportunity to serve them an ad or learn something about them that you can sell later, the spread of the extraordinary, the extreme or the loud is not an unintended consequence of your service; it becomes central to its purpose.

Being honest is not only about information but about the nature of the service itself. When we walk into a tea room, a cinema, a pub or a strip club, we understand the opportunities and risks that those environments offer and are given nuanced indicators about their suitability for ourselves or our children. Social media companies, by contrast, parade as tea rooms but behave like strip clubs. A simple answer would be greater honesty about the nature of the service on offer.

This leads me quite neatly to the guidance to,

“Follow the rules. Abide by the terms and conditions”.


Terms and conditions should enable users to decide whether a service is offering them an environment that will treat them fairly. They are, by any measure, a contract between user and platform; it is therefore unacceptable that these published rules are so opaque, so asymmetrical in the distribution of rights and responsibilities, so interminably long—and then so inconsistently and poorly upheld by the platforms themselves.

This failure to follow the rules is not without consequence. Noble Lords will remember the case of Molly Russell, who took her own life in 2017 after viewing and being auto-recommended graphic self-harm and suicide content. The spokesperson for one of the platforms responsible, Pinterest, said:

“Our existing self-harm policy already does not allow for anything that promotes self-harm. However, we know a policy isn’t enough. What we do is more important than what we say”.


Indeed, and while that tragedy has been widely and bravely publicised by Molly’s father, it is neither the only tragedy nor the only failure. Failure is built into the system. The responsibility for upholding terms and conditions must be a two-way street. I warmly welcome the Government’s proposal in the online harms White Paper:

“The regulator will assess how effectively these terms are enforced as part of any regulatory action”,


and I welcome the Information Commissioner’s similar commitment in the recently published age-appropriate design code.

Let me finish with this. On Monday, 22 children came to the House to see me and offer their thoughts on a 5Rights data literacy workshop that they had been doing for some months. Their observations can be usefully summed up by the fifth of the Church’s guidelines:

“Take responsibility. You are accountable for the things you do”.


These children and young people categorically understood their responsibilities, but they powerfully and explicitly expressed the requirement for the platforms to meet theirs too. It is for the platforms to make their services safe and respectful, for government to put in place the unavoidable requirement that they do so, and for the rest of us to keep speaking up until it is done. With that in mind, I commend the right reverend Prelate for his tireless work to that end and ask the Minister to reassure the House that the promises made to children and parents by the outgoing Executive will be implemented by the incoming Executive.