Social Media: Deaths of Children

(Limited Text - Ministerial Extracts only)

Thursday 20th January 2022

Grand Committee
The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Parkinson of Whitley Bay) (Con)

My Lords, I start by thanking the noble Baroness, Lady Kidron, for tabling this important debate, and for beginning by setting out the personal and often harrowing examples that should be uppermost in our minds as we debate these important issues. I am grateful, too, for her drawing to my attention the 10-minute rule Bill introduced in another place on Tuesday by Ian Paisley MP. I have read the transcript of his remarks, in addition to listening to the contributions of noble Lords today.

Her Majesty’s Government share the concerns raised by noble Lords today about the risks to children of harmful content and activity online, including in social media. Although many children have a positive experience online—it is important to remember that—it is clear that the presence of harmful material, and in particular content promoting suicide or self-harm, can have a serious impact on children’s mental health and well-being. The noble Baroness, Lady Boycott, was right to point to the fragility and vulnerability of young people, including adolescents and people well into their teens.

Sadly, we know from research, from coroners’ reports and from colleagues in the police that harmful online content, including that seen on social media, is playing an increasing role in individual suicides. In addition, figures from 2020 show that 40% of 12 to 15-year-olds who are concerned about and have experienced content promoting self-harm cite social media as the source. There is also evidence that gangs are using social media to promote gang culture, to taunt each other, and to incite violence.

The Government are determined to hold social media and other platforms to account for this harmful content, and to make the UK the safest place to be a child online. A key part of that is the online safety Bill, which, as noble Lords know, we published in draft last May. For the first time, under that Bill platforms will have a clear legal responsibility for keeping their users safe online. Platforms will have to understand the risk of harm to their users, and put systems and processes in place that improve their users’ safety.

All companies within the scope of the Bill will have to remove and limit the spread of illegal content, such as material that encourages or assists suicide, and take steps to prevent similar material from appearing. The largest tech companies will also be held to account for tackling activity and content harmful to adults who use their service. Under the new laws, those companies will have to set out clearly what content is acceptable on their platforms, and enforce their terms and conditions consistently. That will enable us to address many of the questions raised by my noble friend Lord Balfe, and to hold companies to account.

The strongest protections in the legislation will be for children. Services likely to be accessed by children will need to conduct a child safety risk assessment and provide safety measures for their child users against harmful and age-inappropriate content. Platforms likely to be accessed by children must consider the risks that children on their services face from priority harmful content—that will be set out in secondary legislation—and any other content they may identify that could cause harm to children. They will also need to consider the risk of harm from the design and operation of their systems.

We expect priority harms for children to include age-inappropriate material, such as pornography and extreme violence, and harmful material such as that which promotes eating disorders and self-harm, as well as cyberbullying. Ahead of designating the “priority harms”, which will be in scope of the legislation, the Government have commissioned research to build the evidence base on harms to children online. This research will review the prevalence and impact of a wide range of harmful content to ensure that the legislation adequately protects children from content that is harmful to them. Ofcom will have a duty to advise the Government on priority categories of harm to children and will also want to draw on evidence and views from relevant parties. That includes Barnardo’s, as raised by the noble Baroness, Lady Benjamin. I am pleased to say that my honourable friend the Minister for Tech and the Digital Economy has already met Barnardo’s in that regard.

The regulator, Ofcom, will set out the steps that companies can take to comply with their duties through statutory codes of practice. Platforms will then be required to put in place systems and processes to mitigate the risks that they have identified. Ofcom will hold companies to account both on the standard of their risk assessments and on the safety measures that they adopt, and it can take enforcement measures if either of these falls short of what is expected. The approach that we are taking means that children will be much less likely to encounter harmful content in the first place, and platforms will no longer, for example, be able to target harmful material at children through the use of algorithms, as the noble Baroness, Lady Kidron, mentioned.

The noble Baroness, Lady Benjamin, asked why the Government cannot in the meantime bring in Part 3 of the Digital Economy Act. The Government have taken the decision to deliver the objective of protecting children from online pornography through the online safety Bill, which we are confident will provide much greater protection to children than Part 3 of the Digital Economy Act would, as it also covers social media companies, where a considerable quantity of pornographic material is available to children at the moment. Commencing Part 3 of the Digital Economy Act as an interim measure would also not be as quick a solution as I think the noble Baroness imagines. The Government would have to designate a new regulator, and that regulator would need to produce and consult on statutory guidance. The Government would then need to lay regulations before Parliament ahead of any new regime coming into force. That is why we are keen, as noble Lords have said today that they are as well, to do this through the online safety Bill and to do it swiftly.

We expect companies within the scope of the online safety Bill’s duties to use age-assurance technologies to prevent children from accessing content that poses the highest risk of harm. Standards have an important role to play in delivering that, and Ofcom will be able to include standards for age assurance as part of its regulatory codes. Companies will either need to follow the steps in the codes, including using these standards, or demonstrate that they achieve an equivalent outcome.

The noble Baroness, Lady Kidron, asked whether the Bill would make reference to the United Nations Convention on the Rights of the Child. I cannot pre-empt the Government’s response in full to the Joint Committee on which she served, but I note in the meantime that the Bill reflects the three principles of the general comments: for the best interests of the child to be a primary consideration; on children’s right to life, survival and development; and respect for the views of the child. Of course, on that and all the recommendations, the Government will respond in full to the Joint Committee, for whose work we are very grateful.

As the noble Lord, Lord Addington, says, regulation of this nature will require effective enforcement. We have confirmed our decision to appoint Ofcom as the regulator and our intention to give it a range of enforcement powers, which will include substantial fines and, in the most serious cases, blocking. There will also be a criminal offence for senior managers who fail to ensure that their company complies with Ofcom’s information requests, to push strong compliance in this area. Ofcom will also be required to set out in enforcement guidance how it will take into account any impact on children due to a company’s failure to fulfil its duty of care.

The Bill will apply to companies that host user-generated content or enable user-to-user interaction, as well as to search services. We have taken this approach to ensure that the Bill captures the services that pose the greatest risk of harm to users and where regulatory oversight is currently limited, without placing disproportionate regulatory burdens elsewhere.

I know that the noble Baroness and the Joint Committee have recommended aligning the scope of these measures with that of the age-appropriate design code. We are grateful for their consideration of this important issue as well. It is vital that any approach is proportionate and remains workable for businesses and Ofcom to ensure that the framework is as effective as possible. We are carefully considering the Joint Committee’s recommendations and are committed to introducing the Bill as soon as possible in this parliamentary Session. In the meantime, we are working closely with Ofcom to ensure that the implementation of the framework is as swift as possible, following passage of the legislation.

I will say a bit more about the interim measures that we are taking, as noble Lords rightly asked about that. We have a comprehensive programme of work to protect children online until the legislation is in force. Ahead of the Bill, the video-sharing platform and video-on-demand regimes are already in force, with Ofcom as the regulator. They include requirements to protect children from harmful online content such as pornography. In addition, the Government have published an interim code of practice for providers to tackle online child sexual exploitation and abuse.

The noble Baroness, Lady Prashar, mentioned our work in asking the Law Commission to review existing legislation on abusive and harmful communications. The Law Commission has published its final report, putting forward recommendations for reform. These include a recommended new offence to tackle communications that encourage or incite self-harm. The Government are considering the recommendations and will set out our position in due course.

As the noble and right reverend Lord, Lord Harries of Pentregarth, said, every death is sad—many are tragic, but they are incredibly so when they involve a young person. The Government recognise the difficulties that some bereaved parents have experienced when accessing their loved ones’ data. Disclosure of data relating to a deceased person is not prevented by the UK’s data protection legislation. As the noble Lord, Lord Allan of Hallam, noted, some companies operate policies of non-disclosure to third parties, including parents, unless a user has taken active steps to nominate a person who may access his or her account after he or she dies or if there is a legal obligation to disclose the data.

We are discussing this issue with companies directly. Officials met Instagram on 22 December, for instance. We are also in discussion with the Information Commissioner’s Office about digital assets. It is important to recognise, as the Joint Committee did, that an automatic right of access is unlikely to be appropriate in every case. Some people might be concerned about the disclosure of private information or other digital assets to third parties after their death.

The Government are grateful to the Joint Committee for its recommendations in this area. While I cannot make any commitment or pre-empt the Government’s response in full, I am happy to say that we will continue to give careful consideration to this before we respond and outline our proposed next steps.

It is worth noting that coroners already have statutory powers to require evidence to be given or documents to be produced for the purpose of their inquests—this would include relevant digital data following the death of a child—with sanctions where such evidence is not given or documents produced. They are well aware of these powers.

The right reverend Prelate the Bishop of St Albans mentioned his Private Member’s Bill. As he knows, the Coroners and Justice Act 2009 is clear that it is beyond a coroner’s powers to determine why somebody died. Coroners’ investigations are about determining who died, how, when and where, but not why. However, he is right that more can be done to understand some of those circumstances. We recognise that quality information on the circumstances leading to self-harm and suicide can support better interventions to prevent them in the first place. The Department of Health and Social Care is considering including questions on gambling as part of the adult psychiatric morbidity survey this year to help establish the prevalence of suicidal tendencies linked to gambling and to improve its evidence base. As the right reverend Prelate knows, we are taking a close look at the Gambling Commission’s powers as part of our review of the Gambling Act.

The Government are deeply concerned about the impact of harmful content and activity online on children. We are committed to introducing legislation as soon as possible to ensure that platforms are held to account for this content so that future generations can have a healthy relationship with the internet. I look forward to debating that Bill when it comes to this House. In the meantime, I thank noble Lords for their contributions to today’s debate.