(3 days, 19 hours ago)
Lords Chamber

My Lords, although the Government’s amendments have been put forward as a signal of their determination to act, sadly they commit to nothing. They simply buy the Minister a bit more time and the opportunity, at some unknown moment in the future, to push through a compromise half-measure with minimal parliamentary scrutiny. I am appalled at this thought on this crucial issue. The Government are asking Peers to take a gamble on our children’s safety. They are placing their faith in a consultation that delivers nothing but more and more delay.
Regulating social media companies and keeping our children safe online are among the most defining challenges of our time. That is why we should vote for the cross-party amendment from the noble Lord, Lord Nash, which would raise the age to 16 within 12 months for the most harmful platforms—to be written into law before the summer. It is the safest option for our children at this time.
The Government’s complex, 62-question consultation is heavily framed towards the positive benefits of social media rather than towards the horrific harms which front-line professionals report every single day. On age assurance, the perceived downside is emphasised over obvious benefits. There is no clear process for managing conflicts of interest within the technology industry. How can this consultation be trusted? Reliable findings are precisely what this issue demands.
It is also worrying that the Government have introduced a Henry VIII clause which would give sweeping powers via secondary legislation, leaving little or no opportunity for this House to consider or scrutinise such measures. It would mean that the Government could dodge any scrutiny of their ultimate choice. This cannot be allowed to happen, because we would not be able to amend it. We would be able only to accept or reject it in full.
We are gambling with our children’s lives. That is why I strongly believe that the cross-party amendment in the name of the noble Lord, Lord Nash, is the safest, most common-sense option. We must not forget that every single day that we delay, more harm is done to the nation’s children. Do we want that? Their mental and physical well-being are under relentless attack. Let us not delay but act as soon as possible to prevent this attack. I urge the Government to accept this amendment.
My Lords, Motion G2 is in my name. I shall speak also to all the other amendments in this group.
I think we have acknowledged that everybody in this House wishes to protect children, but there is a vast difference of opinion in respect of our approach and the Government’s sense of urgency. If I understood the Minister’s argument in setting out the Government’s position, it was that Ofcom would take responsibility and that it had sufficient powers. Many of us were in this Chamber earlier when the chasm between Ofcom’s powers on paper and its impact on survivors in practice was laid bare. If people do not feel the impact of the law, and if it does not improve the lived experience of children and the ability of parents to get help, the law has failed. This is central to the problem and to the debate that we are having here tonight.
I think the House knows that I prefer to speak not of banning children but of banning poorly designed, unsafe products from having access to our children. That may appear to be a subtle point, but it is hugely important, because access to children must be conditional on treating them fairly and safely. Equally, many of us would like to see age-appropriate services, designed by companies with children in mind, be available to children. Motion G2 sets out that conditionality. Experts and campaigners across the sector contributed to its drafting—in short form, it is what we want from government. Frankly, it is what the Government promised when in opposition.
Since we last debated this issue, barely two months ago, researchers found that AI chatbots are becoming one of the most dangerous technologies for promoting violence against women and girls. The Internet Watch Foundation reported a staggering 26,000% increase last year in the number of AI-generated child sexual abuse materials. Specialist police email me to alert me to offenders using TikTok’s virtual gift system to incentivise children to perform sexual or compromising acts. Alexa+ has arrived in the UK, despite American parents raising their concerns about very young children being lulled into close friendships and about inappropriate language, including it asking to look at a child’s underwear. While we consult, children are harmed in real time. We cannot afford to wait.
(2 years, 11 months ago)
Lords Chamber

My Lords, I speak in support of these amendments with hope in my heart. I thank the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, for leading the charge with such vigour, passion and determination: I am with them all the way.
The Government have said that the purpose of the Bill is to protect children, and it rests on our shoulders to make sure it delivers on this mission. Last week, on the first day in Committee, the Minister said:
“Through their duties of care, all platforms will be required proactively to identify and manage risk factors associated with their services in order to ensure that users do not encounter illegal content and that children are protected from harmful content. To achieve this, they will need to design their services to reduce the risk of harmful content or activity occurring and take swift action if it does.”—[Official Report, 19/4/23; cols. 274-75.]
This is excellent and I thank the Government for saying it. But the full range of harms and risk to children will not be mitigated by services if they do not know what they are expected to risk-assess for and if they must wait for secondary legislation for this guidance.
The comprehensive range of harms children face every day is not reflected in the Bill. This includes sexual content that does not meet the threshold of pornography. This was highlighted recently in an investigation into TikTok by the Telegraph, which found that a 13-year-old boy was recommended a video about the top 10 porn-making countries, and that a 13-year-old girl was shown a livestream of a pornography actor in her underwear answering questions from viewers. This content is being marketed to children without a user even seeking out pornographic content, but this would still be allowed under the Bill.
Furthermore, high-risk challenges, such as the Benadryl and blackout challenges, which encourage dangerous behaviour on TikTok, are not dealt with in the Bill. Some features, such as the ability of children to share their location, are not dealt with either. I declare an interest as vice-president of Barnardo’s, which has highlighted how these features can be exploited by organised criminal gangs that sexually exploit children to keep tabs on them and trap them in a cycle of exploitation.
It cannot be right that the user-empowerment duties in the Bill include a list of harmful content that services must enable adults to toggle off, yet the Government refuse to produce this list for children. Instead, we have to wait for secondary legislation to outline harms to children, causing further delay to the enforcement of services’ safety duties. Perhaps the Minister can explain why this is.
The four Cs framework of harm, as set out in these amendments, is a robust framework that will ensure service risk assessments consider the full range of harms children face. I will repeat it once again: childhood lasts a lifetime, so we cannot fail children any longer. Protections are needed now, not in years to come. We have waited far too long for this. Protections need to be fast-tracked and must be included in the Bill. That is why I fully support these amendments.