Age Assurance (Minimum Standards) Bill [HL] Debate
Lords Chamber

My Lords, I start by acknowledging the many colleagues who were unable to speak today but who wrote to offer their support and I thank those who are present. I know that many noble Lords have made a considerable effort to be here and I look forward to their contributions.
In a moment, I will set out the Age Assurance (Minimum Standards) Bill, what it does and why it is so urgently needed, but before that I wish to say why I am here. In doing so, I declare my interests as set out in the register, particularly as chair of the 5Rights Foundation, which works to build the digital world that children deserve, and most recently as a member of the advisory council of the University of Oxford’s Institute for Ethics in AI and of the Joint Committee on the Draft Online Safety Bill. My work means that, day in, day out, I see children being made the collateral damage of tech industry norms.
During the first lockdown, I was asked to visit a distraught head teacher whose pupil had received and shared a video of child sexual abuse so grotesque that I cannot describe it here. By the time we sat helplessly crying in the freezing playground, the video had been shared across London and was making its way up north. It was a primary school; the children involved were not even 10. I was with a child at the very moment it dawned on her that she had been groomed. Her so-called friend for whom she had performed intimate acts had filmed and shared videos of her with a network of adults thousands of miles away. Her spirit shattered in front of me.
Earlier this year, 5Rights published research showing that accounts registered as children were being targeted with material that no business should recommend to a child: emaciated bodies, violent misogynistic pornography, razor blades and gaping wounds, and even a message saying, “End it all”—and, sadly, some do. My inbox is a testimony to the grief and rage of bereaved parents who do not accept these norms that we are willing to tolerate or even justify as a cost of connectivity and innovation.
I am adamant that we do not use age assurance to dumb down the internet, invade privacy or lock children out of the digital world—it is essential for their growth and participation in our collective future. But it is failure writ large that children are routinely exposed to material and experiences that they do not choose or have the capacity to navigate. It is in the name of these children and their parents that I am here.
Age assurance is a misunderstood term. For the record, it is any system that purports to estimate or verify the age or age range of a user. The Bill is extremely narrow. It requires Ofcom to produce a code of conduct that sets out minimum standards for any system of age assurance. These are not technical standards. The Bill is technology-neutral but requires all services that provide or use age assurance to ensure that it is privacy preserving, proportionate and rights respecting. Specifically, it stipulates that age assurance be effective. Ofcom figures show that almost half of children in the UK between the ages of five and 12 are on social media, despite most sites having a minimum joining age of 13.
The Bill will ensure that any age-assurance system protects the privacy of users by taking no more information than is needed to establish age and not using that information for any other purpose, and that age assurance will be secure. If it is to be trusted, the storage, traceability and disposal of data must be subject to transparent and measurable standards. It provides that age assurance is proportionate to risk. It would be foolish to require a child to present their passport to explore the world of Peppa Pig, but it remains a travesty that 80% of pornography sites have no form of age barrier at all—not even a tick box.
The Bill will ensure that age assurance is appropriate to the capacity and age of the child, anticipating that some children will lie about their age. Equally, it will provide appropriate mechanisms for users to challenge decisions. If a child’s age is wrongly determined or an adult is mistaken for a child, there must be robust routes to redress. The Bill demands that age assurance be compatible with data legislation. I have spent the last four years working to ensure that the age-appropriate design code offers children a high bar of data protection in the certain knowledge that data protection makes children safer. That is why privacy is at the heart of this Bill. But as technology changes and we enter a world of virtual and alternate realities, as envisaged by Facebook’s punt on the metaverse, data law will change. Any regulation must keep one eye on the future.
Let me be utterly clear: age assurance is not a silver bullet that will fix all the ills of the digital world, but without it we cannot deliver the promises and protections that we have already made to children, neither to those under age nor to those between 13 and 17 who are so poorly protected in a digital world that treats over-13s as adults.
Nor is this a choice between user privacy and child safety. The sector’s enormous wealth is predicated on having a detailed knowledge of its users. As one child said to me, “How do they know I like red Nike trainers, but they don’t know I’m 12?” It is convenient to know that a child likes Nike trainers, because that drives advertising revenue. However, it is inconvenient to know that he is 12, because if you know that, why on earth is your algorithm recommending dating apps or extreme content, or sending him messages that suggest that he kill himself?
The Bill does not prescribe the technology that companies should use. AI, image or speech analysis, parental controls, cross-account authentication, know-your-customer checks, capacity testing and age tokens from trusted partners all have a place. What we do not have is rules of the road, resulting in every provider, business or social media company making them up to suit itself.
When the Minister stands up, I anticipate that he will say that the department is working on a voluntary standard for age-assurance providers, but a voluntary standard needs volunteers. It will simply make the good guys better and leave the bad guys as they are. He may also be tempted to suggest that the online safety Bill will deal with this. I have read the Bill many times and there is no mandatory code of conduct for age assurance. Even if Parliament insists, which I believe it will, the earliest that a code could be operational by this route is 2024. The Digital Economy Act brought age verification for commercial pornography into law in 2017, but it has never been implemented. A child who was 11 in 2017 will be an adult by 2024.
Perhaps the Minister will say that this modest Bill is too broad as it touches on privacy, which is the domain of the ICO. This profoundly misunderstands the construction of the digital world. Data is the engine of the attention economy. Not only age assurance, but many of the codes in the online safety Bill, will require co-regulation, or they will fail.
I thank the noble Lord the Minister, the Minister for Technology and Innovation and the Secretary of State for their time. I wish to make it clear that I do not doubt that we all agree on the destination. But this is an issue that has concerned us for a decade. Legislation has been in place for four years and promises have been made by six Secretaries of State. Meanwhile, every day, children pay the price of a wilfully careless industry, sometimes with their lives.
If, when the Minister answers, he is able to give the Government’s full support, he has my deepest apologies for anticipating his words wrongly. However, on the basis that he will not, I ask him to answer this challenge. Which one of the 50% of 10-year-olds currently on TikTok, approximately 400,000 children, does not deserve immediate protection? Which one of the children receiving child sexual abuse material so horrific that it cannot be forgotten or unseen in a lifetime does not deserve immediate protection? How many children does he anticipate will be radicalised, grow into violent sexual norms, lose sleep or confidence or be pushed into cycles of self-harm and depression during the two years that the Government are willing to delay, when Ofcom has the staff and money, and could have the mandate, right now? What would he say to the parents of Molly and Frankie and other parents who have lost children about putting at risk even one more child who could be given a service suitable for their age? Which Facebook whistleblower revelations or headlines about Instagram, OnlyFans, YouTube, TikTok and Omegle—to name just a few—have given the Government confidence that industry will meanwhile do the right thing?
I ask the House not to amend the Bill, in order to give mandatory privacy-preserving, trusted age assurance a swift passage. I say to the Government: this Bill deserves more than their sympathy; it deserves their support. I beg to move.
My Lords, I respectfully remind noble Lords of the advisory speaking time of six minutes. The House has a lot of business today. Thank you.