Data (Use and Access) Bill [Lords] Debate
I thank the hon. Member for her intervention, and I will shortly come on to the impact of the Supreme Court's decision on all people. Our new clause's focus and scope are simple. The Supreme Court ruling made it clear that public bodies must collect data on biological sex to comply with their duties under the Equality Act. The new clause ensures that this data is recorded and used correctly in accordance with the law. This is about data accuracy, not ideology.
New clause 21 is based in part on the work of Professor Alice Sullivan, who conducted a very important review with deeply concerning findings on inaccurate data collection and the conflation of gender identity with biological sex data. She found people being missed off health screening, risks to research integrity, and inaccurate records in policing and the criminal justice system, among many other concerns. These concerns present risks to everyone, irrespective of biological sex, gender identity or acquired gender. Trans people, like everyone else, need health screening based on their biological sex. Trans people need protecting from sexual predators too, and they have the right to dignity and respect.
The Sullivan report shows beyond doubt that the concerns of the last Government and the current Leader of the Opposition were entirely justified. The Government have had Professor Sullivan’s report since September last year, but the Department for Science, Innovation and Technology has still not made a formal statement about it or addressed the concerns raised, which is even more surprising given its relevance to this Bill. The correction of public authority data on sex is necessary and urgent, but it is made even more critical by the implementation of the digital verification services in the Bill.
I appreciate that the shadow Minister is making an important point on the Sullivan review and the Supreme Court judgment, but there are conversations in Government and with Labour Members to ensure that the Supreme Court judgment and the Sullivan review are implemented properly across all Departments, and I hope to work with the Government on that.
I thank the hon. Member for her intervention, and for all the work that she and colleagues on both sides of the House are doing in this area. I hope that the findings of the Sullivan report are implemented as soon as possible, and part of that implementation would be made possible if Members across the House supported our new clause.
For the digital verification services to be brought in, it is important that the data used to inform them is accurate and correct. Digital verification could be used to access single-sex services, so the data behind it needs to be correct. If sex and gender data are conflated, as we know they are in many datasets, a failure to act will bring in self-ID by the back door. To be clear, that has never been the legal position in the UK, and it would conflict with the ruling of the Supreme Court. Our new clause 21 is simple and straightforward. It is about the accurate collection and use of sex data, and rules to ensure that data is of the right standard when used in digital verification services so that single-sex services are not undermined.
New clause 19 is on the Secretary of State’s duty to review the age of consent for data processing under the UK GDPR. What can or should children be permitted to consent to when using or signing up to online platforms and social media? How do we ensure children are protected, and how do we prevent harms from the use of inappropriate social media itself, separate from the content provided? How do we help our children in a world where social media can import the school, the playground, the changing room, the influencer, the stranger, the groomer, the radical and the hostile state actor all into the family home?
Our children are the first generation growing up in the digital world, and they are exposed to information and weaponised algorithms on a scale that simply did not exist for their parents. In government, we took measures to improve protections and regulate harmful content online, and I am delighted to see those measures now coming into force. However, there is increasing evidence that exposure to inappropriate social media platforms is causing harm, and children as young as 13 may not be able to regulate and process this exposure to such sites in a safe and proportionate way.
I am sure every Member across the House will have been contacted by parents concerned about the impact of social media on their children, and we recognise that this is a challenging area to regulate. How do we define and target risky and inappropriate social media platforms, and ensure that education and health tech—or, indeed, closed direct messaging services—do not fall within scope? How effective are the provisions we already have, and can age verification be made to work for under-16s? What forms of ID are available to use? What will the impact of the Online Safety Act 2023 be now that it is coming into force? What are the lessons from its implementation, and where does it need strengthening? Finally, how do we support parents and teachers in educating and guiding children so that they are prepared to enter the digital world at whatever age they choose and are able to do so?
The Government must take action to ensure appropriate safeguards are in place for our children, not through outright bans or blanket restrictions but with an evidence-based approach that takes into account recent legal changes and the need for effective enforcement, including age verification for under-16s. Too often in this place we focus on making more things illegal rather than on the reasons for the lack of enforcement in the first place. There is no point in imposing restrictions if they cannot be implemented.