Data Protection and Digital Information Bill Debate
Baroness Kidron (Crossbench - Life peer)
Lords Chamber

My Lords, I declare my interests set out in full on the register, including as an adviser to the Institute for Ethics in AI at Oxford University, chair of the Digital Futures for Children centre at the LSE and chair of the 5Rights Foundation. I add my welcome to my noble friend Lord de Clifford, whom I had the pleasure of meeting yesterday, and I look forward to his maiden speech.
I start by quoting Marcus Fysh MP, who said in the other place:
“this is such a serious moment in our history as a species. The way that data is handled is now fundamental to basic human rights … I say to those in the other place as well as to those on the Front Benches that … we should think about it incredibly hard. It might seem an esoteric and arcane matter, but it is not. People might not currently be interested in the ins and outs of how AI and data work, but in future you can bet your bottom dollar that AI and data will be interested in them. I urge the Government to work with us to get this right”.—[Official Report, Commons, 29/11/23; col. 878.]
He was not the only one on Report in the other place who was concerned about some of the provisions in the Bill, bemoaned the lack of scrutiny and urged the Government to think again. Nor was he the only one who reluctantly asked noble Lords to send the Bill back to the other place in better shape.
I associate myself with the broader points made by both noble Lords who have already spoken—I do not think I disagreed with a word that they said—but my own comments will primarily focus on the privacy of children, the case for data communities, access for researchers and, indeed, the promises made to bereaved parents and then broken.
During the passage of the Data Protection Act 2018, your Lordships’ House, with cross-party support, introduced the age-appropriate design code, a stand-alone data protection regime for the under-18s. The AADC’s privacy-by-design approach ushered in a wave of design change to benefit children: TikTok and Instagram disabled direct messaging from unknown adults to children; YouTube turned off auto-play; Google turned safe search on by default for children; 18-plus apps were taken out of the Play Store; TikTok stopped notifications through the night; and Roblox stopped tracking and targeting children for advertising. These were just a handful of hundreds of changes to products and services likely to be accessed by children. Many of these changes have been rolled out globally, meaning that, while other jurisdictions cannot police the code, children in those places benefit from it. As the previous Minister, the noble Lord, Lord Parkinson, acknowledged, it contributes to the UK’s reputation for digital regulation and is now being copied around the globe.
I set this out at length because the AADC not only drove design change but also established the crucial link between privacy and safety. This is why it is hugely concerning that children have not been explicitly protected from changes in the Bill that lessen user data protections. I have given Ministers notice that I will seek to enshrine the principle that children have the right to a higher bar of data protection by design and default; to define children’s data as sensitive personal data in the Bill; and to exclude children from proposals that risk eroding the impact of the AADC, notably in risk assessments, automated processing, onward processing, direct marketing and the extended research powers of commercial companies.
Minister Paul Scully said at Second Reading in the other place:
“We are committed to protecting children and young people online … organisations will still have to abide by our Age-appropriate design code”.—[Official Report, Commons, 17/4/23; col. 101.]
I take it from those words that any perceived or actual diminution of children’s data rights is inadvertent, and that it remains the Government’s policy not to weaken the AADC as currently configured in the Bill. Will the Minister confirm that it is indeed the Government’s intention to protect the AADC, and that he is willing to work with me to ensure that that is the outcome?

I will also seek a requirement for the ICO to create a statutory children’s code in relation to AI. The ubiquitous deployment of AI technology to recommend and curate is nothing new, but the rapid advances in generative AI capabilities mark a new stage in its evolution. In the hundreds of pages of the ICO’s non-binding Guidance on AI and Data Protection, its AI and Data Protection Risk Toolkit and its advice to developers on generative AI, there is but one mention of the word “child”—in a case study about child benefit.
The argument made was that children are covered by the AADC, which underlines again just how consequential it is. However, since adults are covered by data law and yet it is considered necessary to have specific AI guidance for them, the one in three users who are under 18 deserve the same consideration. I am not at liberty to say more today, but later this week—perhaps as early as tomorrow—information will emerge that underlines the urgent need for specific consideration of children’s safety in relation to generative models. I hope that the Minister will agree that an AI code for kids is an imperative rather than a nice-to-have.
Similarly, we must deliver data privacy to children in education settings. Given the extraordinary rate at which highly personal data seeps out of schools into the commercial world, including to gambling companies and advertisers, coupled with the scale of tech adoption in schools, it is untenable to continue to treat tech inside school as a problem for schools and tech outside school as a problem for regulators. The spectre of a nursery teacher having enough time and knowledge to interrogate the data protection terms of a singing app, or of the school ICT lead having to tackle global companies such as Google and Microsoft to set the terms of their students’ privacy, is frankly ridiculous, but that is the current reality. Many school leaders feel abandoned by the Government’s insistence that they should be responsible for data protection when both the AADC and the Online Safety Act have been introduced but schools benefit from neither. It should be the role of the ICO to set data standards for edtech and to ensure that providers are held to account if they fall short. As it stands, a child enjoys more protection on the bus to school than in the classroom.
Finally on issues relating to children, I want to raise a technical issue around the production of AI-generated child sexual abuse material. I recognise the Government’s exemplary record on tackling CSAM but, unfortunately, innovation does not stop. While AI-generated child sexual abuse content is firmly in scope of UK law, it appears that models or plug-ins trained on CSAM, or trained to generate it, are not. At least four laws, the earliest from 1978, are routinely used to bring criminal action against CSAM and those who perpetrate it, so I would be grateful if the Minister would agree to explore the issue with the police unit that has raised it with me and to make an explicit commitment to close any gaps identified.
We are at an inflection point and, however esoteric and arcane the issues around data appear to be, to downgrade a child’s privacy even by a small degree has huge implications for their safety, identity and selfhood. If the Government fail to protect and future-proof children’s privacy, they will simply be giving with one hand in the OSA and taking away with the other in this Bill.
Conscious that I have had much to say about children, I will briefly put on the record issues that we can debate at greater length in Committee. While data law largely rests on the assumption of a relationship between an individual and a service, we have seen over a couple of decades that power lies in having access to large datasets. The Bill offers a wonderful opportunity to put that data power in the hands of new entrants to the market, be they businesses or communities, by allowing individuals to share their data rights and to assign them to third parties for agreed purposes. I have been inspired by approaches coming out of academia and the third sector, which have supported the drafting of amendments to find a route that would enable the sharing of data rights.
Similarly, as the noble Lord, Lord Knight, said, we must find a route to access commercial datasets for public interest research. I was concerned that, when the former Secretary of State Jeremy Wright queried in the other place why much-touted research access had not materialised in the Bill, the Minister appeared to suggest that it was already covered. The current drafting embeds the asymmetries of power by allowing companies to access user data, including for marketing and creating new products, but does not extend access for public interest research into the vast databases held by those same companies. There is a feeling of urgency emerging as our academic institutions see their European counterparts gain access to commercial data under the EU’s Digital Services Act, and an increased need for independent research to support our new regulatory regimes such as the Online Safety Act. This is an easy win for the Government and I hope that they grasp it.
Finally, I noted very carefully the words of the Minister when he said, in relation to a coroner’s access to data, that the Secretary of State had made an offer to fill the gap. This is a gap that the Government themselves created. During the passage of the Online Safety Act we agreed to create a humane route to access data when a coroner had reason to suspect that a regulated company might have information relevant to the death of a child. The Government have reneged by narrowing the scope to children who have taken their own lives. Expert legal advice says that there are multiple scenarios under which the Government’s narrowing of the scope creates a gaping hole in provision for the families of murdered children, and introduces uncertainty and delay in cases where it may not be clear at the outset how a child died.
I must ask the Minister what the Government are trying to achieve here and whom they are trying to please. Given the numbers, narrowing the scope is unnecessary, disproportionate and egregiously inhumane. This is about the parents of murdered children. The Government lack compassion; they have created legal uncertainty and have betrayed and re-traumatised a vulnerable group to whom they made promises. As we go through this Bill and the competition Bill, the Minister will at some points wish the House to accept assurances from the Dispatch Box. The Government cannot assure the House until the assurances that they gave to bereaved parents have been fulfilled.
I will stop there, but I urge the Minister to respond to the issues that I have raised rather than leave them for another day. The Bill must uphold our commitment to the privacy and safety of children. It could create an ecosystem of innovative data-led businesses and keep our universities at the forefront of tech development and innovation. It simply must fulfil our promise to families who, this Christmas and every Christmas after, will be missing a child without ever knowing the full circumstances surrounding that child’s death. That is the inhumanity that we in this House promised to stop—and stop it we must.