Lords Chamber

My Lords, I shall speak also to Amendment 494. I am grateful to the noble Baroness, Lady Kidron, who signed my amendment; I will give positive support to her amendment in this group.
Educational technology—edtech—offers extraordinary opportunities for learners right through the school and education experience. In effect, it enables personalised education—for every young person to have a classroom assistant alongside them in technology form. It is an extraordinary upside and transformational, but only if we get right the framework, the construction and the underpinning principles that guide it. If we human-lead with these technologies, we will give ourselves the best opportunity to succeed and to empower all children and young people to succeed in their education journey. If we have a principles-based, outcomes-focused and inputs-understood approach, we enable, we empower and we have a clear understanding of what we require from these edtech solutions.
I turn now to the amendment. All edtech must be inclusive by design; accessible; transparent about the make-up of the technology; labelled, if AI is in the mix; and absolutely crystal clear as to how the data is used, where it is stored and how none of that data—children’s data—gets sold on to any third parties.
The opportunities are extraordinary. It is at least a touch unfortunate that so much of technology in school is being described and seen through the lens of smartphones. It is understandable, because of some of the catastrophic downsides and outcomes we have seen as a consequence, but there is nothing inevitable about that. Edtech, positively deployed, human-led, with human principles and values at its heart, and with the right oversight and approach to data, could enable such a powerful learning experience, primarily for young people and children but also for teachers, classroom assistants and the whole school community.
Amendment 494 is about pulling on the power that we have through procurement. We can achieve so much by understanding how we look at the values and underpinning principles that we put into how we procure. This amendment echoes many of the underpinnings of Amendment 493 in understanding that, if we can get a procurement standard in place, then many of the potential problems and difficulties are dealt with before they even come into being, because of that standard being so well set before any consideration has been given to making a purchase of any edtech.
I look forward to other contributions from noble Lords and the Minister’s response. I beg to move.
My Lords, in speaking to my Amendments 502K, 502YI and 502YH, I also register my support for Amendments 493 and 494 in the name of the noble Lord, Lord Holmes, and, more broadly, to associate myself with everything he has just said. Amendment 502YI calls for a code of practice for education data. I tabled a similar amendment to the Data (Use and Access) Bill earlier this year and was given an assurance from the Minister, the noble Lord, Lord Vallance, who gave me
“a firm commitment … that the Government will use powers under the Data Protection Act 2018 to require the ICO to publish a new code of practice addressing edtech issues”.—[Official Report, 28/1/25; col. 148.]
A letter I received from the department in anticipation of today’s debate suggested that the Government are “reviewing and considering”. I ask the Minister whether we are reviewing and considering the firm commitment that was made nine months ago.
We have been discussing data protection in schools since 2017 and have had multiple promises from both department and regulator that have yet to bear fruit. Yet the Government are pressing ahead to introduce new data-hungry technology into our schools. The misuses of pupils’ data are well evidenced and egregious: some of that data has ended up on adult sites and gambling sites, which is an abuse of children’s privacy.
Pupils are, first and foremost, children. They are not critical sources of data for commercial enterprise. It is beyond time to act. I ask the Minister to accept the amendment so that this Bill is the one that finally sets out the scope and timescale for a data regime that delivers children the protection they deserve when they are at school.
I turn to Amendment 502K. I wish to be very clear that I, too, welcome the potential of technology to contribute to learning and well-being in schools, but while the Secretary of State, Bridget Phillipson, has heralded a
“new technological era to modernise our education system”,
there is as yet no corresponding binding commitment to ensure that the technology being introduced at pace actually works. The Education Endowment Foundation has said that gains are often very small and has warned that edtech may be a “gap-widener” for socioeconomically disadvantaged students. A 2023 DfE survey found that fewer than half of teachers thought that technology improved pupil attainment. UNESCO has referred to the use of edtech as a “tragedy” and described the results of the huge global investment in edtech during the pandemic as “far from clear”.
Grand Committee

My Lords, in speaking to Amendment 137 in my name I thank the noble Baroness, Lady Harding, the noble Lord, Lord Stevenson, and my noble friend Lord Russell for their support. I also add my enthusiastic support to the amendments in the name of my noble friend Lord Colville.
This is the same amendment that I laid to the DPDI Bill, which at the time had the support of the Labour Party. I will not labour that point, but it is consistently disappointing that these things have gone into the “too difficult” box.
Amendment 137 would introduce a code of practice on children and AI. AI drives the recommender systems that determine all aspects of a child’s digital experience, including the videos they watch, their learning opportunities, the people they follow and the products they buy—and, as reported last weekend, AI is even helping farmers pick the ripest tomatoes for baked beans. But it no longer concerns simply the elective parts of life where, arguably, a child or a parent on their behalf can choose to avoid certain products and services. AI is invisibly and ubiquitously present in all areas of their lives, and its advances and impact are particularly evident in the education and health sectors, the first of which is compulsory for children and the second of which is necessary for all of us.
The amendment has three parts. The first requires the ICO to create a code and sets out the expectations of its scope; the second considers who and what should be consulted and considered, including experts, children, and the frameworks that codify children’s existing rights; and the third part defines elements of the process, including risk assessment definitions, and sets out the principles to which the code must adhere.
When we debated this before, I anticipated that the Minister would say that the ICO had already published guidance, that we do not want to exclude children from the benefits of AI, and that we must not get in the way of innovation. Given that the new Government have taken so many cues from the previous one, I am afraid I anticipate a similar response.
I first point out, therefore, that the ICO’s non-binding guidance on AI and data protection is insufficient. It has only a single mention of a child in its 140 pages, which is a case study about child benefits. In the hundreds of pages of guidance, toolkits and sector information, nowhere are the specific needs and rights, or development vulnerabilities, of children comprehensively addressed in relation to AI. This absence of children is also mirrored in government publications on AI. Of course, we all want children to enjoy the benefits of AI, but consideration of their needs would increase the likelihood of those benefits. Moreover, it seems reckless and unprincipled not to protect them from known harms. Surely the last three decades of tech development have shown us that the experiment of a “build first, worry about the kids later—or never” approach has cost our children dearly.
Innovation is welcome but not all innovation is equal. We have bots offering 13 year-olds advice on how to seduce grown men, or encouraging them to take their own lives, edtech products that profile children to unfair and biased outcomes that limit their education and life chances, and we have gen AI that perpetuates negative, racist, misogynist and homophobic stereotypes. Earlier this month, the Guardian reported a deep bias in the AI used by the Department for Work and Pensions. This “hurt first, fix later” approach creates a lack of trust, increases unfairness, and has real-world consequences. Is it too much to insist that we ask better questions of systems that may result in children going hungry?
Why children? I am saddened that I must explain this, but from our deeply upsetting debate last week on the child protection amendments, in which the Government asserted that children are already catered for while deliberately downgrading their protections, it seems that the Government or their advisers have forgotten.
Children are different for three reasons. First, as has been established over decades, children are on a development journey. There are ages and stages at which children are developmentally able to do certain things, such as walk, talk, understand risk and irony and learn different social skills. There are equally ages and stages at which they cannot do those things. The long-established consensus is that families, social groups and society more broadly, including government, step in to support them on this journey. Secondly, children have less voice and less choice about how and where they spend their time, so the places and spaces they inhabit have to be designed to be fit for childhood. Thirdly, we have a responsibility towards children that extends even beyond our responsibility to each other. This means that we cannot legitimise profit at their expense. Allowing systems to play in the wild in the name of growth and innovation, leaving kids to pay the price, is a low bar.
It is worth noting that since we debated it, a proposal for this AI code for children that follows the full life cycle of development, deployment, use and retirement of AI systems has been drafted and has the support of multiple expert organisations and individuals around the globe. I am sure that all nations and intergovernmental organisations will have additional inputs and requirements, but it is worth saying that the proposed code, which was written with input from academics, computer scientists, lawyers, engineers and children’s rights activists, is mindful of and compatible with the EU AI Act, the White House Blueprint for an AI Bill of Rights, the Executive Order on the Safe, Secure and Trustworthy Development and Use of Artificial Intelligence, the Council of Europe’s Framework Convention on Artificial Intelligence and, of course, the UNCRC general comment no. 25.
This proposal will be launched early next year as an indication of what could and should be done. Unless the Government find their compass vis-à-vis children and tech, I suspect that another jurisdiction will adopt it ahead of the UK, making that the go-to destination for trusted tech development for child-safe products. It is perhaps worth reminding the Committee that one in three connected people is under 18, which is roughly 1 billion children. As the demographics change, the proportion and number of children will rise. It is a huge financial market.
Before I sit down, I shall briefly talk about the AADC, because sometimes Ministers say that we already have a children’s code. The age-appropriate design code covers only information society services (ISS), which automatically limits it, and even the ICO now agrees that its enforcement record is neither extensive nor impressive. It does not clearly cover the urgent area of edtech, which is the subject of another amendment, and, most pertinently to this amendment, it addresses AI profiling only, which means that it is limited in how it can look at the new and emerging challenges of generative AI. A revamp of the AADC to tackle the barriers to enforcement, account for technological advances, cover all products and services likely to be accessed by children and make our data regime AI-sensitive would be welcome. But rather than calling for a strengthening of the AADC, the ICO agreed to the downgrading of children’s data protection in the DPDI Bill and has again agreed to the downgrading of protections in the current Bill on ADM, scientific research, onward processing and so on. A stand-alone code for AI development is required, because only in this way can we be sure that children are in the minds of developers at the outset.
It is disappointing that the UK is failing to claim its place as the centre of regulated and trusted innovation. Although we are promised an AI Bill, the Government repeatedly talk of large frontier companies. AI is in every part of a child’s life from the news they read to the prices they pay for travel and goods. It is clear from previous groups that many colleagues feel that a data Bill with no AI provisions is dangerous commercially and for the communities of the UK. An AI Bill with no consideration of the daily impact on children may be a very poor next choice. Will the Minister say why a Labour Government are willing to abandon children to technology rather than building technology that anticipates children’s rights and needs?
My Lords, it is a pleasure to follow my friend the noble Baroness, Lady Kidron, and to give full-throated support to my friend the noble Viscount, Lord Colville, on all his amendments. Given that the noble Baroness mentioned it and that another week has passed since we asked the Minister the question, will we see an AI Bill or a consultation before Santa comes or at some stage in the new year? I support all the amendments in this group and in doing so, as it is the first time I have spoken today in Committee, I declare my technology interests as set out in the register, not least as an adviser to Socially Recruited, an AI business.
I will speak particularly to my Amendment 211A. I have put down “image, likeness and personality” not because I believe they are the most important rights being transgressed, nor the most important rights we should consider; I have put them down to give them specific focus because, right now, they are being largely cut across and ignored, so that all our creatives find their works, as well as their image, likeness and personality, disappearing into these large foundation AI models with no potential for redress.
Once parts of you, such as your name, face or voice, have been ingested, as the noble Lord, Lord Clement-Jones, said in the previous group, it is difficult then to have them extracted from the model. There is no sense, for example, of seeking an equitable remedy to restore one to the position one would have been in had the breach not occurred. It is almost “once in, forever in”; then works start to be created based on those factors, features and likenesses, which compete directly with the creatives. This is already particularly prevalent in the music industry.
What plans do the Government have in terms of personality rights, image and likeness? Are they content with the current situation where there is no protection for our great creatives, not least in the music industry? What does the Bill do for our creatives? I go back to the point made by the noble Baroness, Lady Kidron. How can we have all these debates on a data Bill which is silent when it comes to AI, and a product regulation Bill where AI is specifically excluded, and yet have no AI Bill on the near horizon—unless the Minister can give us some up-to-date information this afternoon? I look forward to hearing from her.