Children’s Wellbeing and Schools Bill

Debate between Lord Tarassenko and Baroness Spielman
Thursday 18th September 2025


Lords Chamber
Baroness Spielman (Con)

My Lords, I have listened to a number of noble Lords speak movingly and wisely about the risks, concerns and things we need to guard against in the use of technology. I want to talk about the risk to learning itself. Somebody, whose name I have forgotten, recently wrote an excellent piece that illustrated this risk very vividly.

We all understand that when we send our children to school and when we teach them, the point is not simply for them to be able to say they have done a thing; it is the process they go through that really embeds the learning and enables them to use that knowledge and those skills in future.

We have all seen it in the kinds of problems that have arisen with coursework. If a coursework essay or a homework assignment is produced for a child or university student by AI, then that child or student has not done the thinking, they have not learned what the assignment was set to teach, and the education will not achieve its purpose. There is a real risk at the moment that a lot of education in a lot of places is being quite significantly undermined, because young people do not recognise that they are harming themselves by taking such shortcuts. Perhaps we have all been a little slow to recognise this risk.

There is a helpful distinction to be made here. I recently read a piece which distinguished between cognitive offloading and cognitive bypasses. The use of assistive technology, such as that which the noble Lord, Lord Addington, has referred to on occasion, might be described as cognitive offloading, where the point is to help the child with the additional challenges they are experiencing without losing the point of the lesson or what they are meant to be learning.

If we get to the point where the technology becomes a way of simply bypassing the learning, we are actually destroying education. The enthusiasm for technology has understandably invigorated us all, and there are clearly tremendous opportunities. But that enthusiasm, together with the incredible energy and power of the tech firms, which of course concentrate immense efforts on Ministers to bring their products and services into schools, means that there is a massive job for government to do: to find that balance and to really understand the risks, not just to children's data and well-being but to education itself.

Lord Tarassenko (CB)

My Lords, I support Amendments 493, 494, 502K and 502YI, as someone with an interest in the use of educational technologies, including AI, both in schools and universities. I declare my interest as chair of the Maths Horizons project, funded by XTX Markets, which earlier this year reviewed the maths curriculum in England for ages five to 18 and briefly investigated the use of edtech to support the teaching of the subject.

I speak as a supporter of the deployment of educational technology in the classroom as I believe it can and should have a positive impact on the education of children, and not just in maths. But this must be done within a framework which protects children from its misuse. We must balance innovation in education through edtech with appropriate regulation. The regulations listed in subsection (2) of the proposed new clause in Amendment 493 would support the adoption of edtech in our schools rather than hinder it.

In this context, what has happened with chatbots based on large language models is a salutary example of the early release of AI products without proper safeguards, especially with respect to their use by children. Tragically, this week the parents of the American teenager who recently took his own life after repeatedly sharing his intentions with ChatGPT told a Senate judiciary sub-committee investigating chatbot dangers:

“What began as a homework helper gradually turned itself into a confidant and then a suicide coach”.

Ironically, we are now told that OpenAI is building a ChatGPT for teenagers and plans to use age-prediction technology to help bar children under 18 from the standard version. Sam Altman, the CEO of OpenAI, wrote in a blog this week, just before the Senate hearings and before coming to this country, that AI chatbots are

“a new and powerful technology, and we believe minors need significant protection”.

The risks associated with the use of edtech may not be on the same scale, but they are nevertheless real. In many cases, edtech products used in schools rely extensively on the collection of children’s data, allowing it to be used for commercial and profiling purposes. The recent report from the 5Rights Foundation and the LSE, which has already been mentioned, highlights that some popular classroom AI apps track users with cookies from adult websites and may provide inaccurate and unhelpful information. Most worryingly, a popular app used for educational purposes in the UK generates emulated empathy through sentiment analysis and so increases the likelihood of children forming an emotional attachment to the app. I therefore support Amendments 493, 494 and 502K, which together would ensure that edtech products provide children with the higher standard of protection afforded by the ICO’s age-appropriate design code.

In addition to the safeguards introduced by these amendments, there is a need for research to establish whether educational technologies deliver better educational outcomes for children. Most edtech products lack independent evidence that they lead to improved outcomes. Indeed, some studies have shown that edtech products can promote repetitive or distracting experiences with minimal, if any, learning value. On the positive side, there is a growing body of evidence that edtech can effectively support vocabulary acquisition, grammar learning, and the development of reading and writing skills for students for whom English is a second language, particularly when these tools are used to complement a teacher's instruction.

To establish a causal relationship between the use of an edtech tool and a specific learning outcome, we need to design randomised controlled trials—the gold standard for demonstrating the efficacy of interventions in the social or medical sciences. Longitudinal data will then be needed to track student usage, time on task and completion rates. Crucially, the trial must have enough participants to detect a meaningful effect if one exists. This is unlikely to be possible using the data from a single school, so data from several schools will need to be anonymised and then aggregated to obtain a statistically meaningful result.
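
The sample-size point lends itself to a short worked example. The following Python sketch uses the standard normal approximation for a two-arm trial comparing mean outcomes; the function name pupils_per_arm and the figures chosen (a standardised effect size of 0.2, 5% significance, 80% power) are illustrative assumptions rather than values given in the debate.

import math
from statistics import NormalDist

def pupils_per_arm(effect_size: float, alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate pupils needed in each arm of a two-arm randomised
    controlled trial comparing mean outcomes, via the normal approximation
    n = 2 * (z_{1 - alpha/2} + z_{power})**2 / d**2."""
    z = NormalDist().inv_cdf  # standard normal quantile function
    n = 2 * (z(1 - alpha / 2) + z(power)) ** 2 / effect_size ** 2
    return math.ceil(n)

# Illustrative run: detecting a small standardised effect (d = 0.2) needs
# roughly 393 pupils per arm, nearly 800 in total, which is why anonymised
# data from several schools would have to be pooled before analysis.
print(pupils_per_arm(0.2))  # -> 393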

I am satisfied that Amendments 502K and 502YI would allow this methodological approach to be followed. Indeed, subsection (4)(c) of the proposed new clause in Amendment 502K would ensure that the code of practice enabled the development of standards to certify evidence-based edtech products and support the testing of novel products. This would provide UK-based companies with the opportunity to innovate in edtech within an appropriate regulatory environment.

As English is the lingua franca of the digital world, there is the opportunity for the UK to become a leader in edtech innovation and certification, for the benefit of children not only in the UK but in many other countries. These amendments should be seen by the Department for Education not as an attempt to overregulate the edtech sector but instead as a mechanism for the promotion of existing evidence-based apps and the development of a new generation of products, some of which may be AI-facilitated, using—no pun intended—best-in-class trial methodology.