Children’s Wellbeing and Schools Bill

Lord Tarassenko Excerpts
Tuesday 3rd February 2026


Lords Chamber
Moved by
227: After Clause 63, insert the following new Clause—
“Register of software tools permitted in schools
(1) Within six months of the day on which this Act is passed, the Secretary of State must prepare a register of software tools, including websites, which may be used by pupils for educational purposes in schools.
(2) For their software to be listed on the register, a provider must—
(a) ensure their software adheres to standards set out in—
(i) the Age Appropriate Design Code,
(ii) the Online Safety Act 2023,
(iii) the ‘curriculum principles’ section of the final report of the 2025 Curriculum and Assessment Review, and
(iv) any other standards of privacy or online safety which apply to educational settings, and
(b) provide a helpline or adequate system for reporting any hazards, privacy breaches, or safety failures.
(3) In establishing that the software meets the standards set out in subsection (2)(a), the Secretary of State must consult with—
(a) experts in data protection and online safety,
(b) educators,
(c) curriculum and school representatives, and
(d) any other parties the Secretary of State deems relevant.
(4) Software tools included on the register must be whitelisted by school network firewall systems.”
Member’s explanatory statement
This amendment seeks to ensure a minimum level of access to websites for students to learn about computer science and AI as part of their school education, by requiring the Secretary of State to prepare a register of suitable software tools which must be whitelisted - and therefore remain accessible - by school firewall systems.
Lord Tarassenko (CB)

My Lords, Amendment 227 is in my name and that of the noble Baroness, Lady Kidron. We started with AI during Oral Questions in what is now yesterday afternoon. We considered the use of AI in the debate on Amendment 209 yesterday evening. We are now back with AI within edtech. Amendment 227 is about ensuring that a minimum level of provision of software tools, including websites, is available to every pupil in England, regardless of the school they attend.

Over the last six months, I have worked with Professor Peyton Jones from the University of Cambridge and the Raspberry Pi Foundation to develop proposals for a level 3 qualification in data science and AI. This is being done in consultation with the relevant team in the Department for Education.

Importantly, this level 3 qualification would not be just for those sixth-formers who will go on to read computer science at university but, first and foremost, for the professionals of the future, such as lawyers, economists and doctors. The aim is to give those pupils in the final two years of school sufficient knowledge and experience of up-to-date AI to enable them to use it properly in their time at university and at the start of their professional careers.

If the UK is to have a workforce ready to take advantage of the opportunities that AI offers, AI education needs to begin at school. I know that His Majesty’s Government recognise this. They have just published a set of standards which generative AI products should meet to be considered safe for users in educational settings. However, these are intended mainly for edtech developers and suppliers to schools and colleges, not schoolteachers and administrators.

During a workshop organised by the Raspberry Pi Foundation last November, I met teachers from all types of schools who were keen to learn more about a level 3 qualification in data science and AI. I soon discovered that IT departments in many schools today have a strict, if misguided, interpretation of the Online Safety Act. As far as they are concerned, the safest way to prevent pupils accessing harmful or inappropriate material while on school premises is to bar them from accessing any website, even, and especially, OpenAI’s. There are other schools, of course, where the staff in the IT department operate a more nuanced firewall policy.

This amendment seeks to ensure that there is an irreducible minimum set of software tools, including websites, which every pupil in any school in England will be able to access during the school day. Pupils should be prevented from accessing websites which may lead to harm, but they should instead have access to websites with strong educational missions; for example, Code.org or MathsWatch. These would be included in a register of software tools permitted in schools and whitelisted by the school network firewall system.
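To illustrate the intended mechanics only: the amendment specifies no implementation, but the interaction between the national register, a school’s own additions and its blocklist might look something like the minimal sketch below. Every name and entry in it is an assumption made for illustration, not anything drawn from the amendment itself.

```python
# Purely illustrative sketch: how a school web filter might consult a
# national register of permitted tools before applying its own rules.
# The register format, example entries and function name are all
# assumptions for illustration; the amendment specifies no implementation.

NATIONAL_REGISTER = {
    # hostname -> provider's reporting contact (subsection (2)(b));
    # both entries here are invented examples
    "code.org": "support@code.org",
    "mathswatch.co.uk": "helpline@mathswatch.co.uk",
}

SCHOOL_ADDITIONS = {"raspberrypi.org"}   # a school's own extensions
SCHOOL_BLOCKLIST = {"harmful.example"}   # anything else the school blocks

def is_allowed(hostname: str) -> bool:
    """Registered tools form the irreducible minimum and are always
    reachable; schools may allow more, and block anything else."""
    if hostname in NATIONAL_REGISTER:
        return True                      # register entries cannot be blocked
    if hostname in SCHOOL_BLOCKLIST:
        return False
    return hostname in SCHOOL_ADDITIONS

print(is_allowed("code.org"))            # True: on the national register
print(is_allowed("harmful.example"))     # False: blocked by the school
```

The design point is the first branch: a school’s local blocklist can never override the national register, which is what makes the minimum set irreducible.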

Schools would be free to add other websites if they wished to do so, but the amendment would ensure that all pupils in England had access to a minimum set of whitelisted software tools, enabling them to learn about data science and AI as part of their school education. I beg to move.

Baroness Kidron (CB)

My Lords, Amendments 238 to 240 are in my name and those of my noble friends Lady Cass and Lord Russell. I support Amendment 227 in the name of my noble friend Lord Tarassenko. I start by thanking the Minister and her officials for the engagement that we have had since Committee. These amendments, unlike in the previous grouping, are all about a single thing: the uses of technology in our schools. I feel that they are long overdue; we have seen many of them before in our deliberations on the Data (Use and Access) Bill, as well as earlier in this Bill.

Less than a fortnight ago, the Secretary of State delivered a speech in which she said that we are in the middle of a technology revolution in education and that technology is moving so quickly that:

“The world of even 5 years ago is gone forever—already a lost, obsolete age”.

We are in a time of change, but I am very concerned that this uncritical view of tech is difficult for schools. The Secretary of State is dismissing long-standing educational practices, honed by experience and research, in favour of technology, some of which is proven to be unsafe and to invade privacy, and much of which has yet to be tested.

I will go through the amendments quickly. Amendment 238 would require the Secretary of State to prepare a statutory code of practice on the efficacy of educational technology within 18 months of the Act’s passing, and a certification scheme for minimum pedagogical standards for edtech procurement in schools. In December, the Minister wrote to me to say that the Government were developing a new approach to certify edtech products to make certain that they are safe and fit for purpose, through an accreditation service and statutory guidance. It seemed from the letter that she was referring to filtering and monitoring, which I will come to, but I would be grateful if she would clarify that when she responds.

The problem is that the process by which we are interrogating edtech is far slower than the process by which we are introducing it into our schools. Although I welcome the idea that the Government will test novel products and consult a wide group of people, unless I am mistaken, the regime does not offer a certification scheme that guarantees the learning outcomes of edtech.

It is for that reason that I also support my noble friend Lord Tarassenko’s Amendment 227. He and I have worked on a number of issues that seek to apply existing rules to technology to ensure that those who develop it consider the needs of individuals and communities into which it is deployed. Given that my noble friend has given a detailed explanation of his whitelist amendment, I will not reiterate it now, but I commend this amendment to the Government, because it is a model for how we should deal with edtech more broadly: insist on existing standards, make adherence visible and, in doing so, make a well-designed, private, positive use case for tech in schools. Without the existing standards, we cannot see what the edtech is doing.

Amendment 239 requires the Government to set statutory standards for filtering and monitoring systems used in schools. This amendment is marginally different from the one that I tabled in Committee: it clarifies adherence to data collection standards, ensures that nothing in the systems prevents staff from carrying out their safeguarding duties, and provides that the standards would be checked with real-time tests, established through a certification scheme with which Ofsted would check that schools complied.

I have been pressing this issue for over five years and yet we have failed to solve the problem. The introduction of generative AI means that we are going backwards and I believe that the Government have turned to guidance again: they have updated their filtering and monitoring standards only this month. I am pleased to see that that guidance now clarifies that barriers to illegal content must be switched on at all times and I believe that the Minister will also commit to consultation.

However, experts at the UK Safer Internet Centre suggest that seven of the 24 filtering and monitoring systems used in the UK do not currently meet the standards for filtering illegal content, and only three currently provide clear evidence that they can analyse and block generative AI content in real time, as the new standards require. The same experts say that market compliance is uneven, that schools are dependent on providers’ self-assessments and that there is a serious gap between policy intent and consistent implementation. We need to remove the inconsistency, meet basic safety requirements and insist that they are routinely checked. It is not right that schools are left with the burden of working out what the system they have paid for does or does not do. I understand that many school leaders believe they comply with filtering and monitoring standards but do not. I worry that the Government are overestimating compliance overall.

It is a tragedy that we are discussing this at midnight. This amendment should have been put in front of the House. I remind noble Lords who are in the Chamber or reading this in Hansard that Frankie Thomas lost her life, and her parents, who campaigned fiercely for these amendments, have for five years been told by Minister after Minister that this would be put right, and it still has not been. I ask the Minister to give me some hope that this will be put right in statute at the basic level we require and that experts are asking for. Obviously, there will be no vote this evening.

Finally, Amendment 240 would require the ICO to issue a code of practice for educational settings. On Report of the Data (Use and Access) Bill, the then Minister, the noble Lord, Lord Vallance, gave firm commitments that the Government would use their powers to require the ICO to publish a new code of practice. In Committee of this Bill, the Minister said the ICO was under a commitment to produce an edtech code of practice, but the Minister’s letter to me of 16 December said the Government will lay regulations in the second half of 2026 requiring the ICO to begin work on the edtech code. This is political snakes and ladders. I am back at the beginning. In the old world—which is gone for ever and obsolete—it was not possible for every movement, emotion and learning outcome of a child to be taken by a commercial company from school and pushed into the commercial world to be exploited.

Amendment 240—which I have been promised twice by two different Ministers—would set a clear time limit of six months after the Act’s passing within which an ICO code of practice for education must be established. As set out in the Minister’s letter, it will be more than 18 months from when Ministers first committed to the code before work on it even begins. Can she speed that promise right up?

Each of these amendments asks the Government to set the standards so that tech can do the technology, the teachers can do the teaching and the children can flourish. Anything less is putting big tech ahead of children.

--- Later in debate ---
Baroness Smith of Malvern (Lab)

In those cases, I would expect every school thinking about its homework policy to have engaged with parents on the details of how that homework policy was going to work, but I think what was proposed by the noble Baroness in this amendment would limit the ability of schools to have those conversations and to make the decisions that were appropriate for them. It is on that basis that we are resisting it.

Lord Tarassenko (CB)

Very briefly, given the time, I just want to reassure noble Lords, particularly the noble Lord, Lord Storey, that none of the amendments—not just mine—stops the use of edtech; they introduce rules for its development and introduction into schools. For example, the whitelist is an irreducible minimum, ensuring that all students in schools in England would have access to that minimum set of tools. Schools would, of course, be entirely free to add to the whitelist any appropriate websites that they felt would help the educational attainment of their children. So it is not about stopping edtech but about enabling it.

Children’s Wellbeing and Schools Bill

Lord Tarassenko Excerpts
Wednesday 21st January 2026


Lords Chamber
Baroness Bennett of Manor Castle (GP)

My Lords, I have two original points to make that have not been covered at all. We should bring ourselves back to the fact that there is an enormous amount of agreement around this Chamber. I think everyone will say that we feel enormous sympathy for the families, some of whom are here today, who have lost family members as a result of contact with social media. We all accept that we want 16 year-olds, on the day of their birthday, to be able to stride out into the world confident, capable and ready to step into adulthood. Most of us want to rein in the overwhelmingly powerful digital companies which have been allowed to run wild across the world by political decisions made by adults. I particularly commend the right reverend Prelate for naming the spectre in the room—Donald Trump and his tech bro friends. He is a spectre here and is now recorded in Hansard.

I say to the noble Baroness, Lady Kidron, that we have very broad agreement that the Online Safety Act has been a total failure and that Ofcom is not delivering what it should be. Those are the points of agreement. My conclusions drive me to back Amendment 91 from the noble Baroness, Lady Penn, with some caveats, to which I will return, but it is not my intention to vote for any of the ban amendments before us today. I have a great deal of sympathy with the Lib Dems’ brave effort to find a middle road, and the noble Baroness, Lady Kidron, almost swayed me that we should make a gesture. The case I put, argument one, is that your Lordships’ House is not the right place: we are not the right people to be making this decision. Many of us have joined since the depths of Covid, but those who were here then will remember when the House went largely remote and lots of people who had never used a computer before were suddenly on Zoom. We met their grandkids: “There you are, Granny. You are off mute now”.

I invite your Lordships to look at the people around you. We are extraordinarily unrepresentative of the country in many ways, but particularly in terms of age. This is where I draw on the argument made by the noble Lord, Lord Russell, but come to a different conclusion. I was also in the learning centre and spoke to some of the same pupils. They overwhelmingly said, “We do not want a ban”. My argument is that we must stop doing politics to young people. We must give young people agency and a sense of control. We have bequeathed to them a disastrous, damaging world; failing to give them a say in this is absolutely the wrong way forward.

On that point, I have a serious proposal for the Minister. In the consultation, are the Government prepared to include a people’s assembly that represents young people from around the country? Rather than just asking young people to tick a box in a survey—we all know what happens with “yes” or “no” votes—this would give them the chance to deliberate on how they think we can control the future and improve their situation.

My second point is important and has not been made before. In this debate we have heard a huge amount of scapegoating of social media. Social media is a mirror: it reflects the misogyny, violence, racism and fake news that run across and through our society; it does not create them. If we could wave a magic wand and get young people off social media, they would still be affected by the dreadful levels of poverty and the schools that operate as exam factories, putting them under tremendous pressure and subjecting them to unbearable discipline. They would still have parents who are struggling to put food on the table and keep a roof over their heads. They would still encounter all the misogyny and racism in our society. When we are debating and voting on this, we must understand that social media is a mirror; it is not creating where we are now.

Lord Tarassenko (CB)

My Lords, I will speak briefly about the lack of scientific evidence for Amendment 94A in the name of the noble Lord, Lord Nash. No one disputes that rates of suicide, depression, anxiety and self-harm have increased among teenagers in the last decade. However, the question before us is whether a social media ban for under-16s would decrease those rates.

I know that this has been raised by the noble Baroness, Lady Cass, but I still believe that evidence from randomised controlled trials is important, even in this context. There have been no randomised controlled trials of social media bans or restrictions for healthy under-18s. The lack of experimental evidence in adolescent populations may be because it is difficult to experimentally manipulate social media use in this age group. There was one RCT of 220 adolescents and young adults aged 17 to 25 with pre-existing emotional distress, who were asked to reduce their social media use to one hour a day for three weeks. However, the participants selected were all experiencing at least two of four symptoms of depression and anxiety, and should therefore be classified as a clinical sample, not a representative sample of the general population.

There is an RCT of adolescent participants from which we can learn, even though it has not yet started. It is funded by the Wellcome Trust and will take place in Bradford, with adolescents between the ages of 13 and 16. The intervention will not be a ban; it will involve a smartphone app that, importantly, limits the use of social media apps through a co-produced combination of a daily budget of one hour and a night-time curfew between 9 pm and 7 am.
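For illustration only, the two rules just described, a one-hour daily budget and a 9 pm to 7 am curfew, might be expressed as in the sketch below. Every name in it is an invention for the purpose of illustration; it is not the Bradford trial’s design or code.

```python
# Minimal sketch of the two co-produced rules described above: a daily
# budget of one hour and a night-time curfew from 9 pm to 7 am. All
# names and values here are illustrative assumptions only.
from datetime import datetime, time, timedelta

DAILY_BUDGET = timedelta(hours=1)
CURFEW_START = time(21, 0)   # 9 pm
CURFEW_END = time(7, 0)      # 7 am

def social_media_allowed(now: datetime, used_today: timedelta) -> bool:
    """True if a social media app may be opened at `now`, given the
    time already spent on such apps since midnight."""
    in_curfew = now.time() >= CURFEW_START or now.time() < CURFEW_END
    return not in_curfew and used_today < DAILY_BUDGET

# Blocked at 21:30 regardless of remaining budget; allowed at 16:00.
print(social_media_allowed(datetime(2026, 1, 21, 21, 30), timedelta(minutes=10)))  # False
print(social_media_allowed(datetime(2026, 1, 21, 16, 0), timedelta(minutes=10)))   # True
```

The point of combining the two rules is that neither alone suffices: the budget caps total daytime use, while the curfew protects sleep even when budget remains.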

The co-production of the trial is very important. We need to hear the voice of young people when designing these interventions. They themselves are very concerned by the negative impacts of social media. Perhaps not surprisingly, the feedback from the teenagers in Bradford schools was that they would be against a ban, but they would be willing to accept significant time limits on their use of social media.

Children’s Wellbeing and Schools Bill

Lord Tarassenko Excerpts
Thursday 18th September 2025


Lords Chamber
Baroness Spielman (Con)

My Lords, I have listened to a number of Lords speak movingly and wisely about the risks, concerns and things we need to guard against in the use of technology. I want to talk about the risk to learning itself. I have forgotten their name, but somebody recently wrote an excellent piece that illustrated this very vividly.

We all understand that when we send our children to school and when we teach them, the point is not simply for them to have a thing they can say they have done; it is the process that they go through that really embeds it and enables them to use that knowledge and those skills in future.

We have all seen it in the kinds of problems that have arisen with coursework. If a coursework essay or a homework assignment is produced for a child or university student by AI, then that child or student has not done the thinking, they have not learned what the assignment was set for, and the education will not achieve its purpose. There is a real risk at the moment that a lot of education in a lot of places is being quite significantly undermined because young people do not recognise that they are harming themselves by taking the shortcuts. Perhaps we have all been a little bit slow to recognise this risk.

There is a helpful distinction to be made here. I recently read a piece which distinguished between cognitive offloading and cognitive bypasses. The use of assistive technology, such as that which the noble Lord, Lord Addington, has referred to on occasion, might be described as cognitive offloading, where the point is to help the child with the additional challenges they are experiencing without losing the point of the lesson or what they are meant to be learning.

If we get to the point where the technology becomes a way of simply bypassing the learning, we are actually destroying education. The enthusiasm for technology—which has understandably invigorated us all; there are clearly tremendous opportunities—and the incredible energy and power of the tech firms, which of course concentrate immense efforts on Ministers to bring their products and services into schools, mean that there is a massive job for government to do to find that balance and to really understand the risks, not just to children’s data and well-being but to education itself.

Lord Tarassenko (CB)

My Lords, I support Amendments 493, 494, 502K and 502YI, as someone with an interest in the use of educational technologies, including AI, both in schools and universities. I declare my interest as chair of the Maths Horizons project, funded by XTX Markets, which earlier this year reviewed the maths curriculum in England from five to 18, and briefly investigated the use of edtech to support the teaching of the subject.

I speak as a supporter of the deployment of educational technology in the classroom as I believe it can and should have a positive impact on the education of children, and not just in maths. But this must be done within a framework which protects children from its misuse. We must balance innovation in education through edtech with appropriate regulation. The regulations listed in subsection (2) of the proposed new clause in Amendment 493 would support the adoption of edtech in our schools rather than hinder it.

In this context, what has happened with chatbots based on large language models is a salutary example of the early release of AI products without proper safeguards, especially with respect to their use by children. Tragically, this week the parents of the American teenager who recently took his own life after repeatedly sharing his intentions with ChatGPT told a Senate judiciary sub-committee investigating chatbot dangers:

“What began as a homework helper gradually turned itself into a confidant and then a suicide coach”.


Ironically, we are now told that OpenAI is building a ChatGPT for teenagers and plans to use age-prediction technology to help bar children under 18 from the standard version. Sam Altman, the CEO of OpenAI, wrote in a blog this week, just before the Senate hearings—and before then coming to this country—that AI chatbots are

“a new and powerful technology, and we believe minors need significant protection”.

The risks associated with the use of edtech may not be on the same scale, but they are nevertheless real. In many cases, edtech products used in schools rely extensively on the collection of children’s data, allowing it to be used for commercial and profiling purposes. The recent report from the 5Rights Foundation and the LSE, which has already been mentioned, highlights that some popular classroom AI apps track users with cookies from adult websites and may provide inaccurate and unhelpful information. Most worryingly, a popular app used for educational purposes in the UK generates emulated empathy through sentiment analysis and so increases the likelihood of children forming an emotional attachment to the app. I therefore support Amendments 493, 494 and 502K, which together would ensure that edtech products provide children with the higher standard of protection afforded by the ICO’s age-appropriate design code.

In addition to the safeguards introduced by these amendments, there is a need for research to establish whether educational technologies deliver better educational outcomes for children. Most edtech products lack independent evidence that they lead to improved outcomes. Indeed, some studies have shown that edtech products can promote repetitive or distracting experiences with minimal, if any, learning value. By contrast, on the positive side, there is a growing body of evidence that edtech can effectively support vocabulary acquisition, grammar learning, and the development of reading and writing skills for students for whom English is a second language, particularly when these tools are used to complement a teacher’s instruction.

To establish a causal relationship between the use of an edtech tool and a specific learning outcome, we need to design randomised controlled trials—the gold standard for demonstrating the efficacy of interventions in the social or medical sciences. Longitudinal data will then be needed to track student usage, time on task and completion rates. Crucially, the trial must have enough participants to detect a meaningful effect if one exists. This is unlikely to be possible using the data from a single school, so data from several schools will need to be anonymised and then aggregated to obtain a statistically meaningful result.
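To make that requirement concrete: the question of “enough participants” is a standard statistical power calculation. The sketch below assumes conventional, purely illustrative design choices (a small standardised effect of d = 0.2, 5% significance, 80% power); none of these figures comes from any study mentioned in this debate.

```python
# Illustrative power calculation for an edtech trial: how many pupils
# per arm are needed to detect a given effect? The effect size, alpha
# and power below are conventional illustrative choices, not figures
# from any study cited here.
from statsmodels.stats.power import TTestIndPower

n_per_arm = TTestIndPower().solve_power(
    effect_size=0.2,   # Cohen's d: a 'small' standardised learning gain
    alpha=0.05,        # 5% significance level (two-sided by default)
    power=0.8,         # 80% chance of detecting the effect if it exists
)
print(f"pupils needed per arm: {n_per_arm:.0f}")  # roughly 394, so ~790 in total
```

At roughly 400 pupils per arm to detect even a small effect, a single school cohort will rarely suffice, which is precisely why anonymised data would need to be aggregated across several schools.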

I am satisfied that Amendments 502K and 502YI would allow this methodological approach to be followed. Indeed, subsection (4)(c) of the proposed new clause in Amendment 502K would ensure that the code of practice enabled the development of standards to certify evidence-based edtech products and support the testing of novel products. This would provide UK-based companies with the opportunity to innovate in edtech within an appropriate regulatory environment.

As English is the lingua franca of the digital world, there is the opportunity for the UK to become a leader in edtech innovation and certification, for the benefit of children not only in the UK but in many other countries. These amendments should be seen by the Department for Education not as an attempt to overregulate the edtech sector but instead as a mechanism for the promotion of existing evidence-based apps and the development of a new generation of products, some of which may be AI-facilitated, using—no pun intended—best-in-class trial methodology.

Baroness Harding of Winscombe (Con)

My Lords, I too support Amendments 493 and 494 in the name of my noble friend Lord Holmes, and Amendments 502K and 502YI in the name of the noble Baroness, Lady Kidron. I am not an educationalist and this is my first contribution on the Bill. I spend my time in this House focused mainly on digital issues, hence my interest in these amendments.

Like others today, I will start by being really clear that I am not anti-technology in education—quite the opposite. I see the huge potential that digital technology can bring in all sectors of our lives. That potential is particularly clear today, as our Prime Minister signs the tech prosperity deal. We should be open-eyed that technology brings the opportunity for prosperity; I am not anti it at all. But it is also really clear that technology, not just digital but all technologies for evermore, needs guardrails, and those guardrails cannot be self-imposed.

Among those of us who have worked on child safety online for the past 10 or 15 years, many on this side of the House began firmly believing that self-regulation was the answer. I am afraid that we have been proven absolutely wrong. There is no doubt that self-regulation in social media has been a disaster, and I fear that we are doing exactly the same with digital technology in education. Companies operating in this space need guardrails in order to develop the products that really will make a positive difference and to help us all mitigate the downsides that these technologies inherently have.

I am not a lover of adding regulation, so for each example of added regulation in the digital space I ask myself a simple question: is this additional regulation an example of the Red Flag Act 1865? For those who do not know, that was the wonderful piece of legislation that required a man—it had to be a man—to walk with a red flag in front of every non-horse-drawn vehicle. This was clearly a very bad piece of legislation, and it was repealed—it took 30 years, but it was repealed. So question number one is: is this piece of additional digital regulation a red flag Act that will prevent the benefit of the technology, or is it in danger of being a seat belt?

The seat belt was patented in 1885, but it became mandatory to wear one in the back seat of a car, where children tend to sit, only in 1991. So, during that century, was the world better off and was car development so much faster because there were no mandatory seat belts on the back seat, or was it just that more children died? We have to ask ourselves, with every piece of regulation in the digital world: are we in danger of creating a red flag that slows down the development of the technology, or are we in danger of believing that regulation will slow down economic growth while in fact being culpable of doing harm for decades or even centuries?

The problem with this Bill, and these amendments, is that many of us have debated the issue many times before. The age-appropriate design code came into being in the Data Protection Act 2018 and came into force in 2020. It expressly excludes technology in schools. I find it incomprehensible that, five years later, we are having to argue that it is wrong that children’s data in school is less protected than it is at home. The Minister has referenced previously that many of us have spoken on this topic before or have a track record in this area. The Government, when they were in opposition, very clearly supported expanding regulation into edtech. I hope that the Minister will hear the cross-party support for these amendments and work with us to put in statute the appropriate protection for the use of children’s data and technology when they are in education.