Baroness Kidron debates involving the Department for Education during the 2019 Parliament

Early Years Education

Baroness Kidron Excerpts
Thursday 30th November 2023


Lords Chamber
Baroness Kidron (CB)

My Lords, last week I introduced a debate on the impact of edtech on the learning, social development and privacy of children, and more than one noble Lord stood up and said that they were an expert not on tech but on education. So, this afternoon, I hope noble Lords will bear with me because I am standing up and saying that I am not an expert in education but rather in tech, and it is from that perspective that I will make my contribution.

I declare my interests as in the register, specifically as adviser to the AI Institute in Oxford and fellow of the computer department there, and chair of the Digital Futures for Children Centre at the LSE, and of the 5Rights Foundation.

Technology is neither the enemy nor the salvation of the education sector, whether for school age or early-years children. It has magical qualities of interactivity, transporting children to places and spaces they would otherwise not experience. It has the benefit of consistency and predictability, so a good programme or experience can be reproduced an infinite number of times. Technology is multifaceted: it contributes to complex management systems, delivery of services, learning products, devices, safety tech—and, of course, that includes technology that we consider part of the fabric of a child’s life, from TV to radio to talking toys. I want to make it clear at the outset that it is not technology itself but rather the gaps between how it is being sold, used and governed that I seek to highlight.

In June 2022, Nesta, as part of its mission to close the gaps in school readiness, undertook to see whether it could harness the trend of increasing screentime to narrow the gap in language, maths and literacy experienced by children from low-income households. The Nesta report is relentlessly optimistic in assuming a positive role for digital tech, yet the first three of its four key findings were that it is imperative to improve the quality of toddler tech, so it delivers greater benefit for children’s social, emotional and cognitive development; to help parents navigate a crowded market, so it is easier to identify apps worth their children’s time; and to make high-quality content freely available to low-income families. This third recommendation came on the back of the finding that 88% of toddler apps had in-app advertising, 70% had in-app purchases, meaning that you had to pay to progress in a puzzle or a task, and 58% of all of them had low-quality or no educational value at all. The fourth and final recommendation was a call for further research on how technology could boost children’s outcomes.

The Nesta report is worth a read because, even in this refreshingly pro-tech report, the lack of quality in learning and developmental outcomes for children was stark, as was the shameless creation of an advertising market targeted at the under-fives.

I was disturbed to discover that several colleagues recently suggested that there was an outpouring of research showing that early-years development was increasingly inhibited or stalled because of screen use. I asked Children and Screens, at the Institute of Digital Media and Child Development in the United States, to share its evidence of tech impact on early years. With something of a spoiler alert, I shall read the conclusion:

“High-quality, age-appropriate educational content can have positive impacts on learning and socioemotional development—but not over and above the effects of traditional learning or interpersonal interactions. There is little research that technology is particularly beneficial for educational outcomes, and screen time (particularly TV and video games) relates to poorer academic performance. Technology can increase access to education (eg remote learning), but rarely if ever improves upon traditional learning in its current uses. In the meantime, excessive screen time and online interactions, without proper safety precautions and literacy, can expose children to harm”.


In the detail of its findings, the institute provided research from around the world showing that more screen use is related to lower scores on language and literacy development; that higher passive screen time—“passive” is the key word in that sentence—relates to worse working memory; and that passive screen time in the first five years of life correlates with problems with attention and concentration, learning rules, cognitive flexibility and hyperactivity. There is also a whole set of other problems if what children are watching is age-inappropriate.

There is a worry that screen time is currently, and increasingly, displacing peer play in one to three year-olds, resulting in poorer social development. There is a problem with the quality of what children are seeing or doing, whether they are doing it alone or with a carer, and the opportunity cost—that is, what they are not doing while they are looking at the screen. There is also a problem of widespread privacy and safety concerns in an entirely unregulated market. Then there is the problem of the Government’s response, because while the Government have taken a robust view of the need to regulate tech, particularly in relation to children, they have consistently exempted educational settings, creating a bizarre situation where a child’s privacy and safety protections are worse in education and care settings than they are outside. Leaving tech outside any formal oversight has resulted in the free flow of products and services that claim to be educational but have no right or reason to be considered educational and are gathering children’s personal data at an alarming rate.

While the age-appropriate design code, brought forward in the Data Protection Act 2018 and started in this very Chamber, brought in wide-ranging design changes to tech platforms to protect children’s privacy, an exemption is made for schools and education settings. In many cases, edtech providers do not have to meet the high bar of privacy by design afforded by the AADC, the impact of which I set out in last week’s debate and which can be found in Hansard. In short, there is an eye-watering flood of children’s personal and intimate data straight into the commercial sector.

Similarly, however much I welcome the Online Safety Act, it states:

“A user-to-user service or a search service is exempt if … the provider of the service is … the person with legal responsibility for education or childcare … a person who is employed or engaged to provide education or childcare”


or if

“the service is provided for the purposes of that education or childcare”.

The true impact of this exemption will not be fully understood until we see the detail of Ofcom’s children’s code, but I believe that it will result in some rather contradictory outcomes in which tech providers have fewer duties to children in education and childcare settings than when they access the same or similar services from a bus.

Turning to the need for standards and certification of the tech itself, I want to briefly mention, as I did last week, the work of Dr Laura Outhwaite, a researcher from UCL, who, while looking at maths apps for under-fives, found that, of the top 25, only one had been peer reviewed, half did not meet good practice of learning support and six had no maths content at all.

There is a consensus across many studies and academics that edtech that is worth a child’s time needs four things. It needs to promote active learning, which means prompting mental activity on the child’s part and not just clicking or swiping. It needs to consist of learning material, which means engaging with, rather than distracting from, the learning goal—that is, it needs not to include advertising, mini-games or other things that distract and collect data. It needs to be meaningful and relatable, which means providing scaffolding from what children already know or can relate to, to support new learning. It needs to include social interaction. That is key, since passive watching has significantly poorer outcomes, so it should encourage interpersonal interaction and use parasocial relationships, rather than encouraging exclusively solo play. In addition to those four requirements, children at school and in early-years provision should be afforded privacy and safety equal to or greater than that afforded in other settings.

On the idea that the provision of, and compliance with, safeguarding standards routinely delivered by schools and carers is equal to oversight by the ICO, Ofcom or an edtech standards and certification body—which does not currently exist but is sorely needed—I ask noble Lords to consider why we expect a nursery teacher to check the privacy or security of an app. Why is it okay for a company to provide a sales pitch to a teacher or a school leader that fails to mention that there is no, or poor, educational benefit, as is found in 58% of edtech?

That rather disheartening list should be seen in the context that high-quality digital media that encourages engagement and conversation can inspire and educate even the youngest child. This means that the quality and format of what children are given matter, in many cases just as much as the amount of time they spend on it and whether they are using tech in a shared context with truly interested and focused adult engagement. Although it remains the view of paediatric associations both here and in the US that under-twos should have no screen time other than for video calling, there are real opportunities that we are missing because of poor oversight and the wrongheaded view that school and early-years safeguarding adequately covers tech from a regulatory point of view. At the same time, we provide no standards for the tech itself.

I want to associate myself with the comments of the noble Baroness, Lady Andrews, and add to her concerns that where money is an issue, tech is often considered to be the answer. I urge all noble Lords to take seriously the role of tech in this situation.

I have been raising these issues for some time with Ministers, regulators and in debates about technology, and I hope that in joining with those of you who are experts in education, we can focus on something which I believe to be, at best, a terrible oversight and, at worst, a failure to respond to a known harm, or a series of known harms, and that together we can address these issues across disciplines.

Before I sit down, I have a couple of questions for the Minister, some of which will be familiar to her. Does she agree that it would benefit children, parents, teachers and carers if there were a system of certification and quality control across the edtech sector, and that the privacy of children in school, where the data shared is both sensitive and compulsory to provide, is an urgent matter that should be covered by the upcoming data protection Bill? That would be a useful conversation between the education department and DSIT. Will the Minister agree to ask officials to consider formally how the decision to exempt schools and early-years settings from the Online Safety Act might impact children in education and childcare settings? I look forward to her response.

Educational Technology

Thursday 23rd November 2023


Grand Committee
Asked by Baroness Kidron

To ask His Majesty’s Government what assessment they have made of the role of educational technology (ed tech) being used in schools in relation to (1) the educational outcomes, (2) the social development, and (3) the privacy of schoolchildren.

Baroness Kidron (CB)

My Lords, I declare my interests, particularly that of chair of the Digital Futures Commission, which published the Blueprint for Educational Data in 2022, as chair of 5Rights Foundation and adviser to the Institute for Ethics in AI in Oxford.

School is a place of learning and an environment where children build relationships, life choices are made and futures initiated. For most children, school is compulsory, so while they are there, the school is in loco parentis. I welcome the use of technology, whether for learning or management, but it is uniquely important that it meets the school’s multiple responsibilities for the children in its care.

The debate this afternoon asks us to consider the impact of edtech on learning, privacy and the social development of children. Each could fill a debate on its own, but in touching on all three, I wish to make the point that we need standards and oversight of all.

For more than a decade, Silicon Valley, with its ecosystem of industry-financed NGOs, academics and think tanks, has promised that edtech would transform education, claiming that personalised learning would supercharge children’s achievements and learning data would empower teachers, and even that tech might in some places replace teachers or reach students who might otherwise not be taught.

Meanwhile, many teachers and academics worry that the sector has provided little evidence for these claims. A recent review by UCL’s Centre for Education Policy found that, of 25 of the most popular maths apps for children aged five, only one had been empirically evaluated for positive impacts on maths outcomes. Half of them did not include features known to support learning, such as feedback loops, and six of the 25 contained no mathematical content at all. If the UCL finding were extrapolated across the half a million apps labelled “education apps” in the app store, 480,000 would not be evaluated, a quarter of a million would provide no learning support and 120,000 would have no educational content at all. The lack of quality standards is not restricted to apps but is widespread across all forms of edtech. Of course we should have tech in school, but it must be educationally sound.

Covid supercharged the adoption of edtech and, while we must not conflate remote learning with edtech in the classroom, the Covid moment offers two important insights. First, as forensically set out in the UNESCO publication An Ed-Tech Tragedy, the “unprecedented” dependence on technology worsened disparities and learning loss across the world—including in Kenya, Brazil, the United States and Britain. Unsurprisingly, in each country the privileged children with space, connectivity, their own device and an engaged adult had better outcomes than their peers. A more surprising finding was that, where there was no remote learning at all but children were supplied with printouts or teaching via TV or radio, the majority of students did better. The exact reasons are complex but, in short, teaching prepared by teachers for students whom they know, unmediated by the values and normative engineering practices of Silicon Valley, had better outcomes. UNESCO calls on us to ensure that the promises of edtech are supported by evidence.

Secondly, Covid embedded edtech in our schools. Sixty-four per cent of schools introduced, increased or upgraded their technology with no corresponding focus on pupil privacy. In 2021, LSE Professor Sonia Livingstone and barrister Louise Hooper for the Digital Futures Commission mapped the journey of pupil data on Google Classroom and Class Dojo. Their report showed children’s data leaking from school and homework assignments into the commercial world at eye-watering scale, readily available to advertisers and commercial players without children, parents or teachers even knowing.

It is worth noting that, in 2021, the Netherlands negotiated a contract that restricted the data that Google’s education products could share. In 2022, Helsingør in Denmark banned Google Workspace and Chromebooks altogether—the same year the French Ministry of Education urged schools to stop using free versions of both Google and Microsoft.

Children’s privacy is non-trivial. Data may include school attendance, visits to the nurse, immigration status, test results, disciplinary records, aptitude and personality tests, mental health records, biometric data, or the granular detail of how a child interacted with an educational product—whether they hesitated or misspelled. Between management platforms, multiple connected devices and programmes used for teaching, the data that can be collected on a child is almost infinite, and the data protection is breathtakingly poor. Pupil data has been made available to gambling firms and advertisers, and has even been used to track children’s use of mental health services.

I turn briefly to the impact on social development. Child development is a multifaceted affair, in which not only the tech itself but the opportunity cost—that is, what the child is not doing—is of equal import. I was in Manchester last week, where a programme to bring professional dancers to nursery schools is being developed because children were arriving unable to play, look each other in the eye or move confidently. Although schools are not to blame if children come in overstimulated and undersocialised, in part because of the sedentary screen time of the early years, it is absolutely crucial that school remains a place of movement, singing, playing, drawing, reading and class teaching, supported by tech but not replaced by it, not only in a handful of Manchester nurseries but throughout the school system and, very importantly, during the teenage years. Decisions about edtech should be made in the light of, and in response to, not simply learning but the whole child and their developmental needs.

In my final minutes, I will speak briefly about safety tech. Here, I record my gratitude to Ministers and officials in the Department for Education, past and present, who have made very significant progress on this issue this year.

Frankie Thomas was 15 when she accessed a story that promoted suicide on a school iPad that had not been connected to the school filtering system. Subsequently, she took her own life exactly as she had seen online. Since that time, her parents, Judy and Andy, have campaigned tirelessly to bring the governance of safety tech to our notice. They deserve much credit for the advances that have been made. However, we still do not have standards for safety tech in schools. Schools can buy, and are buying, in good faith, systems that fail to search for self-harm content, or that have illegal-content filters switched off, and so on. Secondly, while we have excellent new guidance, Ofsted inspections do not explicitly ask whether schools are reviewing and checking that their online safety systems are working, meaning that thousands of schools have not properly engaged with that guidance.

I gave the Minister notice of my questions and very much look forward to her response. Will the department introduce quality control for edtech, including peer review and certification that evidences that it is suitable to meet children’s educational and development needs? Will the department use the upcoming Data Protection and Digital Information Bill to introduce a data protection regime for schools, which is so urgently needed? Will the department introduce standard procurement contracts, such as the Netherlands has, recognising that a single school cannot negotiate performance and privacy standards with global companies? Will the department bring forward a requirement for minimum standards of filtering and monitoring so that safety systems are fit for purpose, and simultaneously ensure that Ofsted’s inspecting schools handbook explicitly requires an inspector to ask whether a school is regularly checking its safety tech?

I am deeply grateful to all noble Lords who have chosen to speak and look forward to their contributions. Education is an extremely precious contribution to child development and widely regarded as a public good. It must not be undermined by allowing an unregulated market to develop without regard for the learning, privacy and safety of children.