Baroness Kidron (CB)

My Lords, I will speak to Amendments 142, 143 and 150 in my name, and I thank other noble Lords for their support.

We have spent considerable time across the digital Bills—the online safety, digital markets and data Bills—talking about the speed at which industry moves and the corresponding need for a more agile regulatory system. Sadly, we have not really got to the root of what that might look like. In the meantime, we have to make sure that regulators and Governments are asked to fulfil their duties in a timely manner.

Amendment 142 puts an 18-month timeframe on the creation of codes under the Act. Data protection is a mature area of regulatory oversight, and 18 months is a long time for people to wait for the benefits that accrue to them under legislation. Similarly, Amendment 143 ensures that the transition period from the code being set to it being implemented is no more than 12 months. Together, that creates a window of up to two and a half years. In future legislation on digital matters, I would like to see a very different approach: one that starts with the outcome and gives companies 12 months to comply, in any way they like, to ensure that outcome. But while we remain in the world of statutory code creation, it must be bound by a timeframe.

I have seen time and again that, after the passage of a Bill, Parliament and civil society move on—as do Ministers, key officials and those who work at the regulator—and codes lose their champions. It would be wonderful to imagine that matters progress as intended, but they do not. In the absence of champions, and without ongoing parliamentary scrutiny, codes can languish in the inboxes of people who have many calls on their time. Amendments 142 and 143 simply mirror what the Government agreed to in the OSA—it is a piece of good housekeeping to ensure continuity of attention.

I am conscious that I have spent most of my time highlighting areas where the Bill falls short, so I will take a moment to welcome the reporting provisions that the Government have put forward. Transparency is a critical aspect of effective oversight, and the introduction of an annual report on regulatory action would be a valuable source of information for all stakeholders with an interest in understanding the work of the ICO and its impact.

Amendment 150 proposes that those reporting obligations also include a requirement to provide details of all activities carried out by the Information Commissioner to support, strengthen and uphold the age-appropriate design code. It also proposes that, when meeting its general reporting obligations, the ICO should provide the information separately for children. The ICO published an evaluation of the AADC as a one-off in March 2023 and its code strategy on 3 April this year. I recognise the effort that the commissioner has made towards transparency, and the timing of his report indicates that reporting on children specifically is something that the ICO sees as relevant and useful. However, neither of those is sufficient in terms of the level of detail provided, the reporting cadence or the focus on impact rather than on the efforts that the ICO has made.

There are many frustrations for those of us who spend our time advocating for children’s privacy and safety. Among them is having to try to extrapolate child-specific data from generalised reporting. When it is not reported separately, it is usually to hide inadequacies in the level of protection afforded to children. For example, none of the community guidelines enforcement reports published for Instagram, YouTube, TikTok or Snap provides a breakdown of the violation rate data by age group, even though this would provide valuable information for academics, Governments, legislators and NGOs. Amendment 150 would go some way to addressing this gap by ensuring that the ICO is required to break down its reporting for children.

Having been momentarily positive, I would like to put on the record my concerns about the following extract from the email that accompanied the ICO’s children’s code strategy of 3 April. Having set out the major changes for companies that the code has ushered in, and having explained how the Information Commissioner would spend the next few months looking at default settings, geolocation, profiling, targeting of children and protecting under-13s, the email goes on to say:

“With the ongoing passage of the bill, our strategy deliberately focusses in the near term on compliance with the current code. However, once we have more clarity on the final version of the bill we will of course look to publicly signal intentions about our work on implementation and children’s privacy into the rest of the year and beyond”.

The use of the phrase “current code”, and the fact that the ICO has decided it is necessary to put its long-term enforcement strategy on hold, contradict government assurances that standards will remain the same.

The email from the ICO arrived in my inbox on the same day as a report from the US Institute of Digital Media and Child Development, which was accompanied by an impact assessment on the UK’s age-appropriate design code. It stated:

“The Institute’s review identifies an unprecedented wave of … changes made across leading social media and digital platforms, including YouTube, TikTok, Snapchat, Instagram, Amazon Marketplace, and Google Search. The changes, aimed at fostering a safer, more secure, and age-appropriate online environment, underscore the crucial role of regulation in improving the digital landscape for children and teens”.

In June, the Digital Futures Commission will publish a similar report, written by the former Deputy Information Commissioner Steve Wood, with similarly positive but much more detailed findings. Meanwhile, we hear the steady drumbeat of adoption of the code in South America, Australia and Asia, and in additional US states following California’s lead. Experts in both the US and the UK attest that this is a regulation that works to make digital services safer and better for children.

I therefore have to ask the Minister once again why the Government are downgrading child protection. If he, or those in the Box advising him, are even slightly tempted to say that they are not, I ask that they reread the debates from the last two days in Committee, in which the Government removed the balancing test for automated decision-making and changed the Secretary of State’s powers so that he need only have regard to children rather than mandate child protections. The data impact assessment provisions have also been downgraded, among the other sleights of hand that diminish the AADC.

The ICO has gone on record to say that it has put its medium to long-term enforcement strategy on hold, and the Minister’s letter sent on the last day before recess says that the AADC will be updated to reflect the Bill. I would like nothing more than a proposal from the Government to put the AADC back on a firm footing. I echo the words said earlier by the noble Baroness, Lady Jones, that it is time to start talking and stop writing. I am afraid that, otherwise, I will be tabling amendments on Report that will test the appetite of the House for protecting children online. In the meantime, I hope the Minister will welcome and accept the very modest proposals in this group.

Baroness Harding of Winscombe (Con)

My Lords, as is so often the case on this subject, I support the noble Baroness, Lady Kidron, and the three amendments that I have added my name to: Amendments 142, 143 and 150. I will speak first to Amendments 142 and 143, and highlight a couple of issues that the noble Baroness, Lady Kidron, has already covered.

--- Later in debate ---
There is more than one way to protect children at school, and the amendment in front of us is the simplest and most comprehensive. But whatever the route, the outcome must be that we act. We are seeing school districts in the US take tech providers to court, and the UK can do better than that. UK tax-funded schools and teachers must not be left with the responsibility for the regulatory compliance of multibillion-dollar private companies. Not only is it patently unfair, but it illustrates and exacerbates a power asymmetry for which children at school pay the price. To unlock edtech’s full potential, regulatory action and government leadership are not just beneficial but essential. I beg to move.
Baroness Harding of Winscombe (Con)

My Lords, I rise once again in my Robin role to support the noble Baroness, Lady Kidron, on this amendment. On 23 November last year, the noble Baroness brought a debate on this very issue of edtech. Rather than repeat all the points made in that very useful debate, I point my noble friend the Minister to it.

I would just like to highlight a couple of quick points. First, in supporting this amendment, I am not anti-edtech in any way, shape or form. It is absolutely clear that technology can bring huge benefits to students of all ages, but it is also clear that education is not unique. It is exactly like every other part of society: where technology brings benefit, it also brings substantial risk. We are learning the hard way that it is a mistake to think that any element of society can mitigate the risks of technology without legal guardrails.

We have seen really clearly that commercial organisations operating under the purview of the age-appropriate design code changed the way they protect children’s data as a result of that code. The absence of an equivalent code for the edtech sector should tell us clearly that the sector has not had those same benefits. If we bring edtech into scope, either through this amendment or simply by extending the age-appropriate design code, I would hazard a strong guess that we would start to see very real improvements in the protection of children’s data.

In the debate on 23 November, I asked my noble friend the Minister, the noble Baroness, Lady Barran, why the age-appropriate design code did not include education. I am not an expert in education, by any stretch of the imagination. The answer I received was that it was okay because the Keeping Children Safe in Education framework covered edtech. Since that debate, I have had a chance to read that framework, and I cannot find a section in it that specifically addresses children’s data. There is a lot of really important material in it, but there is no clearly signposted section in that regard. So even if all the work fell on schools, that framework on its own, as published on GOV.UK, does not seem to meet the standard of a framework for data protection for children in education. However, as the noble Baroness, Lady Kidron, said, this is not just about schools’ responsibility but about the edtech companies’ responsibility, and it is clear that there is no section on that in the Keeping Children Safe in Education framework either.

The answer that we received last year in this House does not do justice to the real question: in the absence of a specific code—whether the age-appropriate design code or a dedicated edtech code—how can we be confident that the guardrails we know we need in every sector are really in place in this most precious and important sector, where we teach our children?

Lord Clement-Jones (LD)

My Lords, I am absolutely delighted to be able to support this amendment. Like the noble Baroness, Lady Harding, I am not anti-edtech at all. I did not take part in the debate last year, but listening to the noble Baroness, Lady Kidron, and having read the excellent A Blueprint for Education Data from the 5Rights Foundation and the Digital Futures for Children brief in support of a code of practice for education technology, I submit that it is chilling to hear what is happening with edtech as we speak: data being extracted, and data protection not being properly complied with.

I got involved some years ago with the advisory board of the Institute for Ethical AI in Education, which Sir Anthony Seldon set up with Professor Rose Luckin and Priya Lakhani. Our intention was slightly broader—it was designed to create a framework for the use of AI specifically in education. Of course, one very important element was the use of data, and the safe use of data, both by those procuring AI systems and by those developing them and selling them into schools. That was in 2020 and 2021, and we have not moved nearly far enough since that time. Obviously, what is proposed here is data-specific, because we are talking about the data protection Bill, but it would cure some of the issues that are staring us in the face.

As we have been briefed by Digital Futures for Children, and as the noble Baroness, Lady Kidron, emphasised, there is widespread invasion of children’s privacy in data collection. Sometimes there is little evidence to support the claimed learning benefits, while schools and parents lack the technical and legal expertise to understand what data is collected. As has been emphasised throughout the passage of this Bill, children deserve the highest standards of privacy and data protection—especially in education, of course.

Coming at it from that direction, I wholly support what the noble Baroness, Lady Kidron, is proposing, so well supported by the noble Baroness, Lady Harding. Given that it again appears that the Government gave an undertaking to bring forward a suitable code of practice but have not done so, there is double reason to move forward on this during the passage of the Bill. We very much support Amendment 146 on that basis.