My Lords, unusually, I rise to move an amendment, Amendment 138. For the second time in Committee, I find myself heading a group when I know that the noble Baroness, Lady Kidron, will be much better qualified to introduce the subject. Indeed, she has an amendment, Amendment 141, which is in many ways far preferable to mine.
Amendment 138 is designed to ensure that the Information Commissioner produces a code of practice specific to children, up to the age of 18 for the purposes of UK law and Convention 108, and to pupils as defined by the Education Act 1996, who in the education sector may be up to the age of 19 or, with special educational needs, up to 25. The charity Data, Tech & Black Communities put it this way in a recent letter to the noble Baroness, Lady Jones:
“We recently completed a community research project examining the use of EdTech in Birmingham schools. This project brought us into contact with over 100 people … including parents, school staff and community members. A key finding was the need to make it easier for those with stewardship responsibility for children’s data, to fulfil this duty. Even with current data protection rights, parents and guardians struggle to make inquiries (of schools, EdTech companies and even DfE) about the purpose behind the collection of some of their children’s data, clarity about how it is used (or re-used) or how long data will be retained for. ‘Opting out’ on behalf of their children can be just as challenging. All of which militates against nuanced decision-making about how best to protect children’s short and long-term interests … This is why we are in support of an ICO Code of Practice for Educational Settings that would enable school staff, parents and learners, the EdTech industry and researchers to responsibly collect, share and make use of children’s data in ways that support the latter’s agency over their ‘digital selves’ and more importantly, will support their flourishing”.
The duties of settings and data processors, and the rights appropriate to the stage of education and to children’s capacity, need clarity and consistency. Staff need confidence to access and use data appropriately within the law. As the UNCRC’s General Comment No. 16 (2013) on State Obligations Regarding the Impact of the Business Sector on Children’s Rights set out over a decade ago,
“the realization of children’s rights is not an automatic consequence of economic growth and business enterprises can also negatively impact children’s rights”.
The educational setting is different from purely commercial interactions, and not only because the data subjects are children. It is more complex because of the disempowered environment and the imbalance of power between the authority, the parents and the child. A further factor is that parents’ and children’s rights are interlinked, as exemplified in the right to education described in UDHR Article 26(3), which states:
“Parents have a prior right to choose the kind of education that shall be given to their children.”
A code is needed because explicit safeguards that the GDPR requires in several places are missing, having been left out of the drafting of the UK Data Protection Act 2018. Clause 80 of the Bill—“Automated decision-making”—does not address the safeguards for children that GDPR Article 23(1) requires. Furthermore, removing the protections of the balancing test under the recognised legitimate interest condition will create new risks. Clauses on additional further processing or changes to purpose limitation are inappropriately wide without child-specific safeguards. The volume, sensitivity and intrusiveness of identifying personal data collection in educational settings only ever increase, while the protections are only ever reduced.
Obligations specific to children’s data, especially
“solely automated decision-making and profiling”
and exceptions, need to be consistent with clear safeguards by design where they restrict fundamental freedoms. What does that mean for children in practice, where teachers are assumed to be the rights bearers in loco parentis? The need for compliance with human rights, security, and health and safety, among other standards proportionate to the risks of data processing, and respect for the UK Government’s accessibility requirements, should be self-evident and adopted in a code of practice, as recommended in the five rights in the Digital Futures Commission’s blueprint for educational data governance.
The Council of Europe Strategy for the Rights of the Child (2022-2027) and the UNCRC General Comment No. 25 on Children’s Rights and the Digital Environment make it clear that
“children have the right to be heard and participate in decisions affecting them”.
They recognise that
“capacity matters, in accordance with their age and maturity. In particular attention should be paid to empowering children in vulnerable situations, such as children with disabilities.”
Paragraph 75 recognises that surveillance in educational settings should not take place without the right to object and that teachers need training to keep up with technological developments.
The participation of young people themselves has not been invited in the development of this Bill, and the views of young people have not been considered. However, a small sample of parent and pupil voices was captured in the Responsible Technology Adoption Unit’s public engagement work with the DfE in 2024. The findings back those of Defend Digital Me’s Survation poll in 2018 and show that parents do not know that the DfE already holds named pupil records without their knowledge or permission, and that the data is given away to be reused by hundreds of commercial companies, the DWP, the Home Office and the police. It stated:
“There was widespread consensus that work and data should not be used without parents’ and/or pupils’ explicit agreement. Parents, in particular, stressed the need for clear and comprehensive information about pupil work and data use and any potential risks relating to data security and privacy breaches.”
A code of practice is needed to explain the law and make it work as intended for everyone. The aim of a code of practice for educational settings would be that adherence to it creates a mechanism for controllers and processors to demonstrate compliance with the legislation or to approve certification methods. It would give providers confidence in consistent and clear standards and would be good for the edtech sector. It would allow children, parents, school staff and systems administrators to build trust in safe, fair and transparent practice, so that their rights are freely met by design and default.
Further, schools give children’s personal data to many commercial companies during a child’s education—not based on consent but assumed for the performance of a task carried out in the public interest. A code should clarify the boundaries of this lawful basis for commercial purposes, where there is an obligation on parents to provide the data, and what this means for the child on reaching maturity or after leaving the educational setting.
Again, a code should help companies understand “data protection by design and default” in practice, what amounts to a “significant legal effect”, the edges of “public interest” in data transfers to a third country, and how special categories of data affect children in schools. A code should also support children and families in understanding the effect of the responsibilities of controllers and processors on the execution or limitation of their own rights. It would set out the responsibilities of software platforms that profile users’ metadata to share with third parties, or of commercial apps signed up for in schools that show adverts during use.
I hope that I have explained exactly why we believe that a code of practice is required in educational settings. I beg to move.
My Lords, I support and have added my name to Amendment 138 in the name of the noble Lord, Lord Clement-Jones. I will also speak to Amendment 141 in my name and those of the noble Lords, Lord Knight and Lord Russell, and the noble Baroness, Lady Harding.
Both these amendments propose a code of practice to address the use of children’s data in the context of education. Indeed, they have much in common. Having heard the noble Lord, Lord Clement-Jones, I find that I have much in common with what he said. I associate myself entirely with his remarks and hope that mine will build on them. Both amendments point to the same problem: children’s data is scandalously treated in our schools and educators need support. This is a persistent and known failure that both the DfE and the ICO have failed to confront over a period of some years.
Amendment 141 seeks to give a sense of exactly what an education code should cover. In doing so, it builds on the work of the aforementioned Digital Futures for Children centre at the LSE, which I chair, the work of Defend Digital Me, the excellent work of academics at UCL, and much of the work relating to education presented to the UN tech envoy in the course of drafting the UN global digital compact.
Subsection (1) of the proposed new clause would require the ICO to prepare a code of practice in connection with the provision of education. Subsection (2) sets out what the ICO would have to take into account, such as that education provision includes school management and safeguarding as well as learning; the different settings in which it takes place; the need for transparency and evidence of efficacy; and all the issues already mentioned, including profiling, transparency, safety, security, parental involvement and the provision of counselling services.
Subsection (3) would require the ICO to have regard to children’s entitlement to a higher standard of protection—which we are working so hard in Committee to protect—their rights under the UNCRC and their different ages and stages of development. Importantly, it also refers to the need and desire to support innovation in education and the need to ensure that the benefits derived from the use of UK children’s data accrue to the UK.
Subsection (4) lists those whom the commissioner would have to consult, and subsection (5) sets out when data processors and controllers would be subject to the code. Subsection (6) proposes a certification scheme for edtech services to demonstrate compliance with UK GDPR and the code. Subsection (7) would require edtech service and product providers to evidence compliance—importantly, transferring that responsibility from schools to providers. Subsection (8) simply defines the terms.
A code of practice is an enabler. It levels the playing field, sets terms for innovators, creates sandbox or research environments, protects children and supports schools. It offers a particularly attractive environment for developing the better digital world that we would all like to see, since schools are identifiable communities in which changes and outcomes could be measured.
My Lords, I rise briefly to support the amendments in the name of the noble Lord, Lord Stevenson of Balmacara. I must say that the noble Lord, Lord Clement-Jones, made a very persuasive speech; I shall be rereading it and thinking about it more carefully.
In many ways, purpose limitation is the jewel in the crown of GDPR. It does what it says on the tin: data should be used for the original purpose, and if the purpose is then extended, we should go back to the person and ask whether it can be used again. While I agree with and associate myself with the technical arguments made by the noble Lord, Lord Stevenson, that is the fundamental point.
The issue here is, what are the Government trying to do? What are we clearing a pathway for? In a later group, we will speak to a proposal to create a UK data sovereign fund to make sure that the value of UK publicly held data is realised. The value is not simply economic or financial, but societal. There are ways of arranging all this that would satisfy everyone.
I have been sitting here wondering whether to say it, but here I go: I am one of the 3.3 million.
So is the noble Lord, Lord Clement-Jones. I withdrew my consent because I did not trust the system. I think that what both noble Lords have said about trust could be spread across the Bill as a whole.
We want to use our data well. We want it to benefit our public services. We want it to benefit UK plc and we want to make the world a better place, but not at the cost of individual data subjects and not at too great a cost. I add my voice to that. On the whole, I prefer systems that offer protections by design and default, as consent is a somewhat difficult concept. But, in as much as consent is a fundamental part of the current regulatory system and nothing in the Bill gets rid of it wholesale for some better system, it must be applied meaningfully. Amendments 79, 81 and 131 make clear what we mean by the term, ensure that the definition is consistent and clarify that it is not the intention of the Government to lessen the opportunity for meaningful consent. I, too, ask the Minister to confirm that it is not the Government’s intention to downgrade the concept of meaningful consent in the way that the noble Lord, Lord Stevenson, has set out.
My Lords, I thought I had no speech; that would have been terrible. In moving my amendment, I thank the noble Baronesses, Lady Kidron and Lady Harding of Winscombe, and the noble Lord, Lord Russell of Liverpool, for their support. I shall speak also to Amendments 94, 135 and 196.
Additional safeguards are required for the protection of children’s data. This amendment
“seeks to exclude children from the new provisions on purpose limitation for further processing under Article 8A”.
The change to the purpose limitation in Clause 71 raises questions about the lifelong implications of the proposed change for children, given the expectation that they are less aware of the risks of data processing and may not have made their own preferences or choices known at the time of data collection.
For most children’s data processing, adults give permission on their behalf. The extension of this permission to additional purposes may be incompatible with what the data subject later wishes as an adult. The only protection they may have is purpose limitation, which ensures that they are asked to consent again or are informed of changes to processing. Data reuse and access must not mean abandoning the first principles of data protection. Purpose limitation rests on the essential principles of “specified” and “explicit” at the time of collection, which this change does away with.
There are some questions that I would like to put to the Minister. If further reuses, such as more research, are compatible, they are already permitted under current law. If further reuses are not permitted under current law, why should data subjects’ current rights be undermined while they are children and, through this change, never be able to be reclaimed at any time in the future? How does the new provision align with the principle of acting in the best interests of the child, as outlined in the UK GDPR, the UNCRC in Scotland and the Rights of Children and Young Persons (Wales) Measure 2011? What are the specific risks to children’s data privacy and security under the revised rules for purpose limitation, which may have an unforeseeable lifelong effect? In summary, a blanket exclusion for children’s data processing conforms more closely to the status quo of data protection principles. Children should be asked again about data processing once they reach maturity and should not find that their data rights have been given away by their parents on their behalf.
Amendment 196 is more of a probing amendment. Ofcom has set out its approach to the categorisation of category 1 services under the Online Safety Act. Ofcom’s advice and research, submitted to the Secretary of State, outlines the criteria for determining whether a service falls into category 1. These services are characterised by having the highest reach and risk functionalities among user-to-user services. The categorisation is based on certain threshold conditions, which include user numbers and functionalities such as content recommender systems and the ability for users to forward or reshare content. Ofcom has recommended that category 1 services should meet either of two sets of conditions: having more than 34 million UK users with a content recommender system or having more than 7 million UK users with a content recommender system and the ability for users to forward or reshare user-generated content. The categorisation process is part of Ofcom’s phased approach to implementing codes and guidance for online safety, with additional obligations for category 1 services due to their potential as sources of harm.
The Secretary of State recently issued the Draft Statement of Strategic Priorities for Online Safety, under Section 172 of the Online Safety Act. It says:
“Large technology companies have a key role in helping the UK to achieve this potential, but any company afforded the privilege of access to the UK’s vibrant technology and skills ecosystem must also accept their responsibility to keep people safe on their platforms and foster a safer online world … The government appreciates that Ofcom has set out to government its approach to tackling small but risky services. The government would like to see Ofcom keep this approach under continual review and to keep abreast of new and emerging small but risky services, which are posing harm to users online.
As the online safety regulator, we expect Ofcom to continue focusing its efforts on safety improvements among services that pose the highest risk of harm to users, including small but risky services. All search services in scope of the Act have duties to minimise the presentation of search results which include or lead directly to illegal content or content that is harmful to children. This should lead to a significant reduction in these services being accessible via search results”.
During the parliamentary debates on the Online Safety Bill and in Joint Committee, there was significant concern about the categorisation of services, particularly about the emphasis on size over risk. Initially, the categorisation was based largely on user numbers and functionalities, which led to concerns that smaller platforms with high-risk content might not be adequately addressed. In the Commons, Labour’s Alex Davies-Jones MP, now a Minister in the Ministry of Justice, argued that focusing on size rather than risk could fail to address extreme harms present on smaller sites.
The debates also revealed a push for a more risk-based approach to categorisation. The then Government eventually accepted an amendment allowing the Secretary of State discretion in setting thresholds based on user numbers, functionalities or both. This change aimed to provide flexibility in addressing high-risk smaller platforms. However, concerns remain, despite the strategy statement and the amendment to the original Online Safety Bill, that smaller platforms with significant potential for harm might not be sufficiently covered under the category 1 designation. Overall, while the final approach allows some flexibility, there is considerable debate about whether Ofcom will place enough emphasis in its categorisation on the risks posed by smaller players. My colleagues on these Benches and in the Commons have emphasised to me that we should be rigorously addressing these issues. I beg to move.
My Lords, I shall speak to all the amendments in this group, and I thank noble Lords who have added their names to Amendments 88 and 135 in my name.
Amendment 88 creates a duty for data controllers and processors to consider children’s needs and rights. Proposed new subsection (1) simply sets out children’s existing rights and acknowledges that children of different ages have different capacities and therefore may require different responses. Proposed new subsection (2) addresses the concern, expressed during the passage of the Bill and its predecessor, that children should be shielded from the reduction in privacy protections that adults will experience under the proposals. Proposed new subsection (3) simply confirms that a child is anyone under the age of 18.
This amendment leans on a bit of history. Section 123 of the Data Protection Act 2018 enshrined the age-appropriate design code into our data regime. The AADC’s journey from amendment to fully articulated code, since mirrored and copied around the world, has provided two useful lessons.
First, if the intent of Parliament is clear in the Bill, it is fixed. After Royal Assent to the Data Protection Act 2018, the tech lobby came calling to both the Government and the regulator, arguing that the proposed age of adulthood in the AADC should be reduced from 18 to 13, where it had been for more than two decades. Both the department and the regulator held up their hands and pointed at the text, which cites the UNCRC’s definition of a child as a person under 18. That age remains, not only in the UK but in all the other jurisdictions that have since copied the legislation.
In contrast, on several other issues, both in the AADC and, more recently, in the Online Safety Act, the intentions of Parliament were not spelled out and have been reinterpreted. Happily, the promised coroner provisions are now enshrined in this Bill, but promises from the Dispatch Box about their scope and form were initially diluted and had to be refought for a second time by bereaved parents. Other ministerial promises, such as a mixed economy, age-assurance requirements, and a focus on contact harm and on features and functionalities as well as content, reflected Parliament’s intention but do not form part of the final regulatory standards, in large part because they were not sufficiently spelled out in the Bill. What is in the Bill really matters.
Secondly, our legislation over the past decade is guilty of solving the problems of yesterday. There is departmental resistance to having outcomes rather than processes enshrined in legislation. Overarching principles, such as a duty of care, or rights, such as children’s right to privacy, are abandoned in favour of process measures, tools that even the tech companies admit are seldom used, and narrow definitions of what must and may not be taken down.
Tech is various, its contexts infinite and its rate of change giddy, while the skills of government and regulator are necessarily limited. At some point we are going to have to start saying what the outcome should be and what the principles are, not what the process is. My argument for this amendment is that we need to fix our intention in the Bill that children have an established set of needs according to their evolving capacity and, similarly, a right to a higher bar of privacy, so that both these principles become unavoidable.