Biometric Recognition Technologies in Schools Debate



Lord Clement-Jones Excerpts
Thursday 4th November 2021


Lords Chamber
Asked by
Lord Clement-Jones

To ask Her Majesty’s Government what assessment they have made of the use of facial and other biometric recognition technologies in schools.

Lord Clement-Jones (LD)

My Lords, I start by acknowledging the versatility of the noble Baroness, Lady Chisholm, in responding to this debate.

A little over two weeks ago, the news broke in the Financial Times that facial recognition software in cashless payment systems, piloted in a Gateshead school last summer, had been adopted in nine Ayrshire schools. Questions have already been asked in the Scottish Parliament by my colleague Willie Rennie MSP, but it is clear that this software is becoming widely adopted on both sides of the border, with 27 schools already using it in England and another 35 or so in the pipeline.

The Court of Appeal, in Bridges v the Chief Constable of South Wales Police, the case brought by Liberal Democrat councillor Ed Bridges in south Wales, noted that:

“Biometric data enables the unique identification of individuals with some accuracy. It is this which distinguishes it from many other forms of data.”


The supplier in question, CRB Cunninghams, attempted to offer reassurance on the basis that

“this is not a normal live facial recognition system”

and:

“It’s not recording all the time. And the operator at the till point has to physically touch the screen.”


According to North Ayrshire Council’s published data protection impact assessment, the source of the data for facial recognition is a faceprint template. The facial recognition software used mathematically maps an individual’s facial features, such as the length and width of the nose, the distance between the eyes and the shape of the cheekbones, and it stores this data as a faceprint template. That is the description of the technology. Its use has been temporarily paused by North Ayrshire Council, after objections from privacy campaigners and an intervention from the Information Commissioner’s Office. But it is extraordinary to use children’s biometric data for this purpose, when there are so many alternatives available for cashless payment.
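[Editorial illustration, not part of the spoken debate: the matching step described above — comparing a probe faceprint against stored templates of facial measurements — can be sketched in simplified form. This is not the supplier’s actual algorithm; the vectors, names and threshold below are invented for illustration only.]

```python
import math

def face_distance(template_a, template_b):
    """Euclidean distance between two faceprint templates
    (fixed-length vectors of facial measurements)."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(template_a, template_b)))

def match(probe, enrolled, threshold=0.5):
    """Return the enrolled identity whose stored template is closest to
    the probe faceprint, if within the match threshold; otherwise None."""
    best_id, best_dist = None, float("inf")
    for identity, template in enrolled.items():
        d = face_distance(probe, template)
        if d < best_dist:
            best_id, best_dist = identity, d
    return best_id if best_dist <= threshold else None
```

Even in this toy form, the privacy concern is visible: the enrolled templates are persistent biometric identifiers of named children, retained by the system operator.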

From the surveys and evidence given to the Ada Lovelace Institute, which is conducting the ongoing Ryder review of the governance of biometric data, it is clear that the public already have strong concerns about the use of this technology. Yet we seem to be conditioning society to accept biometric and surveillance technologies in areas that have nothing to do with national security or crime prevention and detection. In Scotland, there is a new biometrics commissioner, who will oversee a biometrics code of practice. In England, we have the Biometrics and Surveillance Camera Commissioner, who oversees the surveillance camera code, which is being revised, subject to consultation. However, neither code applies in schools.

It seems that the Department for Education issued guidance in 2018 on the provisions of the Protection of Freedoms Act, which include the

“Protection of biometric information of children in schools”


and the rights of parents and children as regards participation, but that the DfE has no data on the use of biometrics in schools. It seems that there are no compliance mechanisms to ensure that schools observe the Act or, indeed, the guidance that the department has put out.

There is also the broader question of whether, under GDPR and data protection law, biometrics can be used at all, given the age groups involved—because of what is called the “power imbalance”, which makes it hard to refuse, whether or not pupils’ or parents’ consent has been obtained. But how was their consent actually obtained? What information was given to them when obtaining it? What other functions might be applied in the school—attendance records, for instance? Pippa King, who made the original freedom of information request to North Ayrshire Council and publishes the “Biometrics in Schools” blog, understands that children as young as 14 may have been asked for their consent.

It is not enough for the schools in question to carry out a data protection impact assessment. The DPIA carried out by North Ayrshire Council was clearly inadequate. The Scottish First Minister, despite saying that

“Facial recognition technology in schools does not appear to be proportionate or necessary”,


went on to say that schools should

“carry out a privacy impact assessment … and consult pupils and parents.”

But this does not go far enough; we should firmly draw a line against it. It is totally disproportionate and unnecessary. Many of us think that this is the short cut to a widespread surveillance state. In some jurisdictions—New York, France and Sweden—its use in schools has already been banned or severely limited.

Of course, I acknowledge that other forms of AI have benefits for some educational purposes. I had the privilege to chair the advisory committee of the Institute for Ethical AI in Education, founded by Sir Anthony Seldon, Professor Rosemary Luckin and Priya Lakhani. In March this year, it produced the Ethical Framework for AI in Education, which has been signed up to by a number of edtech companies. It provides exactly the kind of framework needed to assess whether the AI applications procured and applied in education settings adhere to ethical principles.

However, this is a particularly worrying example of the way that public authorities are combining the use of biometric data with AI systems, without proper regard for ethical principles. Despite the Bridges case, the Home Office and the police have driven ahead with the adoption of live facial recognition technology, and the College of Policing has been commissioned to deliver guidance on its use in policing—but there is no legislation.

As the Ada Lovelace Institute and Big Brother Watch have urged, and as the Commons Science and Technology Committee recommended in 2019, there should be a voluntary pause on the sale and use of live facial recognition technology to allow public engagement and consultation to take place. I introduced a Private Member’s Bill last year along the same lines. In their massively late response this year to the Select Committee’s call, the Government insisted that the introduction of LFR would proceed. In follow-up correspondence, they claimed there is already a comprehensive legal framework, which they were taking measures to improve. When we are faced with this kind of biometric data capture from young people, and given the increasing danger of damage to public trust, will the Government rethink their very complacent response? As it is, in the proposed EU AI law, live facial recognition technology is regarded as high risk and subject to specific limitations. Will the Government’s expected White Paper on AI governance at least take the same approach?

I return to the use of live facial recognition in schools, which is a highly sensitive area. We should not be using children as guinea pigs. I understand that an ICO report is under way. I hope that it will be completed as a matter of urgency, but we must already conclude that we urgently need to tighten our data protection laws to ensure that children under the age of 18 are much more comprehensively protected from facial recognition technology. I look forward to the Minister’s response.