To ask Her Majesty’s Government what assessment they have made of the use of facial and other biometric recognition technologies in schools.
My Lords, I start by acknowledging the versatility of the noble Baroness, Lady Chisholm, in responding to this debate.
A little over two weeks ago, the news broke in the Financial Times that facial recognition software in cashless payment systems, piloted in a Gateshead school last summer, had been adopted in nine Ayrshire schools. Questions have already been asked in the Scottish Parliament by my colleague Willie Rennie MSP, but it is clear that this software is becoming widely adopted on both sides of the border, with 27 schools already using it in England and another 35 or so in the pipeline.
The Court of Appeal, in Bridges v the Chief Constable of South Wales Police, the case brought by Liberal Democrat councillor Ed Bridges in south Wales, noted that:
“Biometric data enables the unique identification of individuals with some accuracy. It is this which distinguishes it from many other forms of data.”
The supplier in question, CRB Cunninghams, attempted to reassure on the basis that
“this is not a normal live facial recognition system”
and:
“It’s not recording all the time. And the operator at the till point has to physically touch the screen.”
According to North Ayrshire Council’s published data protection impact assessment, the source of the data for facial recognition is a faceprint template. The facial recognition software used mathematically maps an individual’s facial features, such as the length and width of the nose, the distance between the eyes and the shape of the cheekbones, and it stores this data as a faceprint template. That is the description of the technology. Its use has been temporarily paused by North Ayrshire Council, after objections from privacy campaigners and an intervention from the Information Commissioner’s Office. But it is extraordinary to use children’s biometric data for this purpose, when there are so many alternatives available for cashless payment.
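To make that description concrete, here is a minimal sketch in Python of how such a system might store and match faceprint templates. The extract_embedding function below is a toy stand-in: real products derive the template from a proprietary learned model rather than raw pixels, but the principle of storing a numeric vector rather than a photograph, and matching by similarity against it, is the same.

```python
import numpy as np

def extract_embedding(face_image: np.ndarray) -> np.ndarray:
    """Toy stand-in for a face-embedding model: maps an image to a
    fixed-length unit vector (the 'faceprint template'). Real systems
    use a learned network, not raw pixels."""
    v = face_image.astype(float).ravel()
    return v / (np.linalg.norm(v) + 1e-12)

def enrol(database: dict, pupil_id: str, face_image: np.ndarray) -> None:
    # Only the numeric template is retained, not the photograph itself.
    database[pupil_id] = extract_embedding(face_image)

def match(database: dict, face_image: np.ndarray, threshold: float = 0.9):
    """Compare a live capture against every stored template by cosine
    similarity; declare a match only above a tuned threshold."""
    probe = extract_embedding(face_image)
    best_id, best_score = None, -1.0
    for pupil_id, template in database.items():
        score = float(probe @ template)  # both vectors are unit-normalised
        if score > best_score:
            best_id, best_score = pupil_id, score
    return best_id if best_score >= threshold else None
```

Even in this toy form, the sensitivity of the stored data is apparent: the template is a durable, unique identifier of the child, whatever photograph it came from.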
From the surveys and evidence given to the Ada Lovelace Institute, which has the ongoing Ryder review of the governance of biometric data, it is clear that the public already have strong concerns about the use of this technology. Yet we seem to be conditioning society to accept biometric and surveillance technologies in areas that have nothing to do with national security or crime prevention and detection. In Scotland, there is a new biometrics commissioner, who will oversee a biometrics code of practice. In England, we have the Biometrics and Surveillance Camera Commissioner, who oversees the surveillance camera code, which is being revised, subject to consultation. However, neither code applies in schools.
It seems that the Department for Education issued guidance in 2018 on the provisions of the Protection of Freedoms Act, which include the
“Protection of biometric information of children in schools”
and the rights of parents and children as regards participation, but that the DfE has no data on the use of biometrics in schools. There appear to be no compliance mechanisms to ensure that schools observe the Act or, indeed, the guidance that the department has put out.
There is also the broader question of whether, under the GDPR and data protection law, biometrics can be used at all, given the age groups involved—because of what is called the “power imbalance”, which makes it hard to refuse, whether or not pupils’ or parents’ consent has been obtained. But how was their consent actually obtained? What information was given to them when obtaining it? What other functions might be applied in the school—attendance records, for instance? Pippa King, who made the original freedom of information request to North Ayrshire Council and publishes the “Biometrics in Schools” blog, understands that children as young as 14 may have been asked for their consent.
It is not enough for the schools in question to carry out a data impact assessment. The DPIA carried out by North Ayrshire Council was clearly inadequate. The Scottish First Minister, despite saying that
“Facial recognition technology in schools does not appear to be proportionate or necessary”,
went on to say that schools should
“carry out a privacy impact assessment … and consult pupils and parents.”
But this does not go far enough; we should firmly draw a line against it. It is totally disproportionate and unnecessary. Many of us think that this is the short cut to a widespread surveillance state. In some jurisdictions—New York, France and Sweden—its use in schools has already been banned or severely limited.
Of course, I acknowledge that other forms of AI have benefits for some educational purposes. I had the privilege to chair the advisory committee of the Institute for Ethical AI in Education, founded by Sir Anthony Seldon, Professor Rosemary Luckin and Priya Lakhani. In March this year, it produced the Ethical Framework for AI in Education, which a number of edtech companies have signed up to. It provides exactly the kind of framework for assessing whether the AI applications procured and deployed in education settings adhere to ethical principles.
However, this is a particularly worrying example of the way that public authorities are combining the use of biometric data with AI systems, without proper regard for ethical principles. Despite the Bridges case, the Home Office and the police have driven ahead with the adoption of live facial recognition technology, and the College of Policing has been commissioned to deliver guidance on its use in policing—but there is no legislation.
As the Ada Lovelace Institute and Big Brother Watch have urged, and as the Commons Science and Technology Committee recommended in 2019, there should be a voluntary pause on the sale and use of live facial recognition technology to allow public engagement and consultation to take place. I introduced a Private Member’s Bill last year along the same lines. In their massively late response this year to the Select Committee’s call, the Government insisted that the introduction of LFR would proceed. In follow-up correspondence, they claimed there is already a comprehensive legal framework, which they were taking measures to improve. When we are faced with this kind of biometric data capture from young people, and given the increasing danger of damage to public trust, will the Government rethink their very complacent response? As it is, in the proposed EU AI law, live facial recognition technology is regarded as high risk and subject to specific limitations. Will the Government’s expected White Paper on AI governance at least take the same approach?
I return to the use of live facial recognition in schools, which is a highly sensitive area. We should not be using children as guinea pigs. I understand that an ICO report is under way. I hope that it will be completed as a matter of urgency, but we must already conclude that we urgently need to tighten our data protection laws to ensure that children under the age of 18 are much more comprehensively protected from facial recognition technology. I look forward to the Minister’s response.
My Lords, I congratulate my friend the noble Lord, Lord Clement-Jones—he is a friend—on calling this important debate. I salute his stamina in having participated in the previous debate and seamlessly moved on to lead this debate today. It is a mark of his global influence that, only yesterday, Facebook announced that it was withdrawing all of its facial recognition technology from its site. That technology has been around for some 10 years, and a billion people have consented to have Facebook use it on them, but the minute the noble Lord put down this debate, his colleague Nick Clegg clearly thought, “This is an issue I need to look into”. Who knows why Facebook really made this decision? One could take a noble view that it did so because it thinks that it is intrusive and unnecessary, or a cynical view: it is not making the company any money, so why put itself in harm’s way by continuing to use it? This is an important point.
I will talk more widely about the regulation of facial recognition technology, which is the issue that the noble Lord has put in front of us, with a particular focus on schools. It is a classic example of where technology has outpaced, as it were, the ability of regulators and policymakers to keep up to date. In many respects, facial recognition technology can have benign uses. I suspect that quite a lot of people in this Chamber open their phones using facial ID. We have our faces scanned when we move through the electronic gates at airports, when they are working. We can use facial recognition technology to organise our photos on our phones. More and more airlines are introducing facial recognition technology to allow you to check in seamlessly. So as a customer service to which you voluntarily opt in, it is a good thing.
However, as the noble Lord pointed out, there are of course the inevitable and justifiable concerns about the creation of a big brother society—one that is made worse by the deployment of this technology while it is still in its relative infancy. It is one thing to debate its use in the UK but quite another to see how it is being used in a country such as China, where I gather that it is now an offence to leave your home without your phone. It was a reason why so many Hong Kong demonstrators wore masks.
One of the big problems with deploying facial recognition technology, apart from it being a bit of a word sandwich, is that it is in its infancy. We know it can be subject to bias. Frankly, it works most accurately for white men and white women. Amazon’s facial recognition tool incorrectly identified 28 Members of Congress as people who had been arrested, according to a test conducted by the American Civil Liberties Union. According to a paper published by the Massachusetts Institute of Technology and Stanford University, the technology struggles to identify people of colour and women. Even when operated in an unbiased way, it still has a measurable rate of error.
This has led to a decline in public support, which has dropped from about 50% to just over 40% in recent polls conducted in the USA. As my noble friend quite rightly highlighted, the debate is well under way. It is happening not just in this country but in the US, where the House Committee on Oversight and Reform has hosted hearings, and in individual US cities; for example, San Francisco’s Board of Supervisors passed a measure to ban the use of this software by law enforcement.
It is not just policymakers. Quite rightly, some significant companies—Microsoft and Amazon, for example—have sought to get ahead of this debate and have called on policymakers to regulate facial recognition. IBM published an important paper on facial recognition technology, arguing that it should be used only where people are given notice that it is in use and are able to consent. It called for export controls on facial recognition technology where it might be used for law enforcement or military purposes, and said that law enforcement authorities should be mandated to disclose their use of facial recognition technology and publish regular transparency reports. As my noble friend points out, the Information Commissioner’s Office has, as I understand it, been closely monitoring facial recognition technology trials, particularly those carried out by the British police, and is reviewing the regulations surrounding it.
It is important that this debate highlights that there remains a gap in how facial recognition technology is regulated, and uncertainty over whether it falls within the regulations that apply to surveillance cameras and CCTV. We need clear direction from the Government as to which bodies are responsible for overseeing its use—whether it is the ICO, because it is a data protection issue, or the education authorities, treating it as an education issue. It is also important that clear guidance is put out, so that people wanting to use facial recognition technology—as I say, there are many benign and quite convenient use cases for it—are aware of the basic principles they should adopt when they deploy it.
My Lords, although I chair the Equality and Human Rights Commission, I emphasise that I am speaking in a personal capacity today. Not only that—I am speaking as a new entrant to this area, so I am particularly grateful to the noble Lord, Lord Clement-Jones, for securing this debate and spelling out the risks so clearly to a lay person such as me.
The one point where I will interject the EHRC into this discussion is to tell the House that in our new strategic plan, which commences in 2022 and runs until 2025, we have decided that one of our workstreams should focus on AI and associated technologies. We took this decision earlier this year, for several reasons. The regulatory space is very fragmented and inadequate, in our view. While developments in technology are transforming people’s lives for the better, the impacts are not yet well understood and, where they are, we are starting to see the harmful impact that some technologies have on individuals’ equality and human rights.
As the regulator of the public sector equality duty, as well as human rights law, the EHRC is taking an active interest in the discriminatory and potentially biased outcomes that some of these technologies have for the legal protections afforded to people, particularly on the basis of race and sex. We are seeing increasing numbers of cases involving race and technology, where it is alleged that facial recognition technology has failed—not least in the Uber cases supported by the EHRC, in which two drivers are taking the company to court on the basis that they lost their jobs because the technology failed to recognise them as a form of ID when they were signing on to work. We also know that the technology is even less accurate when being female is combined with having darker skin, so the potential for inaccuracy compounds. The danger of discrimination against these groups is very much on our radar.
On today’s topic, I share many of the concerns already voiced. I therefore join others in welcoming the belated climbdown from Facebook, which is deleting 1 billion facial recognition templates and shutting down the features that automatically recognise people in photos. Like the noble Lord, Lord Vaizey, I wondered what brought it to time this announcement so carefully in the light of the noble Lord, Lord Clement-Jones, securing this debate. I fear it was Mammon rather than good intention that took it to this point.
Of course, the fact that Facebook is doing this is not sufficient. It will keep to itself the power to use the technology when it sees fit—verifying identities or unlocking hacked phones, it tells us. Troublingly, according to the Financial Times, the algorithm behind the technology, DeepFace, which has been trained using the data of 1 billion scans, will remain extant, to be deployed elsewhere for future products, most likely in the metaverse. In my mind, this is very similar to Covid and the whack-a-mole strategy: Covid kept popping up in different variants at different times and places. Watch this space with DeepFace.
I note too the broader question as to why we have arrived at a situation in which it is left to private companies to decide when their technology is too harmful, or perceived to be, and autonomously decide to limit its use. Where in the regulatory space will it be decided that DeepFace’s algorithm can and should use the data still held?
On the exploitation of children, we have suspected for years that the social media firms do not have the safety of children uppermost in their minds, and this has been palpably brought home in Frances Haugen’s testimony in the past few weeks. What is worrying in the decision in Scotland to allow the use of FRT in nine schools is that it was to be deployed merely as a post-Covid efficiency measure. I do not think I am alone in this House in thinking that we will spend years undoing moves introduced during Covid that are allowed to remain on the statute book until we find that they are being used in a wholly disproportionate manner in terms of equality and human rights. In plain English, schools would have been better advised to improve the take-up of the vaccine among their children as a post-Covid measure if they really wanted children to mingle more safely while waiting for meals. I welcome the Information Commissioner’s intervention in this matter. There appear to be different approaches to solving the problem that may well be more proportionate than holding the biometric data of children who will almost certainly not be aware of the implications of their consent for privacy at this point in time.
I will end with a few words on the broader importance of being vigilant to emerging technologies. For the very first time, we are in a position in which decisions that affect all aspects of our lives are being taken in the absence of an accountable and identifiable human in the frame. Our legal systems around the world still rest on the assumption that we can identify a decision-maker and hold them accountable. They are not designed to hold machines accountable, especially where the originator of the learning—so to speak—is well removed from its usage. We are increasingly entering a world in which finding the human behind the decision is impossible for ordinary people seeking redress.
I end by asking the Minister whether she agrees that we need to strengthen existing protections for this AI-driven world: protections that offer clear legal remedies to people wronged, that go beyond data privacy and that allow us to know, as a matter of right, who holds what data on us, how it is being used and, importantly, how much is being transferred, at what profit, to others without our knowledge. Will the Government put in place legal protections that make it clear when an algorithm is being used to take decisions about us and what data lies behind those decisions? Most importantly, will senior managers be made accountable for flawed decisions by their systems and organisations, with clear remedies available for those on the losing end of those decisions?
I fear that the Government will respond with platitudes about their new determination to regulate in this space. I think we are past the point of determination and now need to find evidence of a readiness to confront this challenge.
My Lords, I thank my noble friend Lord Clement-Jones for instigating this very important and in fact fundamental debate about the use of biometric technology in schools. I also thank Pippa King from Biometrics in Schools, Jen Persson of Defend Digital Me, and Dr Stephanie Hare, for discussing with me some of the fundamental issues.
As a society, we are putting the cart before the horse if we start from the automatic assumption that this technology will be deployed in schools and debate only how. On that footing, the marketing departments of these companies are leading the debate, not the legislators. To put it in its simplest and most understandable way, we are having this debate to ask whether it is acceptable for us as a society to use a child’s face as a proxy purse or wallet to pay for a bag of chips or a slice of pizza in a state school, to solve a problem that does not exist, namely reducing queuing times by five seconds. This debate is not about technology; it is about a child’s bodily autonomy.
This is a fundamental debate about where we, as a society, draw the line in the use of technology—not about what we do once it is deployed but about what its limits should be before we start talking about how it is regulated. Where do we draw the line? This cannot be left to individual schools or councils. It is for this Parliament to legislate and to decide where we draw that line. As a nation, we need to determine where the limits of its use lie and where it should not be deployed, and then to regulate in areas where we feel that its use is unacceptable.
If we leave it to individual schools, the unintended consequences and problems that arise will be not just technical but deeply ethical and societal. There must be a balanced debate within this Parliament, and legislation must be brought forward. We saw the unintended consequences in the police’s use of live facial recognition, when the marketing teams and the technology got ahead of the legislation. We then talk about the lack of regulation rather than first asking where the technology is acceptable and where it is not; as the technology leads, people’s rights are trampled on and we are left playing catch-up.
The Department for Education has no idea what the current situation is. A freedom of information request made as part of the campaigning work of Pippa King of Biometrics in Schools highlights this. On 28 July, the DfE replied to the request:
“The DfE does not hold any information on standards or specifications of any hardware or software in biometric technology used in UK schools ... The DfE does not hold any information about suppliers that provide biometric technology to schools ... The DfE does not hold any information about the types of biometrics that are used in schools, i.e. fingerprints, facial recognition, palm, vein or iris scanning.”
What is the point of giving out guidance if the department has no idea what is going on in schools? The guidance is not worth the paper it is written on if the DfE is not policing what is going on.
Current advice to schools, issued by the Department for Education on the use of biometric technology, is out of date. As my noble friend Lord Clement-Jones said, it dates from March 2018. It still cites the Data Protection Act 1998, not the GDPR or the Data Protection Act 2018, and its contents focus on the Protection of Freedoms Act 2012 and the processing of fingerprints. It says absolutely nothing about facial recognition technology.
Can I ask the Minister, whom I admire for stepping in at the last moment, why the 2018 guidance is out of date? Why has it not been updated, and why is there no guidance whatever on facial recognition in schools? On such an important issue, why does the Department for Education not have some means of monitoring what is happening? Where do the Government draw the line on what is an acceptable use of this technology in schools and on young people below the age of 18?
It does not have to be like this. There are places around the world which have legislated. In 2014, Florida drew the line. It has a law saying that it is illegal in any school to:
“Collect, obtain, or retain information on the political affiliation, voting history, religious affiliation, or biometric information of a student, a parent or sibling of the student. For purposes of this subsection, the term ‘biometric information’ means information collected from the electronic measurement or evaluation of any physical or behavioral characteristics that are attributable to a single person, including fingerprint characteristics, hand characteristics, eye characteristics, vocal characteristics, and any other physical characteristics used for the purpose of electronically identifying that person with a high degree of certainty. Examples of biometric information include, but are not limited to, a fingerprint or hand scan, a retina or iris scan, a voice print, or a facial geometry scan.”
The educational achievement of children in Florida has not been hampered by that law, and the schools there continue to function, so it does not have to be like this. We can step back from allowing technology to lead the debate. We can step back from normalising children to the use of their bodies to access school services, and we can move forward by asking where we, as a country, draw the line, and bring forward legislation to show that there is a line. I suggest that the line should be drawn at the use of biometric technology on young people in schools.
My Lords, I thank my noble friend Lord Clement-Jones for securing this important debate on a topic that has shocked the public and caused widespread concern and alarm. I also declare my interest as chair of Big Brother Watch.
It is hard to know where to start on the use of facial recognition technology to administer something as mundane as payment for school meals. Deploying airport-style security methods to ensure that a hungry child is paying for their lunch is such obvious overkill that it would be funny—if the implications were not so serious. As Fraser Sampson, the Biometrics and Surveillance Camera Commissioner for England and Wales, said, just because schools can use the technology does not mean that they should. There are plenty of less intrusive and less risky ways to do the same task that are already in use in many schools.
Introducing facial recognition technology brings schools into the realm of data protection law, under which any processing must be lawful, transparent and fair. This means that a school would need to consider, in a structured analysis, whether the use of such technology is a proportionate measure to achieve the aims it seeks to achieve, or whether the interference with the child’s rights is of a level that renders the use of the technology unacceptable. I can only assume that, in the cases of the schools that have adopted this technology, this analysis was not done, or was not done properly, because the answer is so obviously that it is not proportionate.
That is particularly the case when we remember that the GDPR stresses that children merit special protection when it comes to their data. By law, children do have the right to refuse to participate in the use of intrusive technologies, and their wishes override those of their parents. In that case, the school must put in place reasonable alternatives which would presumably negate the claimed efficiency benefits of the new system.
I should also point out that the facial recognition systems being installed in schools reportedly cost £12,000 and then £3,000 a year. Would that money not be better spent on free school meals in the holidays, which the Government seem to have so much trouble funding?
I also have a wider concern. The use of biometric systems to police something as trivial as payment for school meals is training our children to accept that their private data is not theirs to be kept private and protected. As Silkie Carlo, director of Big Brother Watch, says:
“We are supposed to live in a democracy, not a security state. This is highly sensitive, personal data that children should be taught to protect, not to give away on a whim … there are some red flags here for us.”
The data protection principles that my noble friend Lord Clement-Jones has spoken of—consent, proportionality and safeguards around data storage and sharing—all derive from the GDPR, which is broadly incorporated into UK law through the Data Protection Act 2018. Now that we have left the EU, the Government are seeking to overhaul our data protection framework and water down citizens’ rights, encouraging institutions and businesses to use AI tools such as facial recognition and personal data such as facial images, with substandard protections compared with those of our neighbours. They even want to do away with the Biometrics and Surveillance Camera Commissioner, who oversees the uses of this technology. So my first question to the Minister is: will it become easier or harder for schools and data-gathering companies to take children’s sensitive biometric data as a result of the Government’s forthcoming attack on the UK GDPR?
My noble friend Lord Clement-Jones also referenced the police’s use of live facial recognition, which has been going on for five years now with Home Office funding and the Mayor of London’s blessing, despite there being no explicit legal basis and no parliamentary scrutiny. In addition, there has been a judgment in the challenge brought by the Liberal Democrat councillor Dr Ed Bridges, finding that South Wales Police’s use of live facial recognition had been unlawful because appropriate safeguards were not in place. Another factor was the well-documented problems with the technology’s race and sex bias, which have not been appropriately explored and mitigated.
Here is another area where the Government’s reckless attitude to new technologies, rights and liberties has impacted on the rights of children. Civil liberties group Big Brother Watch, which I chair, observed a Metropolitan Police trial of live facial recognition that resulted in an innocent 14-year-old black schoolboy, walking home in his school uniform, being accosted by four plain-clothes police officers. He was pulled into a side street, held up against a wall and asked for his ID, fingerprints and phone. Of course it was another case of mistaken identity, as is the case in 93% of all facial recognition so-called matches generated by the Metropolitan Police. This unforgivable incident could easily traumatise a child.
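The arithmetic behind a figure like that is worth spelling out, because it follows from base rates rather than from any single faulty camera. Here is a minimal sketch; every input is an illustrative assumption, not the Metropolitan Police’s actual operating data.

```python
# Illustrative base-rate calculation: all inputs are assumptions,
# not official Metropolitan Police figures.
faces_scanned = 100_000       # passers-by scanned across deployments
prevalence = 1 / 10_000       # fraction of passers-by actually on a watchlist
true_alert_rate = 0.70        # chance a watchlisted face triggers an alert
false_alert_rate = 1 / 1_000  # chance any other face triggers an alert

true_matches = faces_scanned * prevalence * true_alert_rate           # 7
false_matches = faces_scanned * (1 - prevalence) * false_alert_rate   # ~100

share_mistaken = false_matches / (true_matches + false_matches)
print(f"Share of 'matches' that are mistaken identity: {share_mistaken:.0%}")
# Prints ~93%: when genuine targets are rare, most alerts are wrong,
# however small the per-face error rate sounds.
```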
This dangerously authoritarian technology diminishes trust in the police and other public authorities at a time when it is already very low, and it makes Britain less of a free country to live in. So my second and final question to the Minister is: will the Government bring forward legislation to impose an urgent moratorium on public authorities’ use of live facial recognition technology in order to give Parliament an opportunity to properly assess it before any further harm is done?
My Lords, we should all be grateful to the noble Lord, Lord Clement-Jones, for introducing this important subject for debate. It is certainly timely but, although I share many of the concerns expressed by the noble Lord and others in this debate, I do not quite subscribe to his fears that this could be a step towards a surveillance state.
With schools beginning to investigate the possible use of biometric recognition technology, it is important that the Government make their position clear. This is not an area with which the Government are unfamiliar; the Department for Education issued 12 pages of advice as long ago as March 2018. Prior to this debate I was not aware of that, so it has not been given much publicity. I note that the document is termed “advice” rather than “guidance”. I do not know what the difference is but it seems to be a downgrade from guidance, and I think it is appropriate to ask the Minister to explain what she understands the difference between the two to be, if indeed there is one.
Publicity has been attracted to the introduction by a small number of schools in Ayrshire of facial recognition technology. Last week, it was announced that they had paused their use of it following concerns expressed by the Information Commissioner’s Office. At the same time, I understand that a school in Greater Manchester has decided to abandon its planned rollout of a facial recognition system. It is not difficult to understand the rationale advanced by the company that supplied and installed systems in schools in Ayrshire: that facial recognition technology can speed up the delivery of school lunches. However, it might have been thought that simply staggering lunch-breaks could have been equally effective, if that was the main aim.
The National Education Union says that no concerns have been brought to its notice thus far, but that its view of biometric facial recognition is the same as its view of fingerprint technology in schools, which centres primarily on consultation and consent. However, I suspect that one difference is that children are now familiar with using fingerprint technology to access their smartphones, so it is not perceived as being intrusive in the way that facial recognition often is.
There is also the issue of the security of information once it has been taken and is then stored. As noble Lords may have seen in recent news reports, we in the Labour Party have received a painful reminder in the past week that sensitive information can be illegally accessed by malign forces even when it is assumed to be held securely. So wider worries in that regard over biometric data need to be addressed.
There is general acceptance of the growing and practical uses to which biometric technology may be put, but further concerns exist over what that technology actually involves. It is important to differentiate between facial recognition technology, which appears to be what was trialled in the schools in Scotland, and live facial recognition, to which the noble Lord, Lord Clement-Jones, referred, which is altogether more sinister. Whereas facial recognition technology involves a single process of which the individual concerned is aware and to which they have consented, or consent has been given on the person’s behalf, live facial recognition is typically directed surreptitiously towards groups of people to identify individuals indiscriminately. We understand that the latter sort of system has been used against protesters in Hong Kong, and the possibility of its widespread deployment should be a matter of grave concern.
Fortunately, that is not what we are talking about today. It is a matter for each school governing body to determine whether facial recognition technology should be used in their school, although I suspect that recent experiences mean we will not see many more schools seeking to push that boat out, at least until the Information Commissioner’s Office has issued further pronouncements. However, the DfE’s 2018 advice quite rightly notes that:
“There are no circumstances in which a school or college can lawfully process a pupil’s biometric data without having notified each parent of a child and received the necessary consent.”
For the most part, the advice appears reassuring to pupils and parents, but one issue that may not meet the latter criterion concerns the section headed, “The pupil’s right to refuse”. This makes the legal position clear, stating that:
“A pupil’s objection or refusal overrides any parental consent to the processing.”
This is an issue that has arisen recently in another context, regarding the offer of Covid vaccinations to children aged 12 and above. However, the difference between Covid vaccination and the use of biometric data is that the current minimum age for the former is 12, whereas there appears to be no minimum age for the latter in the advice issued in 2018. This suggests, at least in theory, that biometric data could be taken from children as young as four, in their reception year. I do not believe for one moment that that would happen, but there is no lower limit. I hope the Minister is able to clarify the position, because I am sure I would not be alone in my concern that there may be no age at which a child would be deemed to be too young. That age should not be lower than 12, the age mentioned earlier in relation to Covid vaccinations.
The guidance also has a section, under the heading “Notification and parental consent”, concerning looked-after children. It would be helpful to have clarity from the Minister on the position of a child’s carer, whether or not it is a local authority. Would a birth parent have the right to object while their child had looked-after status?
Finally, the advice document states that it will be kept under review and updated as necessary. I feel sure that the Minister will agree that the speed at which artificial intelligence advances requires such an update, approaching four years after the advice was issued.
It is unrealistic to believe that biometric recognition technology can be delayed for long, but it must surely be subject to assurances that individual privacy will not be undermined and that consent in all circumstances must be received before it is introduced. I suspect that this is just one stage on a journey towards artificial intelligence assuming functions that have hitherto relied on human intelligence and consequent actions. That is a journey that in many ways is rather scary to contemplate, but it must be subject to the checks and balances that I have referred to. We know that the Government are planning a White Paper on AI governance, and I hope that the Minister will be able to say when it is likely to appear, as it will be necessary to begin to allay the fears that noble Lords have rightly outlined in this debate.
My Lords, I am grateful to the noble Lord, Lord Clement-Jones, for bringing to the House’s attention the important matter of facial and other biometric recognition technologies in schools. He says that I am versatile, but I think that he is versatile. I have been a Whip in many departments and I always seem to be answering his questions, whatever department I am in, so I think we are both the same in that regard. I also thank noble Lords who have given me notice of what they were going to bring up today; I cannot tell you how helpful that is.
There are differing views regarding the use of this technology in schools and indeed across all aspects of society. The Government recognise the need for care and for checks and balances in a system where personal and sensitive information is used to enable pupils and, indeed, any citizen to undertake everyday activities, such as children paying for lunch or accessing the library. Therefore, the Government recognise—as the noble Lord, Lord Clement-Jones, mentioned—that this is a complex and challenging policy area.
My noble friend Lord Vaizey and the noble Baroness, Lady Falkner of Margravine, said that live facial recognition suffers quite a lot of inaccuracy. Certainly, the accuracy of any technique will depend on the technology and how it is used. Based on LFR trials, at worst there is a one in 1,000 chance of a false alert, and around a 70% chance of a true alert, if someone on a watchlist passes a camera. However, there can still be false alerts, which is why a human being always takes the final decision on whether to engage with an individual matched via the technology.
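To put those quoted rates in context, here is a short sketch of how they scale with footfall; the daily figures are assumptions chosen purely for illustration.

```python
# The two rates are as quoted above; the footfall figures are assumptions.
false_alert_rate = 1 / 1_000  # worst-case chance of a false alert per face
true_alert_rate = 0.70        # chance of a true alert for a watchlisted face

daily_footfall = 20_000       # assumed faces passing a camera in a day
watchlisted_passes = 5        # assumed watchlisted individuals among them

expected_false = (daily_footfall - watchlisted_passes) * false_alert_rate
expected_true = watchlisted_passes * true_alert_rate
print(f"Expected false alerts per day: {expected_false:.0f}")  # ~20
print(f"Expected true alerts per day:  {expected_true:.1f}")   # 3.5
```

Even at the quoted worst-case rate, false alerts can comfortably outnumber true ones over a busy day, which is precisely why the final decision to engage rests with a human operator.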
The Department for Education sets out in its non-statutory guidance, titled Protection of Biometric Information of Children in Schools and Colleges, information for schools and colleges if they wish to use personal information about pupils for the purposes of using automated biometric recognition systems. This guidance covers legal duties under the Protection of Freedoms Act 2012 in relation to the processing of biometric information in schools. It also covers the data protection regime. This debate has highlighted that the Department for Education’s guidance needs to be updated, and will be updated imminently, to refer to the most current UK data protection legislation, which is now the UK general data protection regulation or UK GDPR, and the Data Protection Act 2018.
The decision to use biometric technology rests entirely with individual schools, which are legally responsible, under the GDPR, the Protection of Freedoms Act and the Data Protection Act, for any data they gather and use. As such, the department believes that, if a school wishes to introduce biometric technology, it is rightly a decision for the individual school to make, based on its own operational needs, in consultation with its staff, pupils, parents and carers and, importantly, having regard, among other things, to the relevant data protection law. We do not intend to change this fundamental principle of school autonomy on this matter.
Schools wishing to introduce biometric technology for pupils to access services, such as the purchase of school meals, must follow their legal responsibilities. This includes the recognition that processing biometric data for uniquely identifying a natural person is classed as a special category of personal data. This means that any school—the data controller—wishing to adopt biometric technology must ensure that its data protection impact assessment demonstrates that the processing of any personal data is lawful and meets the conditions for special categories. Under Article 9 of the UK GDPR, together with Schedule 1 to the Data Protection Act 2018, the rules around sensitive processing still apply when facial images are used as biometrics—that is, when they are used in an identification process, such as automated facial recognition.
The departmental guidelines highlight the requirement to obtain the appropriate consent from parents of children under 18, and set out the individual right of a parent and/or child to refuse consent to using biometric technology. Except in certain limited circumstances, a school or college can lawfully process a pupil’s biometric data only if it has notified each parent of a child of the intention to do so and received the necessary consent. There are exceptions to when a parent needs to consent and, in those cases, a person who cares for the child, or another body such as a local authority, needs to provide consent instead. The child themselves can object to the use of their biometric information and, if that happens, the information must not be processed even if a parent has consented.
The noble Lord, Lord Watson, asked about the age limit of children giving consent. Under Sections 26 and 27 of the Protection of Freedoms Act, there is no reference to a lower age limit in terms of a child’s right to refuse to participate in sharing their biometric data. Under the legislation, we are unable to remove or limit the right of any child to refuse consent to sharing their biometric data. On the question of who can give consent if it is not the parent, and what legal autonomy they have, I say that when a child is looked after and is subject to a care order in favour of the local authority, or the local authority provides accommodation for the child within the definition of Section 22 of the Children Act 1989, a school would not be required to notify or seek consent from the parents. I hope that covers the noble Lord’s questions.
Schools must find a reasonable alternative means of access to services for any pupils who opt out of using an automated biometric recognition system. This is an especially important point: pupils should not be disadvantaged or receive access to fewer or different services because the school introduces biometrics. Several noble Lords asked whether it is a waste of time for schools to run two different systems. We have seen this in several schools, which have stopped using facial recognition: they find that they must carry out a consent-based risk analysis, and running two systems side by side can be difficult, because pupils who do not want to use facial recognition must still use the old system. As noble Lords pointed out, why, in that case, do they not just use the old system in the first place? There is still a long way to go here.
Schools—the data controller—must make sure that any biometric data is stored securely, is not kept longer than needed, is used only for the purpose for which it is obtained, and is not unlawfully disclosed to a third party. Any failure in meeting these requirements could result in referral to the Information Commissioner’s Office, which will take steps to understand any data breach, work with schools to address any failures and agree measures to help them to meet their legal requirements. In serious cases, enforcement notices may include an absolute ban on the processing of personal data. The Department for Education will continue to remind schools of their legal position in terms of the law and their duties within it, and provide an update to the published advice.
In deciding to implement this technology, each school should monitor and review the biometric technology’s effectiveness against its original purpose. Clearly, it is right that this must be a matter for individual schools. This will ensure that the technology continues to be used for the reason it was intended and that it meets the legal duties, requirements and responsibilities under the Data Protection Act, the UK GDPR and the Protection of Freedoms Act. As this is a decision for each school, the department has no reason to collect or store data related to a school’s use of biometric technology. There is no intention for this approach to change. One of the primary drivers for the department not intervening in this space is the broader legal framework and the checks and balances already in place.
The Information Commissioner’s Office is now one of the most important regulators in the UK, as noble Lords are aware, responsible for supervising and enforcing the application of data protection legislation across almost every organisation in the country. With the adoption of new systems comes the responsibility to make sure that data protection obligations are fulfilled and customers’ privacy rights addressed alongside any organisational benefit.
The Information Commissioner’s Office also recently provided, in June, its opinion on the use of live facial recognition technology in public places, with recommendations and next steps for data controllers. The department’s guidance for schools, when updated, will reflect the latest advice from the Information Commissioner’s Office on this important matter. The department is confident that schools have the support needed from the Information Commissioner’s Office to ensure they meet data protection standards, especially as schools adopt biometric technology for pupils to access services.
In line with any changes arising as a consequence of the ongoing Department for Digital, Culture, Media and Sport consultation on data protection reform, which ends on 19 November, the Department for Education will update its current non-statutory guidance for schools, reflecting any changes to the legal frameworks. The reform seeks to create a new, ambitious, pro-growth and innovation-friendly data protection regime that underpins the trustworthy use of data for a better UK data rights regime—sorry, that sounds a bit like an advertisement.
I am sure that legislative changes will have been considered, but at present there is no specific intention to introduce general legislation on the use of biometric data in schools or in society in general. However, as all noble Lords have shown today, this is such a fast-moving area that I cannot believe legislation will not be discussed at great length. All the concerns brought up today are very live and important and need a great deal of thought. I will take this back to the Department for Education, but it is the Department for Digital, Culture, Media and Sport that really needs to get involved in this. I think everyone is almost wondering what is coming next.
I hope I have given some answers to noble Lords’ concerns and thank them for all their helpful contributions to this debate. I look forward to working with noble Lords towards the Government’s aim to deliver data reforms in the future that will be forward-thinking and innovative and seek to maintain public trust and confidence in the responsible use of all data, including biometrics.