All 2 Debates between Chi Onwurah and Darren Jones

Oral Answers to Questions

Debate between Chi Onwurah and Darren Jones
Tuesday 3rd September 2024

Commons Chamber
Darren Jones

The issue of so-called hope value was referenced in the Labour party’s manifesto, and the Government will set out further detail in due course.

Chi Onwurah (Newcastle upon Tyne Central and West) (Lab)

3. What fiscal steps she is taking with Cabinet colleagues to encourage inward investment.

Facial Recognition and the Biometrics Strategy

Debate between Chi Onwurah and Darren Jones
Wednesday 1st May 2019

Westminster Hall

Westminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.

Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.

This information is provided by Parallel Parliament and does not comprise part of the official record.

Darren Jones (Bristol North West) (Lab)

I beg to move,

That this House has considered facial recognition and the biometrics strategy.

It is a pleasure to serve under your chairmanship, Sir Roger. First, I must declare my interests, which are not directly in the subject but in the privacy and data protection space in which I practise as a lawyer, as set out in the Register of Members’ Financial Interests. I chair various technology all-party parliamentary groups and Labour Digital. I am also a member of the Science and Technology Committee, which has an ongoing inquiry into the subject. We have taken evidence from Professor Paul Wiles, the Biometrics Commissioner, and Baroness Williams of Trafford, the Minister in the other place. Some hon. Members have sent their apologies, which I entirely understand, because we are competing with the climate change debate in the main Chamber.

Why did the subject first come to my attention? As a consumer, I have become increasingly used to using facial recognition technology, whether I have proactively agreed to it or not. I often forget my passwords these days, because I use my face to pay for things and open my iPad and phone, although as I was saying to my hon. Friend the Member for Sheffield, Heeley (Louise Haigh), that can prove tricky when I am trying to pay for things at a distance. For many of us, facial recognition technology provides consumer services on Facebook and Google by auto-tagging friends and family members and allowing us to search our images. There is an entire debate to be had about consent, transparency and privacy in the use of such technologies in the private sector, but my focus today is on the role of the state, the police and the security services in the use of facial recognition technology.

Facial recognition technology is beginning to be used more widely. It is well known to those who take an interest in it that the South Wales police has used it at sporting events; that the Metropolitan police has trialled auto-facial recognition technology on many occasions, including at events such as the Notting Hill carnival and Remembrance Sunday, and at transport hubs such as Stratford; and that Leicestershire police has used it at the Download music festival. I am concerned that it is also being used at public protests, although perhaps I understand why; I will come on to that later in relation to our freedom of association.

Chi Onwurah (Newcastle upon Tyne Central) (Lab)

I congratulate my hon. Friend on securing this debate on a key subject. He has spoken light-heartedly about the competition with the climate change debate. Does he agree that in some ways, as with climate change, although only a small number of issues are currently associated with this topic, the range of impacts that facial recognition technology will have on our society and economy, on the way we work and do business, and on our trust relationships will be huge and will grow over time?

--- Later in debate ---
Chi Onwurah

I thank my hon. Friend for giving way to me again. He has made some very important points about the way in which this technology is already being used by Facebook and others, but is it not the case that, however advanced the technology is, it has been found to be biased because of the training data used, which means that those from minorities or specific groups in particular are not recognised adequately? Does he agree that it is all the more important that there is investment as well as transparency in the police database, so that we can ensure that groups who are already marginalised in many ways, particularly with regard to police services, are not once again discriminated against?

Darren Jones

Unsurprisingly, I agree entirely. This is part of a much broader conversation about designing technology with ethics at the very start, not only in facial recognition but in algorithmic decision making and a host of different areas where we have seen that human biases have been hardwired into automated decision processes that are delivered through technological solutions.

The Government have a really important role to play here, not just in setting the regulatory framework and building on, and really giving strength and resource to, the Centre for Data Ethics and Innovation to set the national and international tone, but through their procurement of services. They must say, “We have got to get this technology right. We are going to buy these systems, but we really must see this ethics by design right from the very beginning, dealing with biases in a way that allows us to avoid biased solutions.” That would stimulate the market to ensure that it delivered on that basis.

On the legal basis for biometrics, older forms of biometrics such as DNA and fingerprints have a legal framework around them; they have guidance and rules about how they can be used, stored and processed. There is no specific law relating to facial recognition and no specific policy from the Home Office on it. The police forces that are trialling these systems say that they are using existing legislation to give them the legal basis on which to perform those trials, but the fact of the matter is that we only need to look at the dates of that legislation to see that those laws were put in place way before the technology came into existence or before it reached the maturity that we are seeing today.

There was some debate during the passage of the Data Protection Act 2018, when I, my hon. Friend the Member for Sheffield, Heeley and others served on the Committee scrutinising that Bill, but there was no specific discussion during that process or any specific regulation arising from it about facial recognition technology. If police are relying on the Police and Criminal Evidence Act 1984—perhaps there is an irony in the date of that legislation—the basis and the understanding of the technology did not exist at that time, so it is not in that legislation. Even the Protection of Freedoms Act 2012 is too old. The definition of biometrics in that legislation cannot encapsulate a proper understanding of the use, sensitivity and application of automatic facial recognition.

I am not alone in saying this—indeed, it seems to be the view of everybody but the Government. The Information Commissioner has opened investigations; the independent biometrics and forensics ethics group for facial recognition, which advises the Home Office, agrees with me; the London Policing Ethics Panel agrees with me; the independent Biometrics Commissioner agrees with me; and, perhaps unsurprisingly, civil liberties groups such as Liberty and Big Brother Watch not only agree with me but are involved in legal action against various police forces to challenge the legal basis on which these biometrics trials are being conducted. When he responds, will the Minister say that the Government now agree with everybody else, or that they continue to disagree with everybody else and think that this situation is okay?

I will now address the second part of this debate, which is the biometrics strategy. I focused on facial recognition because it is a particularly timely and sensitive component of a broader biometrics strategy. All of us who use technology in our daily lives know that biometric markers and data can be used to identify our location, identity and communications. That means that the Government and, indeed, the private sector can access data and learn things about us, and that area of technology is growing. People are rightly concerned about ensuring that the right checks and balances are in place. It is one thing for an individual to agree to facial recognition technology in order to unlock their tablet or phone, having read, I hope, about what happens to their data. It is another thing, however, for them not to be given the opportunity to give their consent, or not to receive a service and therefore not know about it, when the state is using the same types of technology.

The biometrics strategy needs to get into the detail. It needs to set out not only what is happening now but what is envisaged will happen in the future and what the Government plan to do about it, in order to protect civil liberties and inform citizens about how the data is being used. Clearly, they would not be informed individually—there is no point in telling a terrorist planning an incident that there will be a camera—but the right balance can be achieved.

Again, I do not understand why the Government are so slow in responding to these fundamental issues. It is so long since the 2012 High Court ruling on the retention of custody images, and we had to wait five years for the biometrics strategy. Imagine how much the biometrics sector in this country changed during those five years. Perhaps the Government were trying to keep up with the pace of change in the technology space, but the strategy was long delayed and long awaited.

Given my tone, Sir Roger, you will not be surprised to hear that everyone was very disappointed with the biometrics strategy, because it merely gave a kind of literature review of current uses of biometric data. There was a little bit about the plans for a new platform, which the Home Office is building, regarding how different people access biometric data. It said nothing at all, however, about the future use, collection and storage of biometric data, or about data protection. It said nothing about the Government’s own use and collection of data; the need for enforceable guidelines to enable devolved decision making by, for instance, police forces across the country; how different Departments might be able to use different forms of biometric data across Government, which, evidently, is very easy to deliver with today’s technology; or how the data would be stored securely.

People are concerned about cyber-security and breaches of their personal data, so what steps will the Government take in this developing space? Where will the data be stored? In advance of this debate, I received representations arguing that we should not send it to companies overseas and that it should be stored in the UK. One would think that the biometrics strategy addressed those issues, but it does not. Is the beta version of the biometrics strategy due soon, or does the Minister think that the Government have provided a sufficient response on this important field?

I do not want to keep saying that everybody agrees with me, because that would be a little uncomfortable, but there is no denying that the Biometrics Commissioner, the Surveillance Camera Commissioner and the Information Commissioner’s Office have all said exactly the same thing—this biometrics strategy is not fit for purpose and needs to be done again. The Government need to be clearer and more transparent about their endeavours and make that clear to the public, not least because these areas of technology move at pace. I understand entirely why police forces, civil servants or others want to be able to take the opportunities to deliver their services more efficiently, more effectively and with more impact—we support that—but the right checks and balances must be in place.

I will touch on our fundamental rights and freedoms, because that debate does not get enough air time in the technology space. Our freedoms are increasingly being challenged, whether the issue is cyber-defence or how we regulate the online world, and also in this space. Fundamental freedoms—freedoms that we hold, or purport to hold, dear—are encapsulated in the European convention on human rights and the Human Rights Act 1998. They go to the very nature of this technology, such as the right to a private life that can only be interfered with for a legitimate aim and only if that interference is done proportionately. Scanning a load of people going about their day-to-day life does not feel proportionate to me, and there is no accountability to make sure that it is being done legitimately. As my hon. Friend the Member for Newcastle upon Tyne Central (Chi Onwurah) said, if the selections that those technologies pick up are resulting in false matches or are discriminating, primarily against women and people from ethnic minority backgrounds, that also ought to be considered.

Those freedoms also include freedom of expression and of association. In public protests in recent weeks, people who dearly hold certain views have gone too far by moving away from their right to freedom of expression and to peaceful demonstration, towards criminal activity, intimidation or hostility. We should set the tone and say that that is not welcome or acceptable in our country, because having a right also means having a responsibility to use it wisely. Of course we want to protect those who want to demonstrate through peaceful public protests.

I am sure the public will say—this lies at the heart of my contribution—“Fine. Use some of this technology to keep us safe, but what is the right balance? Do we understand how it is being used? What are the accountability measures? What rules and guidance are being put down by the Government, on behalf of Parliament and the people, to make sure this is being done in a way that is not a slippery slope towards something we ought not to be doing?” We need a wider debate in public life about how we protect freedoms in this new digital age, and this issue is an example of that.

The House of Commons digital engagement programme is often a very good process for Westminster Hall debates, as it allows the public to be part of the conversation and to submit their comments. It would be remiss of me to not point out that some members of the public highlighted a certain irony in the fact that this debate was being promoted on Facebook, so I have shared their concerns, but that is still a medium through which the public like to engage in debate. Hundreds of thousands of people engaged across different platforms—way more than I was expecting—which shows the level of public interest in the use of these technologies.

As might be expected, there were two sides to the argument. The minority view on the platforms was, “I have nothing to hide. Please go out and keep us safe. Crack on, use it.” The other side said, “Actually, this is a slippery slope. I don’t know how this is used, and I’m worried about it. Why can’t I go about my day-to-day life without the police or the state surveilling me?”

I will share some of the comments. On the first side of the argument was Roy. I do not know where he is from. I wish his location had been given, because I could have said, “Roy from Sheffield”. He said:

“No objection. I’ve nothing to hide and don’t find it scary or objectionable for ‘the state’ to be able to track my movements. They already can if I’m in a car”—

I did not know that—

“and that doesn’t seem to be a problem. The added security of the police being able to track potential terrorists far outweighs any quibbles about reduced privacy.”

That is a perfectly legitimate view.

Karyn said:

“Having seen the numbers of crimes solved and even prevented by CCTV I have no objections. Today we have to be realistic, with phones listening in on conversations for marketing and plotting where we are, this is small price to pay for public safety and if you have done nothing there is nothing to fear.”

That is an interesting contribution on what is happening in the private and state sectors. We need to be much more advanced in both spheres.

That was a minority view, however. I do not have the percentage, but the bulk of comments came from people who are concerned. Chris Wylie, whom many of us will have read about—he was the Cambridge Analytica whistle-blower, so he clearly knows something about these issues—was firm:

“No. Normalising this kind of indiscriminate surveillance undermines the presumption of innocence.”

We should pause on that, because it is really important. Why should we be tracked and surveilled by the police on the assumption that we might be guilty of something? That does not feel right, just as it does not feel right that people have to prove their innocence to get their images taken off a police database. Chris went on to say:

“It should never be up to us as citizens to prove we are not criminals. Police should only interfere with our lives where they have a reasonable suspicion and just cause to do so.”

I share Chris’s views.

Andrea said that this was a slippery slope:

“The idea that some people have about privacy as an exclusive issue for the bad guys is completely wrong. Not only privacy prevents my acts from limiting my rights but also avoids an unjustified use of power by the Gov’t.”

Again, we should pause there. It is our job in Parliament to hold the Government to account, yet we have no strategy, legislation or rules to enable us to do so. That is a fundamental problem. She went on to say:

“Such a huge involvement of disturbing tech could lead to a 1984-like slippery slope, one which none of us wants to fall in, regardless of their legal background.”

Jenny said:

“I believe that this would suppress people’s ability to engage in public demonstrations and activities that challenge the government, which is hugely dangerous to democracy.”

A lot of people said that if they thought the state was scanning their data and putting it on a database, they might not associate with or take part in public demonstrations. If that were to happen, it would represent a significant diminution of our democratic processes.

Lastly, Bob said:

“It makes it easier for a future, less liberal government to monitor the activity of dissident citizens. During the miners strike in the 1980s miners were stopped from travelling just on the suspicion they would attend rallies based on their home locations and where they were heading. How would this technology be applied in the future for, say, an extinction rebellion march?”

Regardless of our political disagreements across the House, none of us thinks that the state is overreaching in a way that many other countries would. However, given the lack of legislation, guidance and regulation to enable us to hold the Government to account, and with independent commissioners and regulators saying that this is not good enough, I agree with Bob. There is a huge risk in not putting in place a framework with the appropriate checks, balances and protections, not just because that is the right and important thing to do today, but because we need that framework for future Governments.

Chi Onwurah

My hon. Friend is being very generous with his time, and I congratulate him again on having raised this important topic. Does he agree, as I think he is suggesting, that the level of interest in this debate—demonstrated by the quotes he has read out—shows that technology such as facial recognition, as well as algorithms and data, needs to be publicly debated? We can make a choice as to how it is used, so that citizens are empowered. Technology should not be something that is done to people; they should have rights and controls as to how it is enacted.

Darren Jones

My hon. Friend is absolutely right. The debate is a broader one about technology. How do we engage the public with these issues? I am an evangelist for technological reform, although I will not go on about that topic for too long, because it is not linked to the title of the debate. In my view, the idea that we can increase our economy’s productivity, increase wages, transform people’s working lives and reform and make more efficient our public services without using technology does not make sense. As my hon. Friend says, however, we have to do that in the right way and bring the public with us.

On a cross-party basis, we share the belief that we need to take crime seriously, and to address the increasingly sophisticated methods that criminals and terrorists may employ when trying to commit crimes or terror in our country. However, we must get the balance right, and there is a lacuna of regulation in this space. There are no legal bases, there is no oversight, and as a consequence there are no protections. That is why the Government should act now.