Visa Processing Algorithms Debate

Department: Home Office

Wednesday 19th June 2019
Commons Chamber

Chi Onwurah (Newcastle upon Tyne Central) (Lab)

This is an important debate about technology, automation, the Home Office, immigration and people’s lives. I came to the House in 2010 and have since often raised issues to do with technology, and I have long felt that a better debate on immigration was needed, so the opportunity to spend two hours and 20 minutes debating this subject is a welcome surprise. However, I do not intend to detain the House for much longer than the half hour originally estimated, although I will be happy to give way if other Members wish to contribute.

I want to start by saying that I am happy to call myself a “tech evangelist”, having worked as an engineer in the tech sector for 20 years before coming into Parliament. Since then, I have worked to champion technology and how it can make all our lives better; I was the first MP to mention the internet of things in this place, for example. Over the years, I have also raised concerns about the impact of technology, especially with a Government who refuse to put in place a regulatory framework that reflects its potential for harm as well as good, and who, critically, refuse to accept that the impact of technology on society is a political choice.

Along with others, I have been highlighting the potential harms of algorithmic decision making, artificial intelligence and data exploitation for years, yet the Government have done nothing. In fact, we now learn that they have done worse than nothing: they have taken advantage of the current regulatory chaos to implement algorithmic management in secret.

On 9 June, the Financial Times revealed that the Home Office was secretly using algorithms to process visa applications, which is making a bad situation worse. I say that because of my experience as a constituency MP in Newcastle with a significant level of immigration casework—I will talk more about that. I am also chair of the all-party parliamentary group on Africa. We are currently conducting an inquiry into UK visa refusals for African visitors to the UK. We have met the Minister—we are grateful for that—and our report will be published next month. Furthermore, I am chair of the all-party parliamentary group on diversity and inclusion in science, technology, engineering and maths; algorithmic bias is one important example of how the lack of diversity in STEM is bad for tech and society.

According to the Financial Times journalist Helen Warrell, the Home Office uses an algorithm to “stream” visa applicants according to their supposed level of risk—grading them red, amber or green. The Home Office says that that decision is then checked by a real-life human and does not impact the decision-making process, which is the most ridiculous justification for algorithmic decision making ever—that it does not make any decisions! Presumably it is just there to look good. We must not forget the inevitability of confirmation bias in human decision making, which was raised by the chief inspector of borders and immigration.

The Home Office refuses to give any details of the streaming process, how risk is determined or the algorithm itself. That lack of accountability would be deeply worrying in any Department, but in the Home Office it is entirely unacceptable, particularly when it comes to visa processing. The Home Office is broken. We know that it is unable to fulfil its basic visa-processing duties in a timely or consistent manner. If we add to that a powerful and unregulated new technology, Brexit and bias, we have a recipe for disaster.

I know that there are many able and hard-working civil servants in the Home Office, though fewer than there were. When I say that the Home Office is broken, it is not a criticism of them, but of the resources they are given to do their job. The all-party parliamentary group for Africa received detailed and, at times, excoriating evidence from a whole range of people and organisations—academics, artists, business owners, scientists and family members—who had been wrongly denied entry to the UK. I will give just a few examples.

LIFT, the world-famous London International Festival of Theatre, applied for visas for well-known artists from the Democratic Republic of the Congo for a performance exploring their experience of civil war. They were denied visas on the basis that UK dancers could perform those roles. We also heard from the Scotland Malawi Partnership, which highlighted a case where a high-profile musician invited to the UK from Malawi was given a visa rejection letter from UK Visas and Immigration that essentially stated, “We reject your visa because [insert reason here].”

Patrick Grady (Glasgow North) (SNP)

I thank the hon. Lady for giving way and wholeheartedly endorse everything she is saying. We have worked closely together: I chair the all-party parliamentary group on Malawi and assist her on the APPG for Africa. As she says, these examples are just the tip of the iceberg. She is right that we should not blame the individual decision makers in the Home Office; the fault lies with the policy, the lack of resourcing and, as I think she is getting to, the increasingly broad-brush use of automation. This is damaging the whole of the UK and everything the Government say about wanting to make Britain a great country to come to; that simply will not be the case if people cannot get through the door.

Chi Onwurah

I thank the hon. Gentleman for his intervention. Unsurprisingly, as we have worked together in the all-party parliamentary groups, I agree with everything he said. In fact, he anticipates some of the points that I will come on to make.

Our APPG also heard of ordained ministers and priests being denied visas either because they did not earn enough—as if they had taken a vow of poverty—or because the Church of England is not considered a reputable sponsor. We heard of a son unable to reach his father’s deathbed and grandparents unable to see their grandchildren.

Mr Jim Cunningham (Coventry South) (Lab)

I have seen similar cases, particularly when somebody wants to bring a member of their family over here. I will not go into great detail, but I had a case where an individual was dying of cancer, which meant that her husband would have to give up his job to look after their four kids. The problem was trying to get somebody from her home country to come here to look after her until she died. It took a long time for us to sort that out, but eventually they were allowed a visa to come here. Nine times out of 10 with visas, or even leave to stay, there are major problems with the Home Office. My hon. Friend is right; something has to happen. The Home Office is under-resourced and short of personnel. It might tell us that it can put an application through in a given time, but it does not happen that way. People often turn up at our surgeries, sometimes very distressed about the way these things are handled.

Chi Onwurah

I really thank my hon. Friend for that intervention, because he is of course absolutely right. He raises a heartbreaking case, but he also hints at the fact that, as a consequence, we as MPs are taking on ever larger immigration caseloads. That in itself puts more pressure on the Home Office, because we raise cases and ask for them to be reviewed. It takes longer to reach a decision—a final, just decision—and the people concerned have their lives disrupted, in some cases heartbreakingly so, for longer.

I want to mention the case of a United Kingdom mayor who was denied the presence of their sister at their inauguration, presumably because the mayor was not considered a credible sponsor. Lastly among these national cases, Oxfam has highlighted that, because of visa rejections, only one of the 25 individuals from Africa expected to attend a blog-writing training course at the recent London School of Economics Africa summit was able to do so. Non-governmental organisations are trying to support in-country skills development, but it is often very difficult to bring people working for Oxfam or other NGOs, particularly young people, to this country for training.

The Minister should know that her Department is notorious for a culture of disbelief, with an assumption that visitors are not genuine. I will give one example from my own constituency. Last year, the University of Nigeria Alumni Association UK branch chose to hold its annual meeting in Newcastle—by the way, it is a fantastic location to hold all such events—but a significant number were initially denied visas on the grounds that they might not return to Nigeria. These were all businessmen and women, academics or Government workers with family in Nigeria. After my intervention, their visas were approved, but that should not have been necessary.

Entry clearance officers are set independent targets of up to 60 case decisions each day, and our all-party group investigation found that this impacted on the quality and fairness of decision making. Home Office statistics from September 2018 show that African applicants are refused UK visas at twice the rate of those from any other part of the world. When visitors are denied entry arbitrarily, the UK’s relationship and standing with those countries is damaged, as has been mentioned, and we lose culturally and economically. International conferences and events, new businesses, trading opportunities and cultural collaborations are being lost to the UK because of the failings of the Home Office.

The last report on visa services from the independent chief inspector in 2014 found that over 40% of refusal notices were

“not balanced, and failed to show that consideration had been given to both positive and negative evidence.”

Last month, it was announced that the six-month target for deciding straightforward asylum cases is being abandoned. This was a target that, as the Home Office’s own statistics show, was repeatedly missed. In 2017, one in four asylum cases was not decided within six months, while immigration delays have doubled over the past year, despite a drop in cases. As a constituency MP, I know from personal experience about the significantly longer delays to visa applications.

This is a failing system, but it is run for profit. Applicants are routinely charged up to 10 times the actual administrative costs of processing applications. For example, applying for indefinite leave to remain in the UK costs £2,389, while the true cost is just £243. Fees for refused visas are not refunded and there is no right of appeal for the refusal of a visit visa application. Within the process, even communication with the Home Office is monetised: people are charged £5.48 to email the Home Office from abroad and non-UK-based phone calls cost £1.37 per minute.

The fact that the Department has reputedly lost 25% of its headcount under the austerity agenda must be part of the reason for these failures, but there is also the culture of disbelief, which I mentioned earlier, the hostile environment, of which we have heard much, and the impact of Brexit, because the staff who do remain are being moved on to Brexit preparation. It is in this environment that the Home Office decided that the answer was an algorithm.

According to the Home Office, the use of algorithms in visa processing is part of an efficiency drive. They are being used not to improve the quality of decision making, but to compensate for a lack of resources, or to cut those resources further. As an engineer, I often say that whatever the problem is, the answer is never technology—at least, not on its own. I will say categorically that algorithms should not be used for short-term cost savings at this stage in their evolution as a technology.

Let me define what we are talking about. An algorithm is a set of instructions, acting on data entered in a particular format, to make a decision. If the algorithm learns from performing those instructions how to make better decisions, that might be called machine learning. If it both learns from performing its instructions and can act upon data in different and unpredictable formats, it might be considered artificial intelligence—might, but not necessarily, because not everything that is artificial is intelligent.
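
[Editor's note] To make the distinction concrete, here is a minimal sketch in Python. Everything in it (the field names, rules and thresholds) is invented for illustration and reflects nothing about the Home Office's unpublished tool.

```python
# Illustrative only: fields, rules and thresholds are invented.

def plain_algorithm(application: dict) -> str:
    """An algorithm: fixed instructions acting on data in a known format."""
    if application["sponsor_verified"] and application["prior_uk_visits"] > 0:
        return "low scrutiny"
    return "high scrutiny"

class LearningAlgorithm:
    """Machine learning, in miniature: the same kind of decision, but
    the threshold adjusts itself from the outcomes of past decisions."""

    def __init__(self, threshold: float = 0.5, rate: float = 0.1):
        self.threshold = threshold
        self.rate = rate

    def predict(self, risk_score: float) -> str:
        return "low scrutiny" if risk_score < self.threshold else "high scrutiny"

    def update(self, risk_score: float, complied: bool) -> None:
        # Loosen the threshold when a flagged applicant in fact complied;
        # tighten it when a waved-through applicant did not.
        if complied and risk_score >= self.threshold:
            self.threshold += self.rate * (risk_score - self.threshold)
        elif not complied and risk_score < self.threshold:
            self.threshold -= self.rate * (self.threshold - risk_score)
```

Artificial intelligence, on the definition above, would additionally have to cope with data in unpredictable formats, which neither sketch attempts.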

Critically, algorithms are only as good as their design and the data they are trained on. They are designed by software engineers, who tend to come from a very narrow demographic: few are women, few are from ethnic minorities, and few are from working-class backgrounds. The design will necessarily reflect the limits of their backgrounds unless a significant effort is made for it not to.

There are many examples of problems with the training data for algorithms, from the facial recognition algorithm that identified black people as gorillas because only white people had been used to train it, to the match-making or romantic algorithm that optimised for short-term relationships because the training data showed that they generated more income, due to the repeat business. Unless algorithms are diverse by design, they will be unequal by outcome.
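
[Editor's note] The mechanism behind such failures can be shown in a few lines of code with deliberately skewed, invented data: a model that simply learns historical refusal rates per group will reproduce whatever bias those records contain.

```python
# Invented, deliberately skewed records; for illustration only.
historical_decisions = [
    ("region_a", True), ("region_a", True), ("region_a", False),
    ("region_b", False), ("region_b", False), ("region_b", False),
]  # (group, refused)

def learned_refusal_rate(group: str) -> float:
    """'Training' here is nothing more than averaging past outcomes."""
    outcomes = [refused for g, refused in historical_decisions if g == group]
    return sum(outcomes) / len(outcomes)

for group in ("region_a", "region_b"):
    print(group, round(learned_refusal_rate(group), 2))
# region_a 0.67
# region_b 0.0
```

Nothing about any individual applicant enters the calculation; the output is simply the past, automated.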

Algorithms are now an integral part of our lives, but without any appropriate regulation. They drive Facebook’s news feed and Google’s search results; they tell us what to buy and when to go to sleep; they tell us whom to vote for and whom to hire. However, there is no regulatory framework to protect us from their bias. Companies argue that the results of their algorithms are a mirror to society and are not their responsibility; they say that the outcomes of algorithms are already regulated because the companies that use them have to meet employment and competition law. But a mirror is not the right metaphor: by automating decision making, algorithms industrialise bias. Companies, and especially Governments, should not rely on algorithms alone to deliver results.

I hope that the Government are not accepting algorithms into their decision-making processes without introducing further regulation. The Home Office has denied that the algorithm for visa streaming takes account of race, but it refuses to tell us anything about the algorithm itself. Home Office guidance on the “genuine visitor” test allows consideration of the political, economic and security situation of the country of application, or of nationality, as well as of statistics on immigration compliance from those in the same geographical region; such factors can often be proxies for race.

When I announced this debate, many organisations and individuals sent me examples of how Home Office algorithmic decision making had effectively discriminated against them. Concerns were also raised about other automated decision making in the Home Office—for example, the residency checks in the EU settlement scheme, which use a person’s footprint in Her Majesty’s Revenue and Customs (HMRC) and Department for Work and Pensions (DWP) records to establish residency, but do not consider benefits such as working tax credit, child tax credit or child benefit. All those benefits are more likely to be received by women. The automated residency check is therefore likely to discriminate against women, particularly vulnerable women without physical documents.
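
[Editor's note] Because the settlement scheme's actual checks are not public, the following is only a hedged sketch of the structural problem described; the data sources counted, the five-year rule and the field names are all assumptions made for illustration.

```python
# Assumed source names and an assumed five-year rule; the real
# automated check is unpublished.

COUNTED = {"paye_income", "self_assessment", "state_pension"}
# Benefits the speaker says are excluded, and which women are more
# likely to receive (listed only to make the omission explicit;
# the check below never consults this set):
EXCLUDED = {"child_benefit", "child_tax_credit", "working_tax_credit"}

def automated_residency_check(footprint: dict, years_required: int = 5) -> bool:
    """Pass only if a counted record exists in enough distinct years."""
    counted_years = [
        year for year, sources in footprint.items() if sources & COUNTED
    ]
    return len(counted_years) >= years_required

# A parent whose only official footprint is child benefit fails the
# automated check, however long they have actually lived in the UK.
parent = {year: {"child_benefit"} for year in range(2014, 2019)}
print(automated_residency_check(parent))  # False
```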

We do not know whether the visa processing algorithm makes similar choices, whether it was written by the same people, or indeed whether it originated in the private sector or the public sector. The Home Office says that algorithmic decisions are still checked by people—a requirement of the general data protection regulation (GDPR)—but it will not say how much time is allowed for those checks, and it has admitted that the purpose of the algorithm in the first place was to reduce costs.

Unfortunately, the Government’s track record on digital and data does not give confidence. When the Tories and Liberal Democrats entered Government in 2010, big data was a new phenomenon. Now it drives the business model of the internet, but the Government have done nothing to protect citizens beyond implementing mandatory European Union legislation—GDPR. They are happy to preside over a state of utter chaos when it comes to the ownership and control of data, and allow a free-for-all to develop in artificial intelligence, algorithms, the internet of things and blockchain. In 2016, for example, the DWP secretly trialled the payment of benefits using shared ledger or blockchain technology. Despite the privacy implications of using a private company to put sensitive, highly personal data on to a shared ledger that could not be changed or deleted, we still do not know what the process was for approving the use of this technology or the outcome of the trial. The Government should have learned from the Care.data debacle that the misuse of technology damages public trust for a long time.

I like to consider myself a champion of the power of shared data. I believe that the better use of data could not only reduce the costs of public services, saving money to be better used elsewhere, but improve those services, making them more individual, more personal, faster and more efficient. However, I am not the only one to raise concerns. Algorithmic use in the public sector was recently debated in the Lords, where it was estimated that some 53 local authorities and about a quarter of police authorities are now using algorithms for prediction, risk assessment—as in this case—and assistance in decision making. Now that we find such algorithms in use in the Home Office, it is essential that the Government—I am glad to see the Minister here today—answer the following questions. I have, I think, 11 questions for the Minister to answer.

Will the Minister say whether this algorithmic visa processing is part of machine learning or artificial intelligence? Is the algorithm diverse by design? Will the Minister say whether the algorithm makes choices about what data is to be considered, as with the settled status check example? Who was responsible for the creation of the algorithm? Was it the Home Office, the Government Digital Service or a private sector company? What rights do visa applicants have with regard to this algorithm and their own data? Do they know it is being used in this way? How long is their data being stored for and what security is it subject to?

What advice was taken in making the decision to introduce this algorithm? Did the Government consult their Centre for Data Ethics and Innovation, the Department for Digital, Culture, Media and Sport or the Cabinet Office? Does the duty of care in the Online Harms White Paper from DCMS apply to the Home Office in this case? What redress or liability do applicants have for decisions that are made in error or are subject to bias by the algorithm? What further algorithms are planned for visa processing or elsewhere? Finally, why is it that journalists—in this case from the Financial Times, as well as Carole Cadwalladr—have identified and drawn attention to the misuse of algorithms, while the Government and the regulators supposedly interested in this area, such as Ofcom or the Information Commissioner’s Office, have not? Will the Minister say which regulator she believes is responsible for this area?

A Labour Government would work with industry, local authorities, businesses, citizen groups and other stakeholders to introduce a digital Bill of Rights. This would give people ownership and control over their data and how it is used, helping to break the power of the monopoly tech giants, while ensuring a right to fair and equal treatment by algorithms, algorithmic justice and openness. We need to be able to hold companies and Government accountable for the consequences of the algorithms, artificial intelligence and machine learning that drive their profits or cost-cutting. A Labour Government would protect us not just from private companies, but from the cost-cutting of this Government, who I suspect either do not understand the consequences of their technology choices or do not care.

I hope that the Minister can reassure me and answer my questions and that she can demonstrate that the use of algorithms in the Home Office and elsewhere across Government will be subject to proper transparency, scrutiny and regulation in future.

The Minister for Immigration (Caroline Nokes)

I congratulate the hon. Member for Newcastle upon Tyne Central (Chi Onwurah) on securing this debate. I welcome her passionate contribution and recognise the importance of this issue and the sensitivities around it. She described herself as a tech evangelist, and she has brought a great deal of knowledge and experience to the House, both in this debate and on the wider issues that she has consistently raised since she arrived in 2010. I hope that the House will forgive me if I spend a little time on the wider visa and immigration system before moving on to the specific points that the hon. Lady made, because she raised some wider concerns about the Home Office and the borders and immigration system.

We welcome people from all over the world to visit, study, work and settle here. We welcome their contribution and the fact that Britain is one of the best countries in the world to come and live in. That is why we operate a fair system, under which people can come here, are welcomed and can contribute to this country. However, we need a controlled system: because this is one of the best countries in the world to live in, many people wish to come here. A controlled system, where the rules that make that possible are followed, is what the Government are building and that is certainly what the public expect.

At the end of 2018, we published a White Paper on the future borders and immigration system, which will focus on high skills, welcoming talented and hard-working individuals who will support the UK’s dynamic economy, enabling employers to compete on the world stage. Following its publication, we have initiated an extensive programme of engagement across the UK, and with the EU and international partners, to capture views and ensure that we design a future system that works for the whole United Kingdom.

Just last week, as part of that engagement and of London Tech Week, I enjoyed the opportunity to participate in a roundtable with members of Tech Nation, where I was joined by the Minister for Digital and the Creative Industries, my hon. Friend the Member for Stourbridge (Margot James). Such occasions are always a great opportunity for Ministers to engage in cross-Government work, to understand the challenges that our future visa system may present, and to hear how those who are actually using the system have been finding it and what aspirations they may have for the future.

When discussing the scale of our visa system, I always think it important to remind the House of just how large it is. Thousands of decisions are made every single day, the overwhelming majority of which are completed within published service standards and enable people to visit the UK, to study here, to work here, or to rebuild their lives here. In 2018, UK Visas and Immigration received more than 3.2 million visa applications, of which just under 2.9 million were granted. The service standard for processing a visit visa is 15 working days, and last year UKVI processed 97% within that target. As I have said, the UK welcomes genuine visitors, and more than 2.3 million visitor visas were granted for leisure, study or business visits—an increase of 8% in the past year.

The scale of the work that UK Visas and Immigration undertakes means that it has always used processes that enable it to allocate cases in as streamlined, efficient, and rapid a manner as possible to deliver a world-class visa service. It allocates applications to caseworkers using a streaming tool that is regularly updated with a range of data. The tool is used only to allocate applications, not to decide them. Decision makers do not discriminate on the basis of age, gender, religion or race. The tool uses global and local historical data to indicate whether an application might require more or less scrutiny.

As the hon. Lady explained so comprehensively, an algorithm is a series of instructions or a set of rules that are followed to complete a task. The streaming tool which is operated by UKVI decision-making centres is an algorithm, but I should make it clear that it is not coding, it is not programming, it is not anything that involves machine learning, and, crucially, it is not automated decision making. It is, effectively, an automated flowchart where an application is subject to a number of basic yes/no questions to determine whether it is considered likely to be straightforward or possibly more complex. As I said earlier, the streaming tool is used only to allocate applications, not to decide them.
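
[Editor's note] Taking the Minister's description at face value, an allocation-only "automated flowchart" of basic yes/no questions might look like the sketch below. The questions are invented, since the real streaming criteria are unpublished, although the indicators she describes later in her speech (compliance history, sponsor record, safeguarding flags, priority fees) suggest their general shape.

```python
# A sketch of an allocation-only streaming flowchart. The questions
# are invented; the real criteria are unpublished.

def stream_application(app: dict) -> str:
    """Allocate a case to a scrutiny queue. Nothing here grants or
    refuses a visa; a caseworker decides every application."""
    if app.get("priority_fee_paid"):
        return "priority queue"
    if app.get("matches_known_abuse_pattern"):
        return "red"    # routed to staff trained for closer scrutiny
    if app.get("previous_uk_compliance") and app.get("reputable_sponsor"):
        return "green"  # routed to standard caseworking
    return "amber"
```

Even in this allocation-only form, the red, amber or green label travels with the case to the human decision maker, which is the context for the earlier point about confirmation bias.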

Chi Onwurah

I thank the Minister for the remarks that she is making, and also for the way in which she is responding to my own remarks. She has said that the algorithm is used for allocation purposes. I understood that it was also used to assess risk. That is the “red, amber, green” traffic-light approach, which is about something slightly more than allocation.

Caroline Nokes

I am glad that the hon. Lady has made that point, because I was just about to deal with it.

As I have said, a decision maker assesses every application against the immigration rules, on its individual merits, and taking into consideration the evidence provided by the applicant. The effective streaming of applications ensures that those requiring more detailed and closer scrutiny are routed to appropriately trained assessing staff. It is essential in delivering enhanced decision quality by developing robust decision-making structures, and—as the hon. Lady just mentioned—directing a risk-led approach to decision manager reviews. Streaming does not determine the decision; it determines only the process that is undertaken before a decision officer assesses the application and the requirements for decision manager assurance.

Since 2015, UKVI has developed a streaming tool that assesses the required level of scrutiny attached to an application. It is regularly updated with data relating to known immigration abuses, and with locally relevant data. It is also used to prioritise work—for example, when the applicant has paid a priority fee for faster processing.

Streaming indicators can be positive as well as negative. They might include a previous history of travel to the UK and other Five Eyes or EU countries, or previous compliance with immigration rules; they might indicate potential safeguarding concerns or criminal records; and of course they might reflect a sponsor with a very good record of compliance. Use of the streaming tool creates a globally consistent approach and supports an objective, data-driven consideration of an application. For every application, regardless of its stream, an entry clearance officer must carry out a range of decision-making functions before arriving at a decision, most notably an assessment of whether the application meets the requirements of the relevant immigration rules.

The hon. Lady referred to the Independent Chief Inspector of Borders and Immigration. His 2017 report on the entry clearance processing operations in Croydon and Istanbul raised no concerns that applications would be refused because of streaming, and contained figures indicating that over 51% of applications streamed as requiring further scrutiny were nonetheless issued.

The hon. Lady referred to her significant and important work with the all-party group on Africa, and, as she said, I was very pleased to meet the group earlier this year. She will know that over 47,000 more visas were issued to African nationals in 2018 than in 2016, an increase of 14%. The percentage of African nationals whose applications were granted is up by 4% on 10 years ago and is only slightly below the 10-year average rate for all nationalities. Visa applications from African nationals are at their highest level since 2013. The average issue rate for non-settlement visa applications submitted in the Africa region is consistent with the average issue rate for the past three years, which has been 75%.

The UKVI Africa region is responsible for the delivery of visa services across sub-Saharan Africa. The region currently processes in excess of 350,000 visa applications per year. On average—and in line with other regions—97% of non-settlement visa applications submitted in the Africa region are processed within the 15-day service standard.

There are 31 modern visa application centres in the Africa region, 28 of which offer a range of added-value services and premium products to enhance the customer experience and/or speed of processing. I had the privilege of visiting one of our visa application centres in Africa last year when I visited Nigeria and met a wide range of students who were coming to the UK to study.

The hon. Lady mentioned visas for performers at festivals. I am delighted to see the hon. Member for Edinburgh North and Leith (Deidre Brock) in her place, because I recently had a meeting with her and the Edinburgh festivals organisers. We had what I thought was a very constructive dialogue about problems that international artists may have previously experienced and how to ensure that there are improvements going forward. We are also working closely with the Department for Digital, Culture, Media and Sport to understand the requirements of the creative sector and, as part of the introduction of the future borders and immigration system, which will be phased in from January 2021, we are engaging widely across many sectors and all parts of the UK to work out how we can improve our system.

The hon. Lady asked a wide range of questions, some of which—such as those on the regulation of algorithms and the tech sector—are perhaps not best addressed by the Home Office. I was somewhat sad to see the Cabinet Office Minister, my hon. Friend the Member for Torbay (Kevin Foster), leave his place. I spent a happy six months at the Cabinet Office as the Minister with responsibility for a wide range of matters, including the Government Digital Service. In that role I did not often come to the Chamber, but the hon. Lady has made an important point about the design of algorithms and the painfully high prevalence of young white men in the sector. We all understand, particularly in terms of artificial intelligence and machine-led learning, that bias can certainly exist—I was going to say creep in, but I fear that is in no way explicit enough. Bias can exist when a narrow demographic is designing algorithms and machine-led learning. We must all be vigilant about that.

I am not going to stand at the Dispatch Box and promise regulation from the Home Office, because that would be inappropriate, but the hon. Lady has made some important points, which must be taken up by the Cabinet Office and DCMS to make sure that we have regulation that is effective and in the right place.

Chi Onwurah

I thank the Minister for her remarks, and I appreciate the approach that she has taken. I did not expect the Home Office to make the decisions on how algorithms should function within the Department. I am happy to hear her recognise the concerns that I have raised, but I fear she is coming to the end of her remarks, so may I ask her two things? Will she commit to discussing with the Cabinet Office, or whoever is responsible, how algorithms may or may not be implemented in her Department? I do not know whether she is made aware of this, or whether there is perhaps a working party. Also, will she accept the invitation to help to launch the report of the Africa APPG, from which I have quoted some excerpts in this debate?