(4 years, 9 months ago)
Lords Chamber
My Lords, I congratulate the noble Lord, Lord Clement-Jones, on securing this important debate. It is a topic that I know is close to his heart. I had the privilege of serving on the Select Committee on Artificial Intelligence which he so elegantly and eloquently chaired.
Algorithmic decision-making has enormous potential benefits in the public sector and it is therefore good that we are seeing growing efforts to make use of this technology. Indeed, only last month, research was published showing how AI may be useful in making screening for breast cancer more efficient. The health sector has many such examples but algorithmic decision-making is showing potential in other sectors too.
However, the growing use of public sector algorithmic decision-making also brings challenges. When an algorithm is being used to support a decision, it can be unclear who is accountable for the outcome. Is it the front-line decision-maker, the administrator in charge of the introduction of the AI tool, or perhaps the private sector developer? We must make sure that the lines of accountability are always clear. With more complex algorithmic decision-making, it can be unclear why a decision has been made. Indeed, even the public body making the decision may be unable to interrogate the algorithm being used to support it. This threatens to undermine good administration, procedural justice and the right of individuals to redress and challenge. Finally, using past data to drive recommendations and decisions can lead to the replication, entrenchment and even the exacerbation of unfair bias in decision-making against particular groups.
What is at stake? Algorithmic decision-making is a general-purpose technology which can be used in almost every sector. The challenges it brings are diverse and the stakes involved can be very high indeed. At an individual level, algorithms may be used to make decisions about medical diagnosis and treatment, criminal justice, benefits entitlement or immigration. No less important, algorithmic decision-making in the public sector can make a difference to resource allocation and policy decisions, with widespread impacts across society.
I declare an interest as a board member of the Centre for Data Ethics and Innovation. We have spent the last year conducting an in-depth review into the specific issue of bias in algorithmic decision-making. We have looked at this issue in policing and in local government, working with civil society, central government, local authorities and police forces in England and Wales. We found that there is indeed the potential for bias to creep in where algorithmic decision-making is introduced, but we also found a great deal of willingness to identify and address these issues.
The assessment of consequences starts with the public bodies using algorithmic decision-making. They want to use new technology responsibly, but they need the tools and frameworks to do so. The centre developed specific guidance for police forces to help them trial data analytics in a way that considers the potential for bias—as well as other risks—from the outset. The centre is now working with individual forces and the Home Office to refine and trial this guidance, and will be making broader recommendations to the Government at the end of March.
However, self-assessment tools and a focus on algorithmic bias are only part of the answer. There is currently insufficient transparency and centralised knowledge about where high-stakes algorithmic decision-making is taking place across the public sector. This fuels misconceptions, undermines public trust and creates difficulties for central government in setting and implementing standards for the use of data-driven technology, making it more likely that the technology may be used in unethical ways.
The CDEI was pleased to contribute to the recently published report from the Committee on Standards in Public Life’s AI review, which calls for greater openness in the use of algorithmic decision-making in the public sector. The report is also right to call for a consistent approach to the formal assessment of the consequences of introducing algorithmic decision-making, and for independent mechanisms of accountability. Developments elsewhere, such as work being done in Canada, show how this may be done.
The CDEI’s new work programme commences on 1 April. It will be proposing a programme of work exploring transparency standards and impact assessment approaches for public sector algorithmic decision-making. This is a complex area. The centre would not recommend new obligations for public bodies lightly. We will work with a range of public bodies to explore possible solutions that will allow us to know where important decisions are being algorithmically supported in the public sector, and consistently and clearly assess the impact of those algorithms.
There is a lot of good work on these issues going on across government. It is important that we all work together to ensure that these efforts deliver the right solutions.
(7 years, 2 months ago)
Lords Chamber
My Lords, I also commend my noble friend Lady Lane-Fox for securing this debate but, beyond that, for continuing to champion the digital and tech agenda as she does with such alacrity and passion. We have heard many fascinating speeches and insights this afternoon, so I will keep my comments brief and to two areas. The first is digital’s contribution to our economy and our global competitiveness. To coin a once popular phrase, if we are to win the global race, delivering the pipeline of digital skills and digital understanding is a necessary condition of success.
There are lots of positive signs. Tech City UK’s recent Tech Nation report found that in 2016 UK digital tech investment reached £6.8 billion—higher than in any other European country. However, we need to do more if we want to reap the benefits of moving to a fully digital, tech-savvy economy. For example, according to research from O2, 745,000 additional workers with digital skills are needed to meet rising demand from employers over the period 2013-17. I am interested to hear from the Minister whether we are on track.
What more needs to be done in policy, particularly, as my noble friend Lord Baker mentioned, on education? One example is coding and software development. Coadec—the Coalition for a Digital Economy—has identified key areas. One concern is mathematics and a lack of students taking further maths qualifications—a necessary precursor for developer training. Indeed, data show that, for the proportion of students studying any maths after the age of 16, England is in the 0% to 10% category, yet countries as diverse as Taiwan, Russia and Japan are in the 95% to 100% category.
The second area that I want to consider is something that the noble Baroness, Lady Lane-Fox, has spoken about—that this challenge does not merely concern new and exciting digital sectors but is also about whether our entire population can participate in the life of the nation. We need digital skills to participate, but we also need the understanding to equip us to deal with the rapidly changing technological landscape. I am delighted to be participating in the House of Lords Artificial Intelligence Committee. As my noble friend Lord St John mentioned, this area is evolving rapidly, enhancing diverse areas from healthcare to finance. But AI is also making us subject to decisions made by algorithms without our fully understanding how they work and how AI may affect humanity.
Coadec suggests making access to digital education free for all adults just as we have done with adult literacy, with good results. I could not agree more. We must capitalise on all opportunities for global Britain, particularly in the light of Brexit, but we must also realise that improving digital understanding at all levels is an opportunity to increase participation in our national life. Winning the global race means ensuring that everyone can take part.