Algorithms: Public Sector Decision-making

Lord St John of Bletso Excerpts
Wednesday 12th February 2020


Lords Chamber
Lord St John of Bletso (CB)

My Lords, I too thank the noble Lord, Lord Clement-Jones, for introducing this topical and very important debate, and I am delighted that we have been given six minutes rather than the previously allotted three.

As the Science and Technology Committee in the other place reported in Algorithms in Decision-making, algorithms have been used for many years to aid decision-making, but the recent huge growth of big data and machine learning has substantially increased the use of algorithmic decision-making in a number of sectors: not just the public sector but finance, the legal system, the criminal justice system, education and healthcare. I shall not give examples because of the lack of time.

As every speaker has mentioned, the use of these technologies has proven controversial on grounds of bias, largely because of the algorithm developers’ selection of datasets. The question and challenge is how to recognise bias and neutralise it. In deciding upon the relevance of algorithmic output to a decision by a public sector body, the decision-maker should have the discretion to assess unthought-of relevant factors and whether the decision is one for which the algorithm was designed. Clearly there is a need for a defined code of standards for public sector algorithmic decision-making. In this regard, I refer to the recommendations of NESTA, which was mentioned by the noble Lord, Lord Clement-Jones. It recommended that every algorithm used by a public sector organisation should be accompanied by a description of its function, objectives and intended impact. If we are to ask public sector staff to use algorithms responsibly to complement or replace some aspects of their decision-making, it is vital that they have a clear understanding of what those algorithms are intended to do and in what context they might be applied.

Given the rising use of algorithms by the public sector, only a small number can reasonably be audited. In this regard, there is a recommendation that every algorithm should have an identical sandbox version for auditors to test the impact of different input conditions. As almost all noble Lords have mentioned, there is a need for more transparency about what data were used to train an algorithm, identifying whether there is discrimination based on a person’s ethnicity, religion or other factors, a point most poignantly made by the noble Lord, Lord Taylor. By way of example, if someone is denied council housing or a prisoner is denied probation, they need to know whether an algorithm was involved in that decision. If it is proven that an individual was negatively impacted by a mistaken decision made by an algorithm, NESTA has recommended that public sector bodies establish an insurance scheme to ensure that citizens can receive appropriate compensation.

I shall keep it brief. In conclusion, I do not want to give the impression that I am opposed to the use of algorithms in the decision-making processes of the public sector. The report on AI by our Select Committee, which was so ably chaired by the noble Lord, Lord Clement-Jones—I was lucky enough to be a member—highlighted the huge benefits that artificial intelligence can provide to the public and private sectors. Can the Minister elaborate on the Government’s adoption strategy? With the vast majority of investments in AI coming from the United States as well as from Japan, I believe the UK should focus its efforts on leading the way in developing ethical and responsible AI.