Algorithms: Public Sector Decision-making

Wednesday 12th February 2020


Lords Chamber
Lord Griffiths of Burry Port (Lab)

My Lords, I am only too glad to add my word of thanks to the humble, ordinary, flesh-and-blood noble Lord, Lord Clement-Jones, for our debate this evening. So many points have been raised, many of them the object of concern of more than one contributor to the debate. I am reminded a little of what happened when we had the Big Bang in the 1980s: finance went global and clever young people knew how to construct products within the financial sector that their superiors and elders had no clue about. Something like that is happening now, which makes it even more important for us to be aware and ready to deal with it.

I take up the point raised by the noble Lord, Lord Browne of Ladyton, about legislation. He said that it had to be flexible; I would add "nimble". We must have the general principles of what we want to do to regulate this area available to us, but be ready to act immediately—as and when circumstances require it—instead of taking cumbersome pieces of legislation through all stages in both Houses. Movement in the real world is much faster than that. I recognise what has been said about the exponential pace of advance in all these methodologies and approaches. We heard ample mention of the Nolan principles; I am glad about that.

On the right of explanation, I picked up an example that is worth reminding ourselves of when we ask what it means to have an explanation of what is happening. It comes from Italy; perhaps other Members will be aware of it too. An algorithm was used to decide into which schools to send state schoolteachers. After some dubious decision-making by the algorithm, teachers had to fight through the courts to get some sort of transparency regarding the instructions that the algorithm had originally been given. Although the teachers wanted access to the source code of the algorithm—the building blocks, with all the instructions—the Italian Supreme Court ruled that appropriate transparency constituted only an explanation of its function and the underlying legal rules. In other words, it did not give the way in which the method was evolved or the algorithm formed; it was just descriptive rather than analytical. I believe that, if we want transparency, we have to make available the kind of nuts-and-bolts detail that leads to the algorithms that are then the object of our concern.

On accountability, who can call the shots? The noble Baroness, Lady Rock, was one of those who mentioned that. I have been reading, because it is coming up, the Government’s online harms response and the report of the House of Commons Science and Technology Committee. I am really in double-Dutch land with it all as I look at how they interleave with each other. Each says things separately and yet together. In the report that I think we will be looking at tomorrow, it is recommended that we should continue to use the good offices of the ICO to cover the way in which the online harms process is taken forward. We have also heard that that may be the appropriate body to oversee all the things that we have been discussing. While the Information Commissioner’s Office is undoubtedly brilliant and experienced, is it really the only regulator that can handle this multiplicity of tasks? Is there a need now to look at perhaps adding something in to recognise the speed at which these things are developing—to say nothing of appointing, as the report suggests, a Minister with responsibility for this area?

I am so glad to see the noble Lord, Lord Ashton, arrive in his new guise as Chief Whip, because, in a previous incarnation, we were eyeball to eyeball like this. He reminds me of course that it was on the Data Protection Bill, as it then was—an enormous, composite, huge thing—that I cut my teeth, swimming against the tide and wondering whether I would drown. It was said then that the Centre for Data Ethics and Innovation was something we should aim at. It needs to happen. Here we are, two years later, and it still has not happened; it is still an aspiration. We must move forward to a competent body that can look at the ethical dimensions of these developments. It must have teeth and it must make recommendations, and it must do so speedily. On that, I am simply repeating what others have said.

Let me finish with one word—it will go into Hansard; it will go against my reputation and I will be a finished man after saying it. When I turn on my computer and do certain things—for example, the Guardian quick crossword, which is part of my morning devotions—the advertising that comes up has presumably been put there by an algorithm. But it suggests that I want to buy women's underwear. I promise noble Lords that I have no experience in that area at all, and I want to know, as a matter of transparency, what building blocks have gone into the algorithm that has told my computer to interest me in these rather recondite aspects of womenswear.