Algorithms: Public Sector Decision-making Debate
Lord Addington (Liberal Democrat - Excepted Hereditary)
(4 years, 9 months ago)
Lords Chamber
My Lords, I thank my noble friend for bringing this subject to our attention. The noble Lord, Lord Giddens, went for the big picture; I will, rather unashamedly, go back to a very small part of it.
Bias in an algorithm is quite clearly there because it is supposed to be there, from what I can make out. When I first thought about this debate, I remembered a piece of work I did about three years ago with a group called AchieveAbility. It was about recruitment for people in the neurodiverse categories: those with dyslexia, dyspraxia, autism and other conditions of that nature. These people had problems with recruitment. We went through the evidence and discovered that they were having the most problems with the big employers and their large-scale recruitment processes, because those relied on psychometric tests and computerised screening, and these people did not fit the template. The fact is that they process information differently; for example, they might not be able to do a task the moment it comes round. This was especially true of junior-level employment. When asked, “Can you do everything at the drop of a hat at a low level?”, these people, if they are being truthful, might say, “No”, or, “I’ll do it badly or slowly.”
The minute you put that answer down, you are excluded. At a smaller employer, they could explain it. For instance, when asked, “Can you take notes in a meeting?”, they may say, “Not really, because I use a voice-operated computer, and if I talk after you talk, it will get a bit confusing.” But somebody else may say, “Oh no, I’m quite happy doing the tea.” In that case, how often will the first person actually have to take notes? Probably never. The minute you dump this series of barriers in the way of what the person can do, you exclude them. An algorithm, this sort of machine learning, does not have that human input and will potentially compound the problem.
This issue undoubtedly comes under the heading of “reasonable adjustment”, but if people do not know that they have to change the process, they will not do it. People do not know because they do not understand the problem and, probably, do not understand the law. Anybody who has had any involvement with disability will have come across this many times. People discriminate not through wilful acts but through ignorance. If you are to use automated recruitment and selection processes, you have to look at this issue and build the adjustment in. You have to check. What is the Government’s process for doing so? This is a new field and I understand that it is moving very fast, but tonight we are effectively saying, “Put the brakes on. Think about how you use it correctly to achieve the things we have decided we want.”
There is positive potential here. I am sure that the systems will be clever enough to build this in, or something that addresses it, in future, but not if you do not decide that you have to do it. Since algorithms reinforce themselves, as I understand it, it is quite possible that you will get a body of apparent good practice in recruitment that gives you tidy answers but does not take this issue into account. You will suddenly have people saying, “Well, we don’t need you for this position, then.” That is 20% of the population you can ignore, or 20% who will have to go round the sides. We really should be looking at this. As we are discussing the public sector here, surely the Government, in their recruitment practices at least, should have something in place to deal with this issue.
I should declare my interests. I am dyslexic. I am the president of the British Dyslexia Association and chairman of a technology company that produces assistive technology, so I have interests here but I also have some knowledge. If you are going to do this and get the best out of it, you do not let it run free. You intervene and you look at things. The noble Lord, Lord Deben, pointed out another area where intervention is needed to stop something happening that you do not want to happen. Surely we can hear about the processes in place that will ensure we do not allow the technology simply to go off and create its own logic through our not interfering with it. We have to put the brakes on and give some form of direction on this issue. If we do not, we will probably undo the good work we have done in other fields.