Digital Economy Bill Debate

Department: Scotland Office

Digital Economy Bill

Baroness Byford Excerpts
Committee: 4th sitting (Hansard): House of Lords
Wednesday 8th February 2017


Lords Chamber
Amendment Paper: HL Bill 80-IV Fourth marshalled list for Committee (PDF, 161KB) - (6 Feb 2017)
The Lord Bishop of Chester

My Lords, this is an important amendment because it touches upon the bigger issue of the impact of artificial intelligence on all sorts of aspects of our lives. There is a law called Moore's law, which observes that the power of computers (strictly, the number of transistors on a chip) doubles roughly every two years. That has held true over the past 20 or 30 years, and we should assume that that power will continue to develop. Artificial intelligence, in all the forms in which it affects us, will be more and more prevalent in our society and more and more potent in the hands of terrorists in the years to come.

We cannot ask Ofcom to solve all the problems in this area, but I would like to know where the ownership of these risks and the rapid changes in our society falls in the eyes of the Government. Perhaps Ofcom has a role in this regard—search engines or whatever—but it is really part of a bigger picture of how we get ahead of the game with the impact of artificial intelligence. We read in the papers about driverless cars appearing on our streets, and in many other areas of life artificial intelligence will impact upon us. Where is this owned in the corridors of government?

Baroness Byford (Con)

My Lords, I would like to support my noble friend in his amendment. Algorithms are basically mathematical procedures. The power of computers is used to record, classify, summarise and project actions that indicate what is happening in the world around us. Algorithms can be applied in particular to social media, to which other noble Lords have referred, and to ordinary internet usage and browsing. They reach decisions about the public interest, about you and about me.

According to a recent radio programme, algorithms are used to make individual decisions in the fields of employment, housing, health, justice, credit and insurance. I had heard that employers are increasingly studying social media to find out more about job applicants. I had not realised that an algorithm, programmed by an engineer, can, for example, take the decision to bin an application. If that is true, it is absolutely unacceptable. It certainly explains why so many jobseekers do not receive a response of any kind. There is a very real danger that a person could also be refused a mortgage or a better interest rate as the result of an algorithmic decision. Even now some companies use algorithms based on phone numbers to decide whether a caller is of high or low value: highs get to speak to a person; lows are left holding on until they hang up. Algorithm designers, I understand, refuse to answer any questions about the data that are used or their application, on grounds of commercial confidentiality. There are real concerns that, if we continue to allow such liberties, there will be an increasing risk of discrimination, intentional or accidental, against people of certain races, religions or ages. One example of algorithm use cited in the programme was that of differential pricing by Uber.

The EU intends that by July 2018 citizens will have the right to an explanation of decisions affected by the workings of these algorithms, such as the online rejection of a bank loan. I do not feel that we should wait until then, and although my noble friend’s amendment might not be perfect, I am really grateful that he has tabled it today and that we are having this worthwhile debate.

Lord Clement-Jones (LD)

My Lords, I also thank the noble Lord, Lord Lucas, for putting down this amendment. Indeed, the amendment has many good aspects to it, but I will adopt a phrase which the noble and learned Lord, Lord Keen, used the other day: "It doesn't go nearly far enough". It really highlights some of the issues that this Bill simply does not deal with, and I declare an interest as the co-chair of the new All-Party Parliamentary Group on Artificial Intelligence. This does need huge consideration: all the ethics involved not only in artificial intelligence but in the internet of things, and the great many regulatory issues which spring from the new technologies. Algorithms are a part of it, of course they are, but we need further consideration.

I agree with those who have said that perhaps Ofcom is not necessarily the best regulator for this. I do not know; it may be that we need to construct a purpose-built regulator for the world of artificial intelligence and the internet of things in ethical terms. I am not quite sure that Ofcom has the necessary tools on the ethical side.

I am very much in sympathy with the noble Lord and I am delighted that he has raised the issue, but we are at the very beginning of a really important debate in a lot of these areas. The essence of all this is trust. We had a continuous debate throughout Part 5 about the Government's sharing of data. This is about the private sector and its use of a great deal of our data, and the way it sorts them and places them in search engines. Trust is the overwhelming issue of our age, and we are not there yet. If we are going to reassure people who want to use these new technologies, we really do need to come up with a proper regulatory system. I do not think that this new clause quite provides it.