Digital Economy Bill Debate

Department: Scotland Office

Digital Economy Bill

Lord Bishop of Chester Excerpts
Committee: 4th sitting (Hansard): House of Lords
Wednesday 8th February 2017


Lords Chamber
Amendment Paper: HL Bill 80-IV, Fourth marshalled list for Committee (PDF, 161KB) - (6 Feb 2017)
Lord Gordon of Strathblane (Lab)

My Lords, I, too, support the amendment. Yesterday, along with many of your Lordships, I attended a meeting with Channel 4 on the subject of fake news. Here we are not talking about opinion, where people can legitimately take one view or another in a democracy, but about things that are demonstrably totally false. Yet there is no mechanism at the moment for screening them out of social media. If in the United States 44% of the population regard Facebook as their primary source of news, there are dangers for democracy.

I do not know whether the noble Lord’s amendment will work. I do not know whether, for example, the companies will regard algorithms as commercially confidential and refuse to release them. I do not know what powers we actually have over these bodies, but it is worth exploring. It would be ridiculous if this massive Bill, which deals very well for the most part with a wide range of subjects, were to leave out the most topical and potentially the most dangerous of all: social media.

The Lord Bishop of Chester

My Lords, this is an important amendment because it touches upon the bigger issue of the impact of artificial intelligence on all sorts of aspects of our lives. There is a law called Moore’s law, which observes that roughly every two years the number of transistors on a chip, and with it the power of computers, doubles. That has held true over the past 20 or 30 years and we should assume that that power will continue to develop. Artificial intelligence, in all the forms in which it touches us, will be more and more prevalent in our society and more and more potent in the hands of terrorists in the years to come.

We cannot ask Ofcom to solve all the problems in this area, but I would like to know where the ownership of these risks and the rapid changes in our society falls in the eyes of the Government. Perhaps Ofcom has a role in this regard—search engines or whatever—but it is really part of a bigger picture of how we get ahead of the game with the impact of artificial intelligence. We read in the papers about driverless cars appearing on our streets, and in many other areas of life artificial intelligence will impact upon us. Where is this owned in the corridors of government?

Baroness Byford (Con)

My Lords, I would like to support my noble friend in his amendment. Algorithms are basically mathematical. The power of computers is used to record, classify, summarise and project actions that indicate what is happening in the world around us. Algorithms can be applied in particular to social media, to which other noble Lords have referred, and to normal internet usage and browsing. They reach decisions about the public interest, about you and about me.

According to a recent radio programme, algorithms are used to make individual decisions in the fields of employment, housing, health, justice, credit and insurance. I had heard that employers are increasingly studying social media to find out more about job applicants. I had not realised that an algorithm, programmed by an engineer, can, for example, take the decision to bin an application. If that is true, it is absolutely unacceptable. It certainly explains why so many jobseekers do not receive a response of any kind. There is a very real danger that a person could also be refused a mortgage or a better interest rate as the result of an algorithmic decision. Even now some companies use algorithms based on phone numbers to decide whether a caller is high or low value. Highs get to speak to a person; lows are left holding on until they hang up. Algorithm designers, I understand, refuse to answer any questions about the data that are used, or their application, on grounds of commercial confidentiality. There are real concerns that if we continue to allow such liberties, there will be an increasing risk of discrimination, intentional or accidental, against people of certain races, religions or ages. One example of algorithm use cited in the programme was that of differential pricing by Uber.

The EU intends that by July 2018 citizens will have the right to an explanation of decisions affected by the workings of these algorithms, such as the online rejection of a bank loan. I do not feel that we should wait until then, and although my noble friend’s amendment might not be perfect, I am really grateful that he has tabled it today and that we are having this worthwhile debate.