(7 years, 9 months ago)
Lords Chamber

My Lords, I beg to move Amendment 214. We all know that Ofcom has a great interest in traditional media. As we can see, not least from Clauses 70 and 71, we are happy to give Ofcom a panoptic role when this is required. My amendment is designed to give Ofcom a panoptic role in new media.
We are all familiar with algorithms, particularly in such contexts as a Google search. It is just a set of rules and procedures that gets us to where we want to go from wherever we happen to be. I do not know of any great harm currently being done by any algorithms, but we ought to be aware of the power these procedures have in our lives. They govern the choice of what people see on the internet. The potential for this to interfere with news flow is obvious. If you type something into Google, it decides what you get to see. In the context of a referendum or an election, the potential for altering the result is clear. It also has an effect when you are just looking round to see what is there. Google has had trouble recently with its response to people typing in “are Jews”; it was autocompleting that with the word “evil”. This has now ceased, but it shows what influence algorithms can have in directing people to particular sources of information—in this case, with particularly nasty implications.
The function of an algorithm is to discriminate, but how are algorithms discriminating? What do we know about what they are doing in terms of fairness, when it comes to race or gender, in the context of job offers, accommodation or access? Referring again—I am sure unfairly—to Google, there was an episode last year when, if you put “three black teenagers” into the Google image search, you got mug shots of prisoners; but if you put in “three white teenagers” you did not. How do we know the effects of these things on our lives? If people start trying to correct them, what effect will these corrections have?
Most of these algorithms—or at least the big ones—are run by large, dominant, international organisations. Who controls them? We think we have some idea but there is no predictability; there does not seem to be any effective system of governance, least of all by government or institutions. They are a law unto themselves and they will continue to be so, unless something fantastic changes.
Under these circumstances, we ought to know what is going on. We ought to have the ability to take a look and make sure that it is fair and as we wish it to be, as we do in similar areas of the old media and of life. I hope my amendment will enable Ofcom to do just that. I beg to move.
My Lords, I support the amendment. There is a huge amount of power in the hands of search engines regarding the way they influence how people think. This could be used as a form of propaganda, as we have seen with the recent rows about fake news. From the point of view of protecting Britain, there could even be some security implications because of the way they could affect how people think. So it is quite a sensible power to have, just in case.
My Lords, I too support the amendment. I thank the noble Lord for his explanation of what an algorithm is. I always found BBC Bitesize’s explanation rather helpful—a set of rules to solve a problem—along with its corresponding explanation of how an algorithm can go wrong: a set of rules designed for getting dressed that insists on your coat going on before your jumper. This would lead to a great many children arriving at school in sartorial disarray. It helpfully indicates that a set of rules is not benign—it has a purpose and a process, both of which are man or woman-made.
It is not possible to exaggerate the importance of an algorithm. I recently read Weapons of Math Destruction, by Cathy O’Neil, a Harvard PhD and Wall Street quantitative analyst. It goes step by step through the ways in which algorithms—apparently neutral and benign—have the capacity to change lives in huge ways and in an ever-increasing list of scenarios. If wrongly applied or designed, they can have devastating effects on job prospects, education, financial outcomes or the reputation of an individual, with very little possibility of appeal, correction or compensation.
My Lords, I also thank the noble Lord, Lord Lucas, for putting down this amendment. Indeed, this amendment has many good aspects to it, but I will adopt a phrase which the noble and learned Lord, Lord Keen, used the other day, which is, “It doesn’t go nearly far enough”. It really highlights—and I declare an interest as the co-chair of the new All-Party Parliamentary Group on Artificial Intelligence—some of the issues that this Bill simply does not deal with. This does need huge consideration: all the ethics involved not only with artificial intelligence, but with the internet of things and the great many regulatory issues which spring from the new technologies. Algorithms are a part of it, of course they are, but we need further consideration.
I agree with those who have said that perhaps Ofcom is not necessarily the best regulator for this—I do not know—and it may be that we need to construct a purpose-built regulator for the world of artificial intelligence and the internet of things in ethical terms. I am not quite sure that Ofcom has got the necessary tools in terms of the ethics aspect of it.
I am very much with the noble Lord in spirit and I am delighted that he has raised the issue, but we are at the very beginning of a really important debate on a lot of these areas. The essence of all this is trust. We had a continuous debate through Part 5 about the government’s sharing of data. This is about the private sector and its use of a lot of our data and the way it sorts them and places them in the search engines. Trust is the overwhelming issue of our age, and we are not there yet. If we are going to reassure people who want to use these new technologies, we really do need to come up with a proper regulatory system. I do not think that this new clause quite provides it.
Before the noble Lord sits down, may I just ask him: is it not dangerous to make perfection the enemy of better? In other words, the amendment may not be perfect, but it is moving in the right direction, and to say, “Do nothing”, because it is not perfect is surely very unwise, given all the other stuff that he has said.
My Lords, I know that the noble Earl himself is perfect in almost every way, so I would very much hesitate to argue with him. Still, I feel we need something rather broader than this proposal would provide.
(8 years, 2 months ago)
Lords Chamber

Clearly, I am not in a position to comment on a particular case. However, in the context of what is said at paragraph 5.42, one has to remember that there is the further issue of whether it would have been in the public interest to make disclosure. That necessary test would have had to be met before there would have been disclosure, however serious the original breach.
My Lords, I have been listening to the debate and realised that of course people are concerned because they do not know what information is held. Sometimes people get into trouble because something is held on file and they do not know what it is. Only the subject knows what affects them and what does not. To take the example just given, where data may have been gathered by someone who is subsequently fired, that information may have been quite sensitive if revealed to someone in another organisation, and only the individual who was the subject of those unauthorised requests would know that. Therefore, this area bears examination. I am not sure how we should deal with that, but to rely just on the commissioner to know exactly how this would affect everyone would be difficult as well. It is worth thinking about this further.
My Lords, on the example my noble friend mentioned, it is hard to think that it would not be in the public interest for somebody who has been the subject of,
“a number of unauthorised searches for related communications data”,
to be notified. Of course I thank the noble and learned Lord for his detailed reply, although I am not sure whether he responded to my amendment on the code of practice.
I do not disagree about the national interest but it does not answer my point about reversing the burden so that the default position would be that there is notification unless it is not in the public interest—or, to put it another way, notification rather than notification only if it is in the public interest that somebody is informed.
On telecommunications operators and the report to the ICO, as the Bill seeks to do throughout, I sought to join up some of the dots in this landscape. Importantly, on the Human Rights Act, the noble and learned Lord says that the considerations in Clause 2 are not relevant; we may have another go at this on Report with a slightly different approach. However, he also said—I know that this was simply a turn of phrase—that Clause 207(3) does not weaken Clause 2, “I suggest”. I hope that he will be able to say that that amounts to an assurance to the Committee. Perhaps I may invite him to do that, otherwise we will certainly come back to this for an assurance.