Lords Chamber
My Lords, having restrained myself for four and a half hours and having done a huge amount of work in the Library, I will, despite the amendment having been given only a few minutes, detain your Lordships for a few more moments. This is a massive issue.
As a member of the AI committee chaired by the noble Lord, Lord Clement-Jones, I have been struggling to find an analogy that conveys just how serious the world we are moving into is becoming. What I have come up with, with the help of the Library, is road safety. I am going to talk about ethics. Probably the best-known and most successful ethicist in your Lordships’ Chamber is the noble Baroness, Lady O’Neill. Last week, when discussing what this Bill is really all about, she put her finger on it. She asked the Minister:
“Is he suggesting that the aim should be to adapt children to the realities of the online world and the internet service providers, rather than to adapt the providers to the needs of children?”.—[Official Report, 6/11/17; col. 1606.]
This seems to be fundamental to the issue. Because I needed an analogy, I started looking into road safety, and found it very interesting and—if noble Lords will give me a couple of minutes—rather instructive.
In 1929, a royal commission met, it having become necessary to legislate urgently on road safety because of the “slaughter” occurring on the roads. I will not take up your Lordships’ time reading out all the information that I got from the Library, but I have it all here. Parliament legislated in 1930, pretty ineffectively, and again in 1932, again ineffectively. In 1934, your Lordships’ House passed a Bill on road safety, which was rejected in another place because of the objections of lobbyists from the automobile industry, the oil industry and the insurance industry. Parliament tried again in 1938, and once again failed.
Here, I must read something extraordinary. Lord Cecil of Chelwood, a Conservative Peer, said at the end of the debate on the report regarding the legislation:
“I believe future ages will regard with consternation the complacency, the indifference with which this slaughter and mutilation on the roads is now regarded. I observe with great interest that in the final paragraph of the Report the members of the Committee themselves say that they are puzzled and shocked … by the complacency with which this matter is regarded”.—[Official Report, 3/5/1939; col. 903.]
Thousands of people were being killed. I put it to the House that if we get this Bill wrong, a lot of people will be hurt; if we get it right, we may save lives. That is how important it is.
I am standing here today because of a man named Ralph Nader. Through an extraordinary series of events in the 1960s, Ralph Nader was able to impose seatbelts on the American automobile industry, against its wishes. Six years ago in Italy, my life was saved by the combination of a seatbelt and an airbag, so I take this issue pretty seriously. Look at what has happened since 1990 to the number of lives saved by the use of technology that had existed 20, 30 and 40 years earlier—it is extraordinary. In 1930, almost 8,000 people were killed on the roads of Britain, with one million registered vehicles on the road. Last year, fewer than 2,000 people were killed, with 35 million registered vehicles on the road. That is because, at last, technology was brought to bear—against the wishes of the industry lobbyists.
We must understand that there are those who would like this Data Protection Bill to be weak. It is our duty to ourselves and to future generations to make it extremely tough and not to allow ourselves to be undermined by the views of the many sectors of industry that do not share our values.
My Lords, it is a pity I have to be brief, but I will try. The amendment is interesting and worth debating in greater detail than the time today allows. Remarks have already been made about the report from the Royal Society and the British Academy, which suggested setting up a body but did not specify whether it ought to be statutory. It is a pity it did not, because if it had, perhaps the Government would have taken greater notice of the suggestion and taken on board what pages 81 and 82 of their manifesto said they would do—set up a commission.
To me, there are three important things for any body that is set up. First, it must articulate and provide guidance on the rules, standards and best practices for data use, ideally covering both personal and non-personal data. I see this amendment as restrictive in that area. Secondly, it must undertake horizon scanning to identify potential ethical, social and legal issues emerging from new and innovative uses of data, including data linkage, machine learning and other forms of artificial intelligence, and establish how these should be addressed. Thirdly, and importantly, it should be aligned with, and not duplicate, the roles of other bodies, including the ICO as the data protection regulator and ethics committees making decisions about particular research proposals using people’s data. This important amendment allows us to discuss such issues and I hope we will return to it and perhaps make it wider.
Is such a body necessary? The debates we have had suggest that it might be. The Nuffield Foundation was mentioned. It has suggested that it will set up an ethics commission, and we need to know what the purpose of that will be. What would its role be in the regulatory framework, given that it would not be a statutory body? I look forward to that debate but, in the meantime, I support the amendment.