Automated Vehicles Bill [HL] Debate
Lord Cameron of Dillington (Crossbench - Life peer)
Lords Chamber

My Lords, I repeat the declarations of interest that I have made in the past.
I applaud the principles behind the suggestions made by the noble Lord, Lord Berkeley. However, there is a difficulty in coming up with new regulations that are different from elsewhere in the world, and I am afraid that “significantly” falls into that trap. It would make it a lot harder for international companies to work out exactly what was meant by these words. There is no established case law on these matters.
We all know that there are problems with existing human drivers, and we should expect all autonomous vehicles to turn out dramatically better than human beings. We should not look for circumstances where humans monitor computers but rather the other way around; computers will be better than humans at this. A lot of people suggest that car insurance premiums will actually fall as the number of autonomous vehicles increases. So I am afraid that I can only applaud the amendment produced by my noble friend the Minister and reject those proposed by the noble Lord, Lord Berkeley.
I hope the House will forgive me, but these various amendments on safety prompt me to ask the Minister about something that has not featured much in our discussions: the issue of hacking into self-driving vehicles—SDVs. It was touched on peripherally during the debate on data protection in Committee but not really highlighted as a major safety concern, which is why I thought I would bring it up now.
I sat on the House’s Science and Technology Committee when it produced its report on automated vehicles some five or six years ago—I am afraid the doldrums of Covid blur my sense of time. I remember that during that committee’s investigation, we spent some time discussing in detail the question of hacking into these vehicles, and I felt it only right that it should feature in our discussions on safety today.
We all know how easy it is for someone, or some group of someones, to hack into our computers from a distance, and it could be a criminal or, worse, an enemy state. Why should it not be the same with an SDV? I raised this subject with Waymo and others, but I have to say that I was not convinced by their assurances that it could not happen. We all know that, both at Microsoft and here in Parliament, it takes a team of experts, sometimes working around the clock, to keep all our devices free from hackers, and an SDV will just be another device.
I was going to bring this matter up when the noble Lord, Lord Lucas, who is not in his place, had an excellent amendment on the obvious necessity for our emergency services to be able to talk to or even control SDVs in certain circumstances. Sadly, however, I could not be here on 10 January. I was going to say that if it is too easy for a policeman, an ambulance driver or a fireman to get sufficient access to control an SDV, I feel sure that it will not be impossible for someone with malicious intent to get hold of whatever device or code makes this possible. Could it be that stealing a car will become easier, and that a suicide bomber will no longer need to commit suicide but can simply hack into someone else’s car or an SDV for hire and drive it into a crowd or the gates of Parliament, for example? Or maybe you could commit murder by getting control of a car and driving it into your intended victim. It is also entirely possible that no one would know who had done it, because it would have been done from a considerable distance—maybe from the other side of the world.
I do not know whether any of your Lordships have seen a series called “Vigil”, one of those fictional television thrillers, in which an armed remote-control drone was captured remotely and used to create death, destruction and mayhem on British soil. However, no one knew who was controlling it, which was the essence of the whodunnit plot. Incidentally, it turned out that it was being controlled all the way from the Middle East. I am afraid my thoughts leapt—rather melodramatically, I admit—from that fiction to the reality of what we are trying to achieve here with the Bill.
I am sure there are technical solutions to all these issues, and the whiz-kids on either side of the good-versus-evil divide will continuously compete with one another to win the war of control. It occurred to me, for instance, that perhaps all policemen should be issued with a zapper that brings to a dead halt any SDV that appears to be behaving dangerously. That may be too drastic a solution but, believe me, we will need some solution. My point is that we are entering a brave new world, and we need to properly think through all the problems we are going to encounter. We particularly need to ensure that SDVs become an accepted and safe reality.
I did not want our debate on the safety of these vehicles of the future to pass without a serious commitment from the Government to being always on the alert to control or at least minimise this safety problem. Therefore, by way of a question, I would like reassurance from the Minister that, before companies can be licensed to produce SDVs, there will be checks, monitoring and even the holding of emergency real-life exercises with the police to test what they would do if a dangerous hacker got control of a vehicle.
Will the Government commit to ongoing vigilance over the licensing process, the manufacturers, the operators, the car hire companies, the taxi services and the so-called Uber 2s, and so on, to minimise the dangers from malicious hackers? I realise, of course, that all this vigilance will not eradicate the danger of hacking into such self-driving devices. It is clear that we are unlikely to ever see the end of people trying to get into our other devices, our banking services and the like, but I hope that ongoing vigilance will at least minimise this particular safety risk.