Automated Vehicles Bill [Lords] Debate
Clive Efford (Labour - Eltham and Chislehurst)
Commons Chamber debate with the Department for Transport
My right hon. Friend should note that data for these purposes will be protected in the usual way. Data has to be used for the purposes for which it was gathered. There are legal processes for who has access to it, as well as those we will set out specifically for driving purposes. The other things he mentioned will be governed by the usual laws that govern the use of data. I do not want to dwell on those specifics, but they are already covered by existing data protection legislation for the devices that people have in vehicles to monitor their progress or for mobile phones.
I would like to start with safety. Anyone stepping into a self-driving vehicle will reasonably ask: “Can this car consistently drive safely? Will the law protect me if there is an accident? Is the manufacturer regulated and can they be held to account?” Under this legislation, the answer to each of those questions will be yes. The Bill has been built on a bedrock of safety, protecting not just the driver inside the car but road users outside the vehicle.
As I mentioned in answer to the hon. Member for Easington (Grahame Morris), I chaired a roundtable with road safety groups last week and explained how we are holding self-driving vehicles to a higher safety standard than the average human driver, guided by principles we will soon consult on; how these vehicles must meet rigorous technical requirements before rolling off production lines and being authorised for our roads; and how we will tackle misleading marketing, with new offences for companies that seek to blur the line between true self-driving and driver assistance.
That gets to the nub of the point. Because these vehicles are going to be automated, they will be governed by an algorithm written by a human being somewhere remote from where an accident might occur. How do we determine whether the primary purpose of that algorithm is to protect the person in the car or someone outside the car, such as a pedestrian or a child crossing the road? How does the algorithm make a choice in those circumstances?
We will consult on the statement of safety principles, which will set out the governing principles of the legislation. On the specifics, this will be about making sure that the manufacturers—those who create the software and those who put the cars together—have rigorous processes for testing and decision-making. Those systems will have to be authorised to be used in our cars, and it will be important to look at their data and their track records. As I say, in real-world situations where these vehicles are being used—for example, in California—the evidence suggests that they have a very good safety record that is much better than that of human drivers. There is a big opportunity here to have a safer road environment, not just for the users of the vehicles but for other road users.
I do not know whether the hon. Gentleman has had the opportunity to ride in a self-driving vehicle, but the data they collect about their surroundings is interesting. My personal observation is that the space they give when passing a cyclist, for example, is a lot more generous than I have seen many human drivers give. Of course, those parameters are going to be set and regulated, and people will have to be assured that the vehicles are safe before they are on the road. Ultimately, the manufacturer will be legally responsible if they turn out not to be.
I am grateful to the Secretary of State for giving way a second time. I agree entirely that, overall, roads will be safer with automated vehicles, but there will still be accidents. My question was specifically about cases where there is an accident and a choice to be made between protecting the person inside the car and protecting someone outside the car. How do we determine what takes priority in those circumstances?
We will consult on the safety principles, but with some of this stuff we have to look at the way the vehicles make decisions. We cannot possibly legislate for every single set of circumstances. In the same way, when there is a collision involving a vehicle with a human driver, the driver will make the best decision they can in the specific circumstances. Sometimes those situations lead to legal conflict and then people have to make a judgment. We cannot legislate for every single one of those circumstances in advance. What we can do is make sure there are robust systems that make good decisions based on the best data, and then look at the track record. We will also set up a regulatory system whereby any accident involving an automated vehicle will be properly investigated.
The hon. Gentleman raises an important point. It is essential with this legislation that we earn the public’s trust and win their confidence. That is one of the reasons why we have been so clear, and why we accepted the amendments in the other place, about putting safety at the forefront of the Bill. If people are not persuaded of that, this technology will not make much progress.
The simple answer is, yes, we are going to do that. The hon. Gentleman is right to raise cyber-security as an issue, and it is of course an issue today, because many cars already have electronic features, from keyless entry to navigation systems. Existing cars are vulnerable to being hacked. Cyber-security is important, and we and the industry are working with the National Cyber Security Centre. I agree that cyber-security will be very important, but it already is important.
I agree with what the Secretary of State said about tinkering with the software nullifying any insurance, but we have also just experienced the Horizon scandal, where the manufacturers themselves had access to the technology. What protection do drivers have against the designers of the software governing these cars covering their own backs?
One of the things we will have in place is a duty of candour. We will also set up a regulatory process with investigations of every self-driving vehicle involved in an incident. Importantly, manufacturers will be legally obliged, under that duty of candour, to disclose the information, so that we can get to the bottom of these issues. The hon. Member raises a specific case that I will not comment on, and there will no doubt be lessons to be learned from that case, but the regulatory approach we are setting up will deal with the issue he just raised.
I did not intend to give a speech in this debate—I just wanted to intervene—but as there were so few of us contributing, I thought I would make a short contribution at the end. I am grateful to you for allowing me to do so, Mr Deputy Speaker.
I accept that the time has come for this technology. As somebody who worked in the transport industry for many years prior to becoming a Member of Parliament, I accept that we cannot stand in the way of this technology and that, overall, our road network will be safer with the advance of autonomous vehicles. None the less, there will be occasions when accidents occur, and we have to accept that we will be legislating for how vehicles respond in those circumstances. At the moment, if an accident happens, it happens in real time and people behind the wheels of the vehicles make real-time decisions to try to minimise the impact. However, automated vehicles will have to be programmed in advance to respond in a particular way in certain circumstances—we cannot get away from that. The fact is that the people designing the algorithms will be doing so remotely and well in advance of any accident happening.
Who is the primary person to consider when an accident takes place? Is it the person or persons in the vehicle, or is it the pedestrian? Is it a child, if someone is identified as being a child? Is it people standing at a bus stop on the side of the road? I will come to that soon when I share the concerns of one of my constituents who came to see me not about autonomous vehicles, but about an accident at a bus stop. These things have to be considered and accounted for when drawing up the algorithms that control automated cars—we cannot get away from that. Who will the algorithm protect in such circumstances? That is one of the challenges that came up when autonomous vehicles were being tested in Greenwich. When someone moved a chair and put it in front of the vehicle, the vehicle did not identify it. If it had been a child, the vehicle would have run them over.
We have to accept that we are going into no man’s land by advancing with this technology. We will need to scrutinise its use, which is why it is right that we are looking to set up a panel that will have oversight of this area and advise the Secretary of State. I accept what the Secretary of State has said: if somebody tinkers with the software, clearly they put themselves outside their insurance policy and will be liable for any accident that occurs as a consequence. However, both I and my hon. Friend the Member for Warwick and Leamington (Matt Western) have mentioned the Horizon scandal. At the heart of that scandal was Fujitsu, which tried to hide the glitches in its software. We cannot run away from the fact that there is a distinct possibility that something like that could happen when we have automated vehicles that are controlled by software. We must have the ability to scrutinise that and to ensure that people can have confidence in what companies say about the software they develop for automated vehicles.
We are told that we will have these vehicles for 20 to 30 years in co-existence with driven vehicles. What is going to happen when accidents occur? I am sure we will be told, as we were told in 2018 with the Automated and Electric Vehicles Act, that insurance companies will pay up, that these matters will be sorted out later and that they have anticipated every circumstance. We hear that time and again with legislation, but its practical application is where we really find out what is going on. When a driven vehicle has a collision with an autonomous vehicle, will the assumption be that the autonomous vehicle is always right, that the driven vehicle must be wrong and that the accident must be due to human error? I am sure I will be told that we have allowed for that in the legislation, but I am also sure that once it is applied on the roads, this will become a big area of contention.
I am listening very carefully to the hon. Gentleman, and I am thinking about the aviation industry. Aeroplanes are very complicated technologies, yet aviation is one of the safest forms of travel, because each accident is investigated carefully to avoid a similar catastrophe. Does he think that similar structures for investigating accidents should be put in place as a safety mechanism?
Scrutiny of accidents is going to be important, because we will learn a lot. We can improve safety with this technology—there is no question about that. The question is about the moral argument when accidents do happen and how we choose how vehicles should behave in those circumstances.
A constituent has come to me about a tragic case of a child being killed at a bus stop. A lorry lost control and swerved into the bus stop, and the child could not escape the vehicle and was crushed. It is an absolutely tragic story. My constituent came to see me about designing bus stops to make them safer for people standing at the roadside. Having lost her child in such tragic circumstances, I commend her for her consideration in wanting to improve the situation for others. As it is rolled out, this technology could prevent vehicles from colliding with roadside structures such as bus stops, so I accept that it can improve safety. This is an example of where we might be able to meet my constituent’s desire to improve safety in such circumstances.
This technology will need a great deal of scrutiny. We will learn a lot from the application of this legislation as more and more automated vehicles enter our road network, and an advisory council to consider all aspects of the technology is absolutely necessary.
Clause 2 says that the Secretary of State must consult, but the list is very limited and puts businesses, including those that design the vehicles and draw up the algorithms, in prime position above road user representatives and other concerned individuals. The list needs to be much wider, and there needs to be a statutory body to provide oversight. We are on a steep learning curve and we will learn as we go. I accept that we cannot stand in the way of progress, but we must accept that there are serious safety questions that require answers. An advisory council of the kind that has been recommended is absolutely necessary.