Automated Vehicles Bill [Lords] Debate
Wera Hobhouse (Liberal Democrat, Bath), debate with the Department for Transport
Commons Chamber

I will come to that in a second—it will become clear in the next section of my speech—but I can tell the hon. Gentleman that the Bill is about giving people choices. If people want, as many will, to carry on driving their existing vehicles in the traditional way, that is absolutely fine and no one is going to try to stop them. To be very clear, the hon. Gentleman can carry on driving for as long as he wants to and is safe to, and no one is going to try to stop him. Certainly, I am not going to try—I wouldn’t dare.
On the legal concerns—this will address the point about the driving test, too—the Bill redefines our legal relationship with road transport. As soon as someone turns on a self-driving feature, legal responsibility for how the car drives will transfer to an authorised self-driving entity, or ASDE—not a very catchy acronym, admittedly, but that is what they are called. That could be a manufacturer or a software developer but, crucially, it will not be the human driver, who will assume a new status. As a user in charge, they will still need to ensure that the car is roadworthy, and they will need to reassume control if necessary. That answers the hon. Gentleman’s question: someone will still need to be in possession of a full driving licence and able to reassume control of the vehicle if required, but they will be protected by law from any offences while the car is driving itself.
Some journeys, either in private cars or on self-driving transport, will be fully automated, and a human will never need to take control; they will be, in essence, a passenger. My hon. Friend the Member for North East Bedfordshire (Richard Fuller) mentioned the example of Waymo cars in the US. Those are operated as taxis, with no driver present, and the human is never expected to take control; it is classed as a “no user in charge” journey. In those circumstances, someone would not need a driving licence, because they would never be expected to drive the car, in the same way we are not expected to drive a taxi or private hire vehicle. Those legal concepts will have a seismic impact.
This is the future, and it is both quite exciting and quite scary. We have to get our heads around it and make sure that we get this right. On what the Secretary of State has just been describing, is it basically the difference between someone taking a taxi and driving their own car? If there is an accident in a taxi, the taxi company is responsible, not the passenger.
If someone is using a vehicle for a “no user in charge” journey for which they are, in effect, the passenger and there is an accident, it will be totally the responsibility, in all circumstances, of the person operating the vehicle. Where someone who is driving the car for part of the time switches on the self-driving features and something happens while those features are activated, that will be not their responsibility but that of the manufacturer or the software developer. If someone is in control of the vehicle and the self-driving features are not activated, they retain responsibility.
One of the things that we will have to do is educate people about the difference, and we are being clear to manufacturers that there is a big difference between a self-driving feature and driver assistance. Under driver assistance, the driver is still fully legally responsible for the vehicle, but with some technological help; when the self-driving features are activated, they no longer have legal responsibility.
Is there not potential for a legal conflict between a driver who says, “I was in self-driving mode,” and a company that says, “No, it was switched off”? Does the Secretary of State see that it might be very difficult to establish what happened in such circumstances?
Potentially, but that is exactly why the earlier question about data is very important. These vehicles generate a huge amount of data, and one part of the authorisation process will be making sure that that data is properly managed, and that the investigators of any potential accident and the insurance industry have proper access to it to establish exactly what has happened in such circumstances.
The Liberal Democrats welcome the Bill because it takes the first step towards the creation of a framework within which automated vehicles can operate safely. The future of sustainable travel lies in such vehicles, and the UK now has a good opportunity to join the growing number of countries that are embracing this new technology. The tech sector in the UK is particularly strong, and the Bill should give confidence to investors if we are to develop a self-driving vehicle industry and take full advantage of its potential. A large part of that potential relates to road safety: there are still too many road accident victims, and I believe that automated vehicles can contribute significantly to reducing that number if we get this right. The Bill also has the potential to help us reach net zero. We may need to question, and reduce, individual car ownership in future if we want to hit our net zero targets, and automated vehicles may help us to do that.
However, the potential of this industry will only be realised if there is a high level of public confidence in the protections that the Bill gives to public safety—particularly the safety of other road users such as cyclists and pedestrians, who are more at risk than motorists. There is clearly scope for improving the safety of our roads, given that nearly 90% of traffic accidents are caused by human error. Many of the accidents that involve more vulnerable road users, such as cyclists, result from driver impairment or from drivers’ disobeying traffic laws.
Evidence emerging from trials of AVs in San Francisco relating to overall safety improvements is encouraging, but a report of just one thing going wrong will set back efforts to secure public confidence in the safety of these vehicles. It will be important to set out very clearly the scope of any trials in the UK. We may receive reassurances from the industry that the technology is being improved continuously, but we must set out our expectations of what the trials can and cannot achieve. No technology will ever be 100% safe. If there is an interaction between technology and the human being sitting in the car, there is the potential to override the system. The nature of that interaction is almost a philosophical question, which has not been entirely resolved today, but the Minister has been generous in allowing us to raise our concerns.
During the San Francisco trials, issues arose relating to AVs’ hindering emergency vehicles and stopping in cycle lanes, and those need to be addressed. Of course some issues are to be expected in trials, but a repetition of those incidents will damage public trust. People must be confident they will not be repeated on UK streets, and that will require a robust legal and safety framework which will also cover our trials.
The Liberal Democrats welcome the Government’s concession in changing the standard of safety for AV drivers so that they will have to meet or exceed the level of safety of careful and competent human drivers. The implications of that for driving tests have already been mentioned, and it is important for that discussion to continue. The Bill gives us a chance to improve the safety of our road networks for the long term, and we should see this as an opportunity to improve accessibility and safety for the public rather than just maintaining current standards.
Automated vehicles also require adequate infrastructure to support them. The poor state of UK roads has led to the highest number of pothole-related call-outs for the RAC in the last five years. Assurances must be given that improvements in road surfaces will be made before the roll-out of AVs. Will minimum standards for road quality be set for their use, and will local authorities be given the additional resources they will require in order to meet them?
Older and more vulnerable people are more reliant on taxis and private hire cars, a great benefit of which is a driver who can help them with access. The benefits of increased affordability that AVs may bring must not come at the cost of reduced access for disabled and vulnerable users, who will also require assurances about access on automated public transport if it is to be completely unstaffed. We have not talked enough about the human input into this brave new world of automated vehicles and about whether, for instance, someone will be available to assist a disabled person using such a vehicle.
Another area of concern, which has also been mentioned today, is the attention given to data protection in the Bill. It is of course essential that AVs can take in data for machine learning algorithms, which enable them to improve the way in which they navigate. However, a large number of parties will inevitably have access to the data, which will include personal information such as people’s faces. The overlap between commercial and personal data creates issues with access and storage. When data is shared between parties, including private companies, can we be sure that people’s personal data is not being monetised for commercial gain? The Government have not yet given adequate assurances that personal information will be protected.
What about insurance? Insurers have said that the data from AVs must be readily available to establish liability, but drivers must feel confident about how their data is managed. How the data is stored must be open and transparent, and it must be held independently. Establishing a clear path of accountability is essential for public confidence. Cyclists and pedestrians who do not hold personal insurance should receive fair and swift compensation when they are victims of an accident. Further assurance is needed that insurance companies will receive adequate guidance for such claims.
The Liberal Democrats welcome the Bill, but I urge Ministers to carefully review how it will impact on access for disabled and vulnerable transport users. I also encourage the Government to look further at data protection regulation. We must see this Bill as the beginning of a framework, not the end.
The hon. Member is giving a list of things that are absent from the Bill. In my constituency we have autonomous delivery robots, which are currently on pilot; they are not regulated at all in the UK. Is this not another area that the Bill should regulate, in addition to the issues she has raised?
We always try to solve other problems with Bills in front of us, so we have to be a bit careful not to hang something on this Bill that actually goes into other areas, but new technologies create new challenges for all of us. For example, there are safety issues with such deliveries, but that probably requires a separate Bill. However, it is important that the Government make sure that we have adequate regulation of new technologies.
As I said at the beginning of my speech, there are many exciting opportunities for technological change, and we must embrace them. If we do not, other countries will go ahead, and we will end up with these technologies anyway. We must take the public with us, understand the risks and make sure that the huge potential of AVs is seen for what it is, but we must avoid unintended consequences that would lead to the public not coming with us, so let us get this right. It is a great opportunity, and let us make sure that we minimise the risks.
I did not intend to give a speech in this debate—I just wanted to intervene—but as there were so few of us contributing, I thought I would make a short contribution at the end. I am grateful to you for allowing me to do so, Mr Deputy Speaker.
I accept that the time has come for this technology. As somebody who worked in the transport industry for many years prior to becoming a Member of Parliament, I accept that we cannot stand in the way of this technology and that, overall, our road network will be safer with the advance of autonomous vehicles. None the less, there will be occasions when accidents occur, and we have to accept that we will be legislating for how vehicles respond in those circumstances. At the moment, if an accident happens, it happens in real time and people behind the wheels of the vehicles make real-time decisions to try to minimise the impact. However, automated vehicles will have to be programmed in advance to respond in a particular way in certain circumstances—we cannot get away from that. The fact is that the people designing the algorithms will be doing so remotely and well in advance of any accident happening.
Who is the primary person to consider when an accident takes place? Is it the person or persons in the vehicle, or is it the pedestrian? Is it a child, if the vehicle identifies someone as a child? Is it people standing at a bus stop on the side of the road? I will come to that soon when I share the concerns of one of my constituents who came to see me not about autonomous vehicles, but about an accident at a bus stop. These things have to be considered and accounted for when drawing up the algorithms that control automated cars—we cannot get away from that. Who will the algorithm protect in such circumstances? That is one of the challenges that came up when autonomous vehicles were being tested in Greenwich. When someone moved a chair and put it in front of the vehicle, the vehicle did not identify it. If it had been a child, the vehicle would have run them over.
We have to accept that we are going into no man’s land by advancing with this technology. We will need to scrutinise its use, which is why it is right that we are looking to set up a panel that will have oversight of this area and advise the Secretary of State. I accept what the Secretary of State has said: if somebody tinkers with the software, clearly they put themselves outside of their insurance policy and will be liable for any accident that occurs as a consequence. However, both I and my hon. Friend the Member for Warwick and Leamington (Matt Western) have mentioned the Horizon scandal. At the heart of that scandal was Fujitsu, which tried to hide the glitches in its software. We cannot run away from the fact that there is a distinct possibility that something like that could happen when we have automated vehicles that are controlled by software. We must have the ability to scrutinise that and to ensure that people can have confidence in what companies say about the software they develop for automated vehicles.
We are told that we will have these vehicles for 20 to 30 years in co-existence with driven vehicles. What is going to happen when accidents occur? I am sure we will be told, as we were told in 2018 with the Automated and Electric Vehicles Act, that insurance companies will pay up, that these matters will be sorted out later and that they have anticipated every circumstance. We hear that time and again with legislation, but its practical application is where we really find out what is going on. When a driven vehicle has a collision with an autonomous vehicle, will the assumption be that the autonomous vehicle is always right, that the driven vehicle must be wrong and that the accident must be due to human error? I am sure I will be told that we have allowed for that in the legislation, but I am also sure that once it is applied on the roads, this will become a big area of contention.
I am listening very carefully to the hon. Gentleman, and I am thinking about the aviation industry. Aeroplanes are very complicated technologies, yet aviation is one of the safest forms of travel, because each accident is investigated carefully to avoid a similar catastrophe. Does he think that similar structures for investigating accidents should be put in place as a safety mechanism?
Scrutiny of accidents is going to be important, because we will learn a lot. We can improve safety with this technology—there is no question about that. The question is about the moral argument when accidents do happen and how we choose how vehicles should behave in those circumstances.
A constituent has come to me about a tragic case of a child being killed at a bus stop. A lorry lost control and swerved into the bus stop, and the child could not escape the vehicle and was crushed. It is an absolutely tragic story. My constituent came to see me about designing bus stops to make them safer for people standing at the roadside. Having lost her child in such tragic circumstances, I commend her for her consideration in wanting to improve the situation for others. As it is rolled out, this technology could prevent vehicles from colliding with roadside structures such as bus stops, so I accept that it can improve safety. This is an example of where we might be able to meet my constituent’s desire to improve safety in such circumstances.
This technology will need a great deal of scrutiny. We will learn a lot from the application of this legislation as more and more automated vehicles enter our road network, and an advisory council to consider all aspects of the technology is absolutely necessary.
Clause 2 says that the Secretary of State must consult, but the list is very limited and puts businesses, including those that design the vehicles and draw up the algorithms, in prime position above road user representatives and other concerned individuals. The list needs to be much wider, and there needs to be a statutory body to provide oversight. We are on a steep learning curve and we will learn as we go. I accept that we cannot stand in the way of progress, but we must accept that there are serious safety questions that require answers. An advisory council of the kind that has been recommended is absolutely necessary.