Commons Chamber
I did not intend to give a speech in this debate—I just wanted to intervene—but as there were so few of us contributing, I thought I would make a short contribution at the end. I am grateful to you for allowing me to do so, Mr Deputy Speaker.
I accept that the time has come for this technology. As somebody who worked in the transport industry for many years prior to becoming a Member of Parliament, I accept that we cannot stand in the way of this technology and that, overall, our road network will be safer with the advance of autonomous vehicles. None the less, there will be occasions when accidents occur, and we have to accept that we will be legislating for how vehicles respond in those circumstances. At the moment, if an accident happens, it happens in real time and people behind the wheels of the vehicles make real-time decisions to try to minimise the impact. However, automated vehicles will have to be programmed in advance to respond in a particular way in certain circumstances—we cannot get away from that. The fact is that the people designing the algorithms will be doing so remotely and well in advance of any accident happening.
Who is the primary person to consider when an accident takes place? Is it the person or persons in the vehicle, or is it the pedestrian? Is it a child, if someone is identified as being a child? Is it people standing at a bus stop on the side of the road? I will come to that soon when I share the concerns of one of my constituents who came to see me not about autonomous vehicles, but about an accident at a bus stop. These things have to be considered and accounted for when drawing up the algorithms that control automated cars—we cannot get away from that. Who will the algorithm protect in such circumstances? That is one of the challenges that came up when autonomous vehicles were being tested in Greenwich. When someone moved a chair and put it in front of the vehicle, the vehicle did not identify it. If it had been a child, the vehicle would have run them over.
We have to accept that we are going into no man’s land by advancing with this technology. We will need to scrutinise its use, which is why it is right that we are looking to set up a panel that will have oversight of this area and advise the Secretary of State. I accept what the Secretary of State has said: if somebody tinkers with the software, clearly they put themselves outside of their insurance policy and will be liable for any accident that occurs as a consequence. However, both I and my hon. Friend the Member for Warwick and Leamington (Matt Western) have mentioned the Horizon scandal. At the heart of that scandal was Fujitsu, which tried to hide the glitches in its software. We cannot run away from the fact that there is a distinct possibility that something like that could happen when we have automated vehicles that are controlled by software. We must have the ability to scrutinise that and to ensure that people can have confidence in what companies say about the software they develop for automated vehicles.
We are told that we will have these vehicles for 20 to 30 years in co-existence with driven vehicles. What is going to happen when accidents occur? I am sure we will be told, as we were told in 2018 with the Automated and Electric Vehicles Act, that insurance companies will pay up, that these matters will be sorted out later and that they have anticipated every circumstance. We hear that time and again with legislation, but its practical application is where we really find out what is going on. When a driven vehicle has a collision with an autonomous vehicle, will the assumption be that the autonomous vehicle is always right, that the driven vehicle must be wrong and that the accident must be due to human error? I am sure I will be told that we have allowed for that in the legislation, but I am also sure that once it is applied on the roads, this will become a big area of contention.
I am listening very carefully to the hon. Gentleman, and I am thinking about the aviation industry. Aeroplanes are very complicated technologies, yet aviation is one of the safest forms of travel, because each accident is investigated carefully to avoid a similar catastrophe. Does he think that similar structures for investigating accidents should be put in place as a safety mechanism?
Scrutiny of accidents is going to be important, because we will learn a lot. We can improve safety with this technology—there is no question about that. The question is about the moral argument when accidents do happen and how we choose how vehicles should behave in those circumstances.
A constituent has come to me about a tragic case of a child being killed at a bus stop. A lorry lost control and swerved into the bus stop, and the child could not escape the vehicle and was crushed. It is an absolutely tragic story. My constituent came to see me about designing bus stops to make them safer for people standing at the roadside. Having lost her child in such tragic circumstances, I commend her for her consideration in wanting to improve the situation for others. As it is rolled out, this technology could prevent vehicles from colliding with roadside structures such as bus stops, so I accept that it can improve safety. This is an example of where we might be able to meet my constituent’s desire to improve safety in such circumstances.
This technology will need a great deal of scrutiny. We will learn a lot from the application of this legislation as more and more automated vehicles enter our road network, and an advisory council to consider all aspects of the technology is absolutely necessary.
Clause 2 says that the Secretary of State must consult, but the list is very limited and puts businesses, including those that design the vehicles and draw up the algorithms, in prime position above road user representatives and other concerned individuals. The list needs to be much wider, and there needs to be a statutory body to provide oversight. We are on a steep learning curve and we will learn as we go. I accept that we cannot stand in the way of progress, but we must accept that there are serious safety questions that require answers. An advisory council of the kind that has been recommended is absolutely necessary.