Automated and Electric Vehicles Bill (Third sitting)
It is a pleasure to serve under your chairmanship once again, Sir Edward. I have had a number of informal chats with the Minister as we have bumped into each other around the House, and I appreciate his approach to the Bill. My amendments genuinely seek to probe an area that I find fascinating: the interaction between artificial intelligence and human behaviour. Nowhere will that interaction become more prevalent over the coming years than in our transport systems, and my amendments probe the areas where I think it comes into sharp focus.
When we boil it down, we are legislating for vehicles that are driven by computer software, as we heard in the evidence. The witnesses told us on Tuesday that we are legislating exclusively for tiers 4 and 5 of the five tiers. The tiers begin with driver-assistance systems such as braking, steering and parking aids; move through tier 3, at which a vehicle can switch between being driven by a human and being driven by software and which overlaps into tier 4; and end at tier 5, which is the purely automated vehicle. The legislation really challenges us as legislators, because by simplifying the insurance system we are being asked to let our roads become laboratories in which to sharpen that technology. We heard clearly in the evidence that there were different attitudes to what is taking place. When asked about tier 5 technology, Mr Wong of the Society of Motor Manufacturers and Traders said:
“As to when those level 5 vehicles without steering wheels are capable of performing end-to-end journeys—from my house in the village to my office in the city—that is anybody’s guess. That will probably be some time in the 2030s. It is quite complex.”––[Official Report, Automated and Electric Vehicles Public Bill Committee, 31 October 2017; c. 43, Q98.]
However, we then heard from Mr Boland of Five AI, who told us that automated vehicles would be on our roads in 2019, albeit in an experimental fashion.
This is a big challenge for us. We need to consider the software in great detail, and the Secretary of State needs to be given the power to set and oversee certain standards. Mr Wong referred to the report written by the Ethics Commission on Automated Driving for the German Federal Ministry of Transport and Digital Infrastructure. I am a bit of an anorak, so I have started reading that report, although I have not got through all of it in the last 48 hours. It makes fascinating reading. The commission’s approach is that the technology is there to improve safety, whereas our attitude seems to be that it is a technological advance to help industry, and that improving safety and social inclusion will be a by-product a long way down the line.
The operation of the software raises some ethical issues. I asked the witnesses how the software would perform and take decisions when an accident is imminent. For instance, imagine a four-year-old child walking out in front of a vehicle that cannot stop in time to prevent a collision. To the left is oncoming traffic, with the risk of a head-on collision; to the right are perfectly innocent bystanders on the pavement or at the bus stop. Those are the vehicle’s options. Mr Wong noted that this was the “classic trolley problem” referred to in the German ethics commission’s report. The commission’s conclusion was that the decision is simple when the choice is between property damage and human injury, but that when the choice is between different kinds of injury to different road users, or to innocent pedestrians who are not part of the scenario at all, we move into a completely new area of morals and ethics. We have to be prepared for that: these situations will take place on our streets, and we need to legislate for them. We should give ourselves the opportunity to oversee this software before it is allowed on to the streets. Amendment 8 would give the Secretary of State power over the approval of the software, and new clause 11 would set out the approval criteria.
Does not clause 1(1) already cover what amendment 8 seeks to achieve? Paragraph (b) requires that the Secretary of State be satisfied that vehicles are
“designed or adapted to be capable, in at least some circumstances or situations, of safely driving themselves.”
In making that decision, surely the Secretary of State would take into account the nature of the software.
We would hope so. Given the general terms in which the Bill is drafted, that is quite possible. Amendment 8 is a probing amendment and I will not press it to a vote, but this is an area that we as legislators need to scrutinise. The software is key: it is what will be making the decisions, and it is what will be driving the vehicle.
We seem to have started this discussion by treating it as a mechanical problem: how to develop a piece of technology that can read all the different scenarios on our roads and react accordingly. However, when we look at the research on vehicles travelling at different speeds and on delays in the transition between a driver and the automated system, an awful lot of the issues around the software are simply not referred to in the Bill. I am attempting to draw attention to that, and to put in the Bill that the software is the crucial part of the technology and that we should pay attention to it.
I have considered that, and I think that is the assumption. My right hon. Friend has well exposed the logic that underlies the current drafting, and in my view it is in error. Although the material moment is, of course, the moment of the hypothetical accident, from the point of view of the operation of our insurance system the material question is the cause of the accident. If the cause of the accident was a bad decision by the person, yet the insurer of the vehicle nevertheless has strict liability, there is an illogic that will eventually undo all the good we are trying to do. The fact that the person may have handed over control to the vehicle five, 20 or 55 minutes before the accident is irrelevant if the basis on which they handed over control was wrong and they made the wrong decision. It seems to me that the question we need to address is this: is it possible that the person could have made such a wrong decision, or have we eliminated that possibility? That is what I want to get on to, because that is where clause 1(1) needs a paragraph (c) to follow paragraph (b).
Is it not highly likely that this sophisticated vehicle will prevent the driver from seeking to put the vehicle in automated mode if it is unsafe to do so? It will reject the request.
I am grateful to my right hon. Friend for asking that question, because it leads me to exactly the point I want to raise in relation to clause 1(1)(a) and (b) and, as I think there may need to be, a paragraph (c).