(4 years, 9 months ago)
Lords Chamber
My Lords, I notify the House of the retirement with effect from today of the noble Lord, Lord Elystan-Morgan, pursuant to Section 1 of the House of Lords Reform Act 2014. On behalf of the House, I thank the noble Lord for his much-valued service to this House.
To ask Her Majesty’s Government what progress has been made in reducing the disability employment gap.
My Lords, in begging leave to ask the Question standing in my name on the Order Paper, I declare my interest as a vice-president of the National Autistic Society.
My Lords, the employment rate for disabled people stands at 53.2%, having increased by 9.8 percentage points over the past six years. The employment rate gap between disabled and non-disabled people has fallen by 5.6 percentage points over the same period.
I welcome, as I am sure the whole House does, the improvement the Minister just told us about, but there has been no increase whatever in the number of autistic people in work for the last 10 years. Just 16 in every 100 people who are autistic are in full-time employment. Addressing the lack of understanding about autism across business and industry is key to trying to solve this problem. Will the Government establish an information hub, providing employers with support and information to improve recruitment of autistic people? Could I tempt her to be even more daring and perhaps consider creating an autism accreditation scheme so that participating companies get full recognition for the efforts they put in?
The noble Lord makes very accurate and real points. I spoke to the National Autistic Society this morning. Some 16% of autistic adults are in work and 32% of them are in some kind of paid work, but the real statistic is that 77% of unemployed autistic adults want to work. The noble Lord rightly pointed out that we must get to that figure. The disability hub is a great idea. I will go back to the department with yet another idea—their eyes roll now when I walk in, but I will do it. I will not be put off by that. I can confirm that the Government are also working with the Supported Business Alliance and the British Association for Supported Employment to help them develop a new quality mark for supportive businesses and develop a long-term element of access to work to continue the support. However, there is no doubt that we have a lot more to do and I will take both those ideas back to the department.
My Lords, we do not have any flesh on the bones of the national disability strategy yet. There are many issues involved in closing the disability employment gap: suitable housing, adequate care and better education opportunities, to name but a few. Will the Minister consider hosting a round table with Members of this House who have expertise in this subject, so that we have as much consensus as possible going forward?
How can I say no to the noble Baroness? That is another great idea. It fits very well with the national disability strategy, which will, I am pleased to say, be developed with disabled people and disability charities and organisations, and will cover the areas outlined by the noble Baroness—housing, education, transport and jobs—so that people can improve their lives. I will be delighted to go back to the department, not to suggest a round table but to say that we are having one.
My Lords, can I add another idea to my noble friend the Minister’s list? It is testament to the influence of your Lordships’ House that only last week I introduced a Bill on exactly this issue, which already has the backing of major corporates such as EY and Enterprise Holdings. They know that there must be a level playing field for rewarding and incentivising best practice. Will the Minister take this idea back not just to her department but to the Government as a whole, for incorporation in the forthcoming employment Bill, so that the mandatory gender pay gap reporting duty is extended to other protected characteristics, including disability?
I am going to start singing “I’ve Got a Little List” in a minute. I congratulate the noble Lord on his tenacity in this area. His work on Able to Excel and his Private Member’s Bill were excellent. In 2018, the Government published a voluntary reporting framework on disability, mental health and well-being in the workplace, aimed at large employers—those with more than 250 employees. In November, we announced the new level 3 of Disability Confident. We must work with businesses to crack that 77% of people who want to work. Employers create jobs; we must work closely with them. My noble friend’s work will help with this. I will arrange for him to come in and do the sell on that one.
My Lords, in following up the point made by the noble Lord, Lord Touhig, about information for employers, does the Minister agree that often we are not asking for something profound or difficult? It is just tackling basic information about the subject. We could do a great deal with very little effort.
I could not agree more with the noble Lord. I was on a project recently where a young girl with bags of potential who had epilepsy thought she would never get a job because she thought that nobody would risk having her in their establishment. The people running the project found a lady who ran a business and who was epileptic. She said, “You send her down to me.” She is now employed as a legal secretary. That did not take a great deal of effort. The way for us to make headway with those statistics is by remembering that everybody is an individual and by spending time working out a strategy for the individual.
My Lords, in this important area Britain is proud to be a leader in many ways—in technology, computing and so on—and many of these projects, which are transforming the lives of some people with certain kinds of disabilities, have been run across Europe, so there are worries that some of these projects may not continue. Can the Minister assure the House that priority will be given to helping this world-leading development continue? It is making an impact on people with disabilities not only in our own country but right across the world as the technology is rolled out.
Whether our people have a disability or they are well able, the jobs that we want them to get into will focus on technology in the future. I cannot give a categoric assurance that those projects will continue, but I can give a categoric assurance that we will continue to focus on the tech industry. I will go back and ask another question and, if I survive that, I shall write to the right reverend Prelate and let him know the outcome.
To ask Her Majesty’s Government whether they intend to continue to provide funds for Bahrain through the Integrated Activity Fund following the decision of the Fourth Supreme Court of Appeals in Bahrain to uphold the death sentence in respect of Mohamed Ramadan and Hussain Moosa.
My Lords, progress on human rights reform has been made, but there remains more to do. The United Kingdom is committed to supporting Bahrain-led reform, including through carefully targeted assistance and private and public engagement. We are clear that disengaging or criticising from the sidelines is less likely to deliver the positive reform that Bahrain and the international community seek.
I have been given the official dossier from the Bahraini Special Investigations Unit, which reveals that its investigation into the torture allegations of the death row inmates Mohamed Ramadan and Hussain Moosa was inconsistent, contradictory and contravened international standards. The dossier shows that the SIU, which the noble Lord maintains is transparent, is quite the opposite and is now implicated in human rights abuses. In the light of this, will the Minister agree to a meeting with me and representatives from the Bahrain Institute for Rights and Democracy to discuss this dossier, the IAF funding and why these two men appear to have been deemed guilty by the Bahraini authorities even before they went to the dock?
My Lords, I am always happy to meet and we can look into that. On the noble Lord’s more specific point, I beg to differ. It was because of the United Kingdom’s investment in and provision of technical support, particularly for the oversight authorities, that the cases of Hussain Moosa and Mohamed Ramadan were looked at again. The noble Lord shakes his head but that is a fact. Of course, we regret the fact that the death penalty prevails as a form of sentencing in Bahrain. In that respect, I assure the noble Lord that I, the ambassador and my right honourable friend Dr Murrison, the Minister responsible, have made it known that we do not believe the death sentence should prevail, and we will continue to make that case to the Bahraini authorities.
My Lords, the noble Lord makes the point about positive engagement and seeking change, and I know that the UK is funding the alternative non-custodial sentencing programme. However, we now have a report from eight UN experts on this programme, saying that it discriminates against human rights defenders. What does the noble Lord say about that when he is the Minister responsible for human rights and his own programmes are discriminating against them?
Given his background, I am sure the noble Lord will know that we worked directly with UNDP on that programme and we have been working on this issue. He raised the issue of alternative sentencing and we have seen positive outcomes: up to 1,000 people have now been considered for alternatives to prison sentences. The noble Lord rightly raises genuine concerns about human rights and those continue. As I said in my original Answer, we are far from where we want to be but our continual engagement with the Bahraini authorities is producing results.
Does the Minister agree that the British judicial system is one of the best in the world, if not the best, and that many countries have benefited from training by our judiciary? Can he offer additional training to some of the countries that we know well have uncertain outcomes to their judgments? I know how much judges of the countries I served in welcome visits, support and training from our judicial system.
My noble friend makes a helpful suggestion, but we are seeking to do more. In this respect, judges from Bahrain have visited Crown Courts and magistrates’ courts in the United Kingdom and we continue to engage with the judiciary on this point.
My Lords, in the past hour the UN has called on Bahrain to prevent the execution of these two men, saying:
“Admission of evidence obtained under torture into any proceeding violates the rights to due process and fair trial and is prohibited without exception. If carried out in these circumstances, the death penalty would constitute an arbitrary killing.”
Does the Minister agree?
My Lords, I am aware of those reports, but I have not seen the full detail. On a previous occasion last year, when the death penalty was also passed, I made a direct intervention. Unfortunately, that death sentence was not reversed. Subsequently, at the Human Rights Council, we made specific reference under Item 2 on the death penalty and will continue to do so. I will review the report the noble Baroness mentioned on my return. It remains the consistent position of the United Kingdom Government that the death penalty should not be part of sentencing policy. We continue to make that case with Bahrain and elsewhere.
To ask Her Majesty’s Government what plans they have to review the rules for referendums.
My Lords, the rules on referendums are set out in the Political Parties, Elections and Referendums Act 2000, and the Government have no plans to review or change them. An Act of Parliament is required before any UK-wide referendum can be held. This means that all referendum legislation is thoroughly debated, and Parliament can decide to legislate for different rules for each referendum.
I do not like referendums for governing this country, but if we are going to have them, particularly on major issues, is it not important that we have a minimum turnout and a higher level of approval in order to make major changes? The horrors of the way that politics and economics were divided in this country on Brexit were bad precisely because it was a narrow division, and it would be even worse if the same happened in Scotland or, particularly, in Northern Ireland. We need to think about a minimum turnout and a minimum level of approval in order to make a major change acceptable. That needs to be agreed, and everybody has an interest in agreeing it.
My Lords, I recognise the noble Lord’s consistent interest in this topic. However, Parliament would need to go down that kind of road with a good deal of care. This country has no history of applying thresholds to the making of laws, for example, or the electing of our representatives. Both those things require a simple majority. To start applying special thresholds for referendums would require special and clear justification.
My Lords, is my noble friend aware that, on 24 April, I will be producing a Bill for the House to discuss the very issues raised by the noble Lord, Lord Soley? Will my noble friend commit to approaching that debate with an open mind, bearing in mind that referendums are incompatible with representative parliamentary democracy?
My Lords, I look forward to debating my noble friend’s Bill when it comes before us for Second Reading. I think it would be premature for me to set out the Government’s position on the Bill today. We will do so, as we do with all Private Members’ Bills, at the Second Reading, but I can assure my noble friend that we shall approach it with an open mind.
My Lords, do the Government accept that the worst failures with the 2016 referendum were concerned with transparency and funding? We still do not know who paid how much and for what and whether some significant sums were from illegal foreign sources. Strong recommendations have been made by a number of official bodies that the Government need to act on this, yet we have had no response. The long-awaited ISC report on Russian influence may be very relevant here. When will the Prime Minister authorise its publication?
My Lords, the first duty of government is to safeguard the nation, and we treat the security and integrity of our democratic processes extremely seriously. We have no evidence to show that there was any successful interference in the EU referendum. However, as I said, we take any allegations of interference in our democratic processes extremely seriously. My understanding is that the report referred to by the noble Lord has been released by the Prime Minister.
When we look at the last three referenda—on the voting system in 2011, on Scottish independence in 2014 and on EU membership in 2016—one of the bizarre characteristics is that, before the ink was dry on the results of those referenda, the losers were campaigning for a second referendum to reverse the first one. Therefore, should one characteristic of future referenda not be a minimum interval before the same question is asked again? Otherwise, you have an absurd situation where referenda designed to be for a generation are in danger of being reversed within six months.
Does the Minister accept that there has been some change since 2016? Although it is true that we have limited information about the success of the campaigning from outside the UK on that occasion, and limited information about what happened in the election of President Trump, there is today much more evidence about disinformation campaigning and there are many reports, including by the Oxford Internet Institute, which give us great cause for worry about the future of democracy. Does the Minister agree?
My Lords, we are very concerned and absolutely determined to protect the integrity of our democracy and our elections. As I have said, we are doing that by addressing in particular the mechanisms for electoral fraud through the introduction of voter ID and by banning postal vote harvesting. We have already announced a range of measures to strengthen and protect our democratic processes. These include commitments to launch a consultation on electoral integrity and to implement a digital imprint regime for online election material.
My Lords, I express considerable sympathy with the sentiments expressed by the noble Lord, Lord Soley, about referendums. I have long held considerable doubts about using a 50+1 mechanism for bringing about significant constitutional change. I am also incredibly fearful of using that method to bring an end to the union with Northern Ireland and establish a united Ireland; the consequences are likely to be severe. Will my noble friend the Minister look again at thresholds in referendums? There is a precedent in 1979, when the referendum in Scotland required not just a majority of those voting at the ballot box but 40% of the electorate as a whole to back the proposals.
That is true, but we have never gone down that road in any of the subsequent referenda. There would be serious challenges in doing so. First, Parliament would need to decide what level of participation confers legitimacy; I do not think that is a straightforward issue at all. If one had a threshold related to voter turnout, the inflexibility of such an arrangement could easily prove counterproductive and have the paradoxical effect of equating non-participation with no vote, because low levels of participation can void a given result. That could cause a great deal of disquiet among the public.
My Lords, the Minister said that the Prime Minister has released the report on Russian potential interference in our electoral process. Can he say whether it has been published or, if it has not, when it will be published? If it has been published, can he make sure that copies are available in the Printed Paper Office?
My Lords, the Minister mentioned the digital imprint scheme that was announced last May. However, the Government announced that they could not possibly bring it in for the election that we just had in December. When will the Government bring in this imprint regime that will allow voters to have financial transparency and assess the credibility of online advertising? Will it be in place for the elections in May?
We are absolutely clear that we want to introduce that mechanism as quickly as we can. It will ensure greater transparency. As the noble Baroness said, it will make it clearer to the electorate who has produced and promoted online political materials. I would love to be more specific about the timing. Unfortunately I cannot, beyond saying that we will make an announcement in due course and will do so as speedily as we possibly can.
To ask Her Majesty’s Government what (1) ships, and (2) other vehicles, will be used to strengthen the Fishery Protection Squadron; and when such vehicles will be ready for active service.
My Lords, in England the Marine Management Organisation has contractual arrangements with the Royal Navy for two offshore patrol vessels. The Royal Navy will be increasing its offshore vessels from four to eight over 2020. In addition, the MMO has two vessels and two aircraft as well as 22 patrol vessels from the inshore fisheries conservation authorities at its disposal. The MMO works closely with the devolved Administrations, which have their own fisheries enforcement assets.
I thank the Minister for that very full response. It is important that we get this right. We have 80,000 square miles of water to look after and, as experience shows, that sometimes goes wrong; some 28 frigates were involved in the cod wars. As an aside, I gather that the Government are reducing our number of frigates to nine by 2036, which is a bit of a shock.
My questions relate to the OPVs. We will find them difficult to man because they are being run extra to what was originally intended. Has there been any consideration of using RNR crews to man them and tying those crews specifically to RNR units? Is HMS “Clyde” going to be used? Lastly, will we get the MMO co-located with the NMOC so that they can co-ordinate these operations?
My Lords, I am answering for the Government but from Defra rather than from the Ministry of Defence. I shall run through the ships because I think it will be helpful. HMS “Forth”, HMS “Medway”, HMS “Trent”, HMS “Severn”, HMS “Tamar” and HMS “Spey” are either in operation or coming forward. With regard to HMS “Clyde”, the lease ends on 31 March 2020. So, as I have outlined, this will be additional to HMS “Tyne” and HMS “Mersey”. They are specifically directed to help us with fisheries, and those ships will be engaged in a number of duties.
On the point about co-ordination, as I have mentioned before, one of the advances is the Joint Maritime Operations Coordination Centre, which exists precisely to ensure that we optimise and co-ordinate the development of UK maritime assets across government agencies, including the seagoing craft owned by Border Force, the Royal Navy, the Maritime and Coastguard Agency, the Association of Inshore Fisheries and Conservation Authorities, the MMO and others. I will look at all the points the noble Lord has made, but there is a lot more co-ordination. In addition, the MMO now has 75 marine enforcement officers working with the Royal Navy.
My Lords, does my noble friend agree that the key to fisheries protection is to catch foreign vessels in the act of fishing? To what extent will the Government use remote electronic monitoring, and are they considering making this a qualification for issuing a licence to foreign fishermen?
My Lords, access for foreign vessels will be subject to negotiation but clearly, as the Fisheries Bill states, they will require a licence. One important additional point is modern technology. A vessel monitoring system (VMS) has been in force for vessels over 12 metres since 2013, and we will be introducing VMS for vessels under 12 metres as well so that we get a more accurate picture of fisheries’ location and activity. The noble Lord and I went up to Newcastle to see the MMO. It can detect all vessels in operation in our waters, so that we are in a better position to ensure that our waters are properly fished.
My Lords, can the Minister inform us how the Scottish waters, under the Scottish Fisheries Protection Agency, will be integrated with the English and Welsh, and even Irish, waters? How will the three Scottish fishery protection vessels be integrated with MMO?
As I said in my first reply, co-ordination and collaboration with all the devolved Administrations—indeed, the four fisheries administrations —is absolutely key. Marine Scotland is represented on JMOC. In addition to the three vessels referred to by the noble Earl, it has two aeroplanes for aerial surveillance. The point is that there is collaboration with all four fisheries administrations to ensure that all UK waters are better protected.
My Lords, does the Minister agree that fishing vessels are very sophisticated nowadays? They know when a large, grey naval vessel is about to go over the horizon, so surely, exactly as the noble Baroness, Lady McIntosh, said, we must put more investment into electronic surveillance—aerial surveillance and satellite surveillance. We must also ensure that all vessels fishing in UK waters are on an equal footing, and that all comply.
This issue goes back to the very essence of sustainability and the reason why we need to do this. Heightened surveillance is in the long-term interests of the fishing fleets—for all vessels, whether they are foreign and subject to negotiation, or our own. It is about ensuring that sustainable stocks are in our waters and are fished properly. That is why, as I outlined, we have the electronic reporting and data system, the vessel monitoring system and even more innovative technologies to complement what we already have. This issue is really important.
My Lords, have the Government thought of ordering these new fishery protection vessels from Appledore shipyard, which needs some more orders? I believe that it built some vessels for Ireland. If not, where are these new vessels being procured from?
My Lords, I cannot speak of Appledore, although I have heard of it, but under the terms of the Fisheries Bill there will be opportunities for varying grants, including for port infrastructure. We will clearly need to think about this area because we want to have vibrant coastal communities, not only through vessel repairs and construction but by having vibrant fishing fleets in sustainable waters.
My Lords, I beg to move that this Bill be now read a second time.
My Lords, I would like to put on record the considerable disappointment on these Benches that the Secretary of State for Digital, Culture, Media and Sport will not be making an Oral Statement on the Government’s initial response to the White Paper on online harms. I seek an assurance from the Government Chief Whip, or indeed the Government Deputy Chief Whip, that government time will be made available for a full debate on the response to the White Paper.
My Lords, I note what the noble Lord has to say and I will discuss it with my noble friend the Chief Whip.
Motion
That the House do now resolve itself into Committee.
My Lords, I beg to move that the House do again resolve itself into a Committee upon the Bill.
My Lords, I shall speak also to Amendment 26. These are probing amendments about the phrase, in lines 8, 6 and 16 on page 64,
“the person is controlling the unmanned aircraft”
and seek the Minister’s response to a query I raised at Second Reading as to whether it would encompass all instances concerning an airborne unmanned aircraft where a constable required a person to ground it. As unmanned aircraft and drone technology advances, there may be pre-programmable types that, once airborne, will no longer be under active control from the ground.
As we advance into 5G, it might be possible for two or more individuals to have apps on their smartphones able to handle more than a single drone and to pass control of them from one person to another. On a bigger scale, and as we know, this is what happens now when RAF operators controlling an RAF Predator UAV in the air over Syria from their base station in the United States pass monitoring and control to another operational team at RAF Waddington in the UK. Such control-sharing activity, scaled down, must be widely available soon, if it is not already.
With an app on a smartphone, I and many others can already turn lights or other devices on or off in our home at any time and from anywhere in the world with a wi-fi link. It will surely be possible for AN Other on the ground to switch from one UA onboard programme to another with just a smartphone. Noble Lords may have further suggestions of how and in what way drone and other unmanned aircraft capabilities will advance.
My amendment seeks to probe whether the present wording of Schedule 8, about an individual “controlling”, is sufficiently embracing to meet present and future possibilities of unmanned aircraft operational misuse that a constable wants to stop. The amendment would cover more than in-hand control while airborne, which smacks merely of attempting to deal with the single hobby-type user. Until an incident has been investigated, it may not be clear whether the operator is just a lone nuisance type, as may have happened at Gatwick, or a member of some terrorist team with advanced technology at their disposal. In other words, is the present wording of the Bill sufficiently comprehensive for a constable to act to cover all types of possible future operation that could be unlawful? Indeed, what should the constable be required to do if the operator is not physically controlling the flight of the UA? Perhaps this, too, needs to be covered for completeness, for I doubt that even my amendment would be adequate.
This is my second amendment and, as I mentioned, it is purely probing, to seek a response from the Minister to my concern that the present phrasing may be overly restrictive and so inadequate. I beg to move.
My Lords, I regret that I was unable to play a part at Second Reading, or indeed earlier in Committee, but I have a professional background in aviation, which some noble Lords will know about and which is declared in the register of interests, so I was particularly interested in the noble and gallant Lord’s amendment.
One thing that the civilian helicopter community does is patrol pipelines for gas, oil and all sorts of other things. Something that has begun to worry some of us is that a helicopter, for example, following a pipeline to inspect it and ensure that it subscribes to all the parameters the oil company wants of it might meet either a drone coming the other way—because drones can do that job—or a drone that is crossing the route because it is doing something else. If the necessary controls are not there, how can we ensure that the conflict is removed? Who will have responsibility for it? If the drone is autonomous and not within the geographical boundaries that have been set for it, where does responsibility lie?
These are real issues and it is the responsibility of all of us in aviation to ensure that airspace is properly managed. It concerns me, as chairman of an organisation that flies aircraft—helicopters, particularly—on these pipeline patrols, that a drone coming the other way, or crossing a pipeline and not under adequate control, could cause an accident. I hope that my noble friend will be able to reassure me.
My Lords, I think the House knows that I used to be an RAF pilot. I express some disappointment that the clerks’ department, somewhere along the line, did not add my name to this amendment and a number of others—but I have accepted the apologies of that department.
There is a vast difference between “in control” and “controlling”. I live on a hill in Sandy, Bedfordshire, and so far I have collected two drones that were, by definition, very close to being over the 400 feet and certainly not in the line of sight. I think it is very important that we differentiate between those who are actually flying the drone and those who might technically own the drone or control the company that is flying the drone, or some other definition. I hope that my noble friend on the Front Bench will recognise that this is not a superficial difference but a very significant one and that we must make sure that there is a clear definition. I thank my noble and gallant friend for raising the matter now.
My Lords, there is a remarkable similarity between the discussions on this amendment and the discussions we have had over the years on self-driving, autonomous cars. The only difference is that this is three-dimensional and the other one is generally two. The noble and gallant Lord, Lord Craig, and the noble Lord, Lord Glenarthur, both gave examples of a question I have long had. The noble Lord, Lord Glenarthur, mentioned two drones meeting over a pipeline or something, but the problem remains: how does a constable identify the person who is in control, or whatever? He is sitting in his car with his machine—or however he is going to do it—but how will he identify that? He cannot really arrest either the drone or the person unless he can identify them first. I hope that the noble Baroness will be able to explain this rather simple bit of logic which has escaped me so far.
My Lords, I thank the noble and gallant Lord, Lord Craig of Radley, for introducing this small group of amendments and giving us the opportunity to probe this wording, because it is incredibly important that we understand that the wording is fit for purpose. While I understand the intention behind his amendments, after careful consideration the Government believe that the existing wording in paragraph 1 of Schedule 8 regarding a person or persons controlling an unmanned aircraft is fit for purpose in relation to both manual and pre-programmed operations.
On Amendment 24, regarding the power for a constable to require a person to ground a UA—unmanned aircraft—a constable could exercise this power in relation to a UA performing a manual or pre-programmed operation if they had reasonable grounds for believing a person or small group of persons to be controlling that aircraft. Where this reasonable belief exists, the constable could require a person to ground the UA regardless of whether it was pre-programmed or not—hence the existing wording is sufficient for the power to be effective in the circumstances envisaged by the noble and gallant Lord.
A similar issue arises in Amendment 26; again, “controlling” refers to the UA when it is being flown either manually or in a pre-programmed mode if it is capable of that. It is therefore our view that the distinction that the amendment seeks to make would have no discernible benefit, since the description implies a person controlling a UA in line with the existing wording in the Bill. However, the Government recognise that UA technology is constantly evolving, and we will continue to keep our policies under review to ensure that they remain fit for purpose.
On the point made by my noble friend Lord Glenarthur about helicopters and pipelines, he is quite right that unmanned aircraft will increasingly be used for tasks such as patrolling pipelines, railways and all sorts of other things. However, under the current regulations drones should not fly over 400 feet and must remain within line of sight—to go beyond line of sight is against the regulations. They must have permission to do either of those two things. To get that permission, one would assume that those operating the helicopter would be aware that there might be drones operating in that area.
On the point made by the noble Lord, Lord Berkeley, about identifying the person, the constable must have a reasonable suspicion that the person is controlling the unmanned aircraft. That is not infallible, but a reasonable suspicion is not certainty. Therefore, given that the drone must remain within line of sight, the person controlling it will probably be visible.
I hope that, based on this explanation, the noble and gallant Lord will feel able to withdraw his amendment.
I thank the Minister for her reply, which I shall obviously want to look at. I am still left very unclear about the depth of thought that has been given to this. She talks about situations where somebody is obeying the law and this does not matter, but I am concerned about the individual who is not obeying the law—who is flying above 400 feet and beyond the line of sight of their drone. It seems to me that more is required than is presently available in the Bill—but at the moment I beg leave to withdraw my amendment.
My Lords, this too is a probing amendment, as it does not fully capture the intention behind the issue of confiscation of equipment that I raised at Second Reading. The Minister at the time led me to believe that she would seek to answer in correspondence the issue that I raised, but the letter that Members have received does not refer to it. I do not blame her; perhaps she might deal with it today.
I have carefully read the Explanatory Memorandum and the Bill, and the only reference to confiscation is paragraph 2(6) of Schedule 8, which states:
“A constable may seize anything that the constable discovers in the course of a search under this paragraph if the constable has reasonable grounds for believing that it is evidence in relation to a relevant ANO offence or a relevant prison offence.”
That part of the Bill seems to relate only to condition C under paragraph 2(5) of Schedule 8; in other words, it relates only to prison intervention by drone-related offences. My Amendment 25 would add only a right of appeal for restoration of property. I am worried that I see no reference to confiscation under any other schedules to the Bill. I will concentrate my remarks on the links between Schedules 8 and 10, which is the subject of a later reference in this group—or it was until this morning, when I came here and found that there had been a regrouping.
Before doing so, I will make a few general comments. First, have we any estimate of the numbers of drones available for use in the United Kingdom, of all types, commercial and recreational? We have an estimate of 530,000 drone sales in 2014—that came out of one of these documents; I found it very hard to believe—and a further estimate of 1.5 million to be sold in subsequent years. Again, I do not know where this information comes from, but it is in one of the publications. Do the Government have any real stats on the availability of this equipment?
Secondly, I am not too worried about commercial operators. They will, generally speaking, keep within the rules and the law—although there is some evidence of the need for some commercial operators to be more knowledgeable, and for some airport operators to be more flexible and understanding about charging and issues of access, in particular regarding the size of restriction zones. My primary concern is the rogue operator, using sub-250-gram UAVs, and large equipment used privately by individuals, whether they are plain stupid in the way they use this equipment, or are drug dealers arranging for the carrying of drugs, crime gangs involved in illicit surveillance, potential terrorists who may wish to deploy weapons even in very small quantities or using small drones, or those who breach personal security where privacy is involved. Mr Geoffrey Hirst, a drone user, told a Commons committee recently that even a proportion of the recreational drone community are reckless, whether intentionally or not. We know that these small, sub-250-gram drones can be dangerous. When a joint test between the Military Aviation Authority and BALPA was recently undertaken, it was concluded that in a mid-air collision significant damage could be caused to a helicopter or aircraft.
I return to Schedule 10—subject to what has happened, but that was beyond my control. Under “Fixed penalties for certain offences relating to unmanned aircraft”, it states:
“The constable may give … a fixed penalty notice in respect of the fixed penalty offence if Condition A and Condition B are met.”
Condition A states that the offence must involve one of the following: endangering another aircraft; causing any harm, harassment, alarm or distress; causing nuisance or annoyance; disturbing public order; or damaging property, all of which the accused could very easily deny. The only one that may be provable could be the undermining of good order in a prison, which is why we have the confiscation provisions in paragraph 2(6) of Schedule 8 to which I have already referred. Nearly all the others can be denied by the accused, and it will be very hard for anyone to prove otherwise. If the police officer gives the offender within the zone the benefit of the doubt, the offender will receive only a fixed penalty notice. Furthermore, if the person is under 18, they will not even receive a fixed penalty notice—effectively, an open invitation for the adult offender to lay responsibility on minors to hide their guilt and avoid the penalty. In other words, “Not me guv, it was the kids that did it”. They will effectively run rings round the drone code, with its hyped registration, responsibility and distance control requirements.
I will speak to my probing Amendment 27 and to Amendment 30. I follow the strong words that we have just heard from the noble Lord.
As I and many others stressed at Second Reading, the risk of a mid-air collision involving an unmanned aircraft being operated illegally is very serious, and the consequences could be catastrophic. It would be remiss not to reflect the seriousness of that danger in the punishment imposed. Indeed, it might also be worth considering whether some form of compulsory third-party insurance should be acquired by all operators of unmanned aircraft.
The misuse of an unmanned aircraft should attract not only a fine, or imprisonment if the misuse were to be catastrophic, but should invariably include the forfeiture of the unmanned aircraft and its associated kit for any misuse that falls outside a single instance of the fixed penalty range of misbehaviour. A deterrent to misuse before flight is of potentially greater value than just a monetary punishment as the result of an airborne offence. Even in the fixed penalty range of misuses, a persistent offender should face the risk of forfeiture, or at least confiscation for a period of time.
It is too early to delve deeply into the secondary legislation that will introduce the fixed penalty arrangements. However, as with fixed penalty notices for car drivers, is it intended that a points system will be set up, so that an individual who repeatedly offends and amasses a number of penalty points within a set time will then face the confiscation of their unmanned aircraft and associated kit, either completely or at least for a period of time as the consequence of their repeated misbehaviour? The deterrent value of such a scheme is well worth considering, although I recognise that the administrative details for it will need careful thought and could even be deemed excessive.
My Lords, I would like to speak in support of Amendment 25, and again, I had hoped to see my name attached to it. I am not sure whether the Committee fully appreciates the sheer scale and numbers that we are dealing with. My judgment, as someone who has been keeping some track of what is happening, is that probably 2 million drones have now been sold and presumably are being flown. I have had the privilege of serving on the Public Accounts Committee with the noble Lord opposite, and on a number of occasions he and I would probe into issues in depth. I therefore say to my noble friend on the Front Bench that the probing which the noble Lord has done should be listened to and assessed very carefully.
Yesterday I went to a briefing on the importation of illegal tobacco. I have never smoked so I have no real personal interest other than ensuring that the revenue that should legitimately go to Her Majesty’s Government does so. There is little doubt that the people behind the illegal importation of tobacco are incredibly creative and show enormous genius, with the result that huge quantities are coming into this country. Allied to that is illegal drug importation, to which the same applies. I have just renewed my shotgun licence. The police are exceedingly careful about the renewal of such licences, not least by those of the older generation, in which I put myself. I am not surprised about that. The police checked thoroughly into where the guns are kept and whether they are properly locked away, and that we had security arrangements to ensure that if someone did break in, alarms would be set off.
We are absolute beginners in this field of activity and its implications. My friend the noble Lord on the Benches opposite is right to say that we are dealing with the rogue element but—as I have demonstrated by giving just two examples in drugs and illegal tobacco importation, and there are others—the rogue element is there in great profusion. Moreover, drones themselves provide a wonderful facility for illegal importation activities. Even if my noble friend on the Front Bench is not able to accept the wording of the amendment, I hope that she will think about it seriously and possibly come back at Report either to accept it or to table it with some minor modifications.
I will say to noble Lords that if we do not take action at this point in time, we will rue the day.
The noble Lord, Lord Naseby, has made an interesting comparison between drug and tobacco smuggling and the action of a drone. The difference is that a drone can do monumental damage, if a rogue operator gets in the way and starts doing things that they should not be doing. I saw an instance of drug smuggling in the Isles of Scilly a few years ago; not only was the boat being used for the smuggling confiscated, but the man who was single-handedly bringing the drugs into the country was so frightened of being caught that, when the yacht was tied up in St Mary’s harbour, he decided that the best way to get away was to climb the mast. He fell to his death on the quay, which was very sad. He was desperate not to get caught, but the boat would have been confiscated, and I cannot see why a drone cannot be confiscated.
My noble friend Lord Campbell-Savours gave some wonderful examples of the numbers involved. The drones should obviously be confiscated, and anyone who wants to get their equipment back should have to apply to a magistrate. The amendment seems very reasonable to me.
Is there any requirement for those who operate drones to ensure that they are fitted with transponders, which can be interrogated by other types of aircraft conducting their operations perfectly legally within the same airspace? Might some mechanism be found to ensure that those who operate drones without transponders are breaching the rules, to which the noble and gallant Lord and the noble Lord, Lord Berkeley, have referred?
This, again, is an aspect of the Bill where there is unanimity across all sides of the House—we are all trying to achieve the same purpose. The question is how best to do so, especially in an environment where technology is moving extremely fast. I am certainly sympathetic to the sentiments expressed by the noble Lord, Lord Campbell-Savours, and other Members of the Committee.
When the Minister comes to reply to this very interesting debate, perhaps she might describe the other sanctions that a rogue operator may be subject to in addition to the fixed penalties outlined in Schedule 10. We are talking about a broad variety of potential consequences, from annoying the neighbours on a sunny summer’s afternoon to deliberately trying to destroy an aircraft containing hundreds of passengers over central London. What sanctions could have faced the operator, or the person in control—to use the phraseology of the noble and gallant Lord—who caused the disruption to Gatwick only a short while ago, and whose extremely irresponsible actions resulted in a high degree of disruption to the whole travel system of the United Kingdom?
It may be more convenient to discuss my second point in a later group of amendments, but there is a real issue around promulgation of the law. Because these devices can be bought over the internet and from shops by people who I suggest may not be familiar with the Air Navigation Order, they are probably not aware of the rules and how dangerous this activity can be and its consequences. I look forward to my noble friend’s response.
My Lords, I am eternally grateful for this thought-provoking debate on confiscation and forfeiture. A number of issues have been raised. I will endeavour to cover as many as I possibly can, but I am aware that a number of noble Lords have made some very thoughtful points, so I will go away and read Hansard to make sure I have covered everything. At times, some very good points that I think we can address were made. At other times, there may have been some slight misconceptions as to the different types of offences and penalties being placed on people.
Does that include all drones—commercial and recreational?
Yes, it includes all unmanned aircraft. Various bulk uploads will come from model aircraft clubs, so we expect that number to climb. Over the course of this Bill, perhaps when we get to Report, I am happy to look for an update on that and to give some indication of where we think more people registering their drones will come from.
Setting out the background to this, the noble Lord, Lord Campbell-Savours, mentioned a number of offences to which he assumed a fixed penalty notice could be attached. I believe they may not be given for those more serious offences to which he referred. Subsequent to this, I hope to be able to set out precisely what will be given to each level of offence, because there is perhaps a little confusion. I will go through my explanation, because there are opportunities for confiscation and forfeiture, which I hope will mean that the noble Lords are content to withdraw their amendments. Let us just see how we go.
Amendment 25 would give the police the power to confiscate an unmanned aircraft if a constable has required it to be grounded. Amendments 27 and 30 would require somebody to forfeit the unmanned aircraft as the penalty for unlawful use. I reiterate that my department has worked closely with the Home Office to ensure that the powers in this Bill are proportionate—that is an important word here—because we do not want to stifle a nascent, growing and potentially very useful drone industry. We do not want to discourage or alienate those who seek to use the unmanned aircraft sector lawfully, because it should be very useful as we go forward. We have also worked with the police, who are confident that they have the powers in this Bill to provide effective enforcement.
The amendment on confiscation, Amendment 25, would provide a potentially disproportionate power to the police, in addition to the existing powers in the Bill for them to require an unmanned aircraft, rather than an unmanned vehicle, to be grounded.
Why should a drone that goes into one of these restricted zones, which could potentially cause huge damage, not be confiscated?
If the noble Lord will bear with me, that drone would probably be confiscated by a constable for a different reason.
In our opinion, the amendment on forfeiture would also provide a potentially disproportionate penalty for those who commit most likely very minor offences of failing to ground an unmanned aircraft when asked to do so by police, or failing to comply with a constable’s request to inspect that small unmanned aircraft. While we feel that it would be disproportionate to insert these powers of confiscation and forfeiture regarding these two offences, it should be noted that the police have powers of confiscation elsewhere in the Bill and already in law.
Under the Bill, the police will have the power to stop and search a person or vehicle where they have reasonable grounds to suspect they will find an unmanned aircraft that is or has been involved in the commission of one of the offences specified in paragraph 2 of Schedule 8. This is for more serious offences, such as interfering with aircraft. This stop and search power gives the police constable the power to seize anything they discover in the course of a search if they have reasonable grounds to believe it is evidence relating to one of those offences.
The summary of all the stop and search offences was given out at the all-Peers meeting and I am very happy to send round this ready reckoner, which shows which offences fall under stop and search if there is suspicion of them. They are, for example, flying above 400 feet or within an exclusion zone of an airport. If there was a stop and search in that case, that item could be seized as evidence. Similarly, when entering and searching a premises under warrant using the powers in the Bill, a constable might seize an unmanned aircraft or any article associated with it if they have reason to believe it has been involved in the commission of one of the offences set out in paragraph 7 of Schedule 8.
The noble Baroness said the constable has the power to seize, but has he powers to retain and make forfeit, or would it be just a temporary seizure until such time as the courts had dealt with the circumstances? The point of my amendment, and I believe that of the noble Lord, Lord Campbell-Savours, is that of a deterrent for illegal use. Seizure or forfeiture would be a very good deterrent. As we mentioned earlier, we are dealing not with people who are behaving and who we are trying to encourage to grow their legal use of drones, but with people who might be or are operating them illegally. Those are the people I want to deter.
The noble and gallant Lord makes a very interesting and valid point about deterrence, which is probably quite separate from the line I sought to convince him of. Noble Lords have mentioned that a very good drone might cost, say, £500, but the penalties we are talking about for some of the offences that could have been committed are fines up to a maximum of £2,500.
If, indeed, they are paid, which I will come on to—perhaps in the letter—because there are some very significant deterrents. If we are after a deterrent, we have those deterrents. Do we feel it is proportionate for property to be forfeited for fairly minor contraventions? We do not.
I am sorry to interrupt again, but on a minor thing: as I said in my opening remarks, a single misbehaviour under what would be a fixed penalty notice would not be a cause for forfeiture, but repeated misbehaviour that might individually be at the level of the fixed penalty notice should be taken into account. That is why I suggest that, under those circumstances, forfeiture, at least for a period if not completely, should be part of that penalty.
The noble Lord makes an interesting point. I suspect that in those circumstances, the person would just go out and buy another drone. We are between a rock and a hard place: drones are not so expensive that forfeiture is a huge issue, versus a fixed penalty notice, which may also be significant. We do not feel that forfeiture would make a significant difference to the deterrent effect. The penalties already in place are good ones. However, for the sake of completeness, I will mention that under current law, if a person has refused to ground their unmanned aircraft and has been arrested for an offence, the police officer has the power, under Section 32 of the Police and Criminal Evidence Act 1984, to search the arrested person and to seize anything that is evidence.
I understand where my noble friend is coming from, but what she perhaps does not fully comprehend is that to those of us who have been involved in this industry for years, this is a highly dangerous area—far worse than motorbikes. The Government have the opportunity to lay down clearly that anybody who transgresses will be hit hard. This does not affect the genuine operators, who will take great care. However, quite frankly, listening to my noble friend, I can see this being abused. I see drones every weekend where I live. Half of them, perhaps, are being flown correctly, but a significant proportion are not, hence the two in my shed that crashed in the last six months.
My noble friend makes a very interesting statement. This Government recognise that in certain circumstances, when drones are not being flown correctly, it is literally a life and death situation. This is why the penalty for the most significant offences—recklessly or negligently acting in a manner likely to endanger an aircraft or any person in an aircraft—is an unlimited fine or up to five years in prison.
My noble friend suggested that only half of those drones are flying within the rules. That is why we have introduced the competency and registration system. People are taking the competency test. If the Bill is passed and the police have the powers, they will be able to stand in my noble friend’s garden, identify those who may not be operating within the law and do something about it. Without the Bill, they could not.
I am aware that my noble friend supports the Bill, and I appreciate his support. The Government are just saying “proportionality”. The Government’s role is not to come down hard across the entire sector, but to be proportionate. Those guilty of a minor contravention will get a fixed penalty notice; for something more serious, it is up to five years’ imprisonment and an unlimited fine.
Turning to a couple of points I have not covered, my noble friend Lord Glenarthur made an important point about electronic conspicuity, or remote ID. This is being introduced into drones. Although it is not ubiquitous at the moment, electronic conspicuity for all aircraft was consulted on in the Aviation 2050 consultation. We will be looking at how we take that forward but, as part of EU retained law, the EU-delegated Act is already within domestic law. It contains remote identification requirements. This delegated Act came into force on 1 July 2019. We are currently in a transition period; within three years, electronic conspicuity and remote ID will be a requirement for all drones.
The Minister referred to consultation. Could she refresh my memory as to when that consultation took place, when it was completed and when the results were published?
I am afraid I do not have that information to hand. I would be remiss if I tried to remember, so I will write to the noble Baroness. I think that was a consultation for all aircraft. She will be aware that the Government are looking at general aviation and, as we move forward, the interplay between unmanned and manned aircraft in a unified traffic management system. That is some way off but we have to start thinking about it now. The electronic conspicuity of drones comes from EU regulation and is now in domestic law. We are in the three-year period during which all drones will have to have conspicuity.
My noble friend Lord Goschen mentioned other penalties and I hope I have given him some idea of their level. I will send this note around because it is useful in setting out exactly what happens if you contravene certain of the regulations.
As for getting people to understand what is required of them, we work with the retailers and the manufacturers—the CAA has the drone code—to make sure that we get the message out as much as possible. This is particularly important around Christmas, when there is a great deal of activity, so that when people get a drone—are given one or buy one—they know that it is not a responsibility-free activity and exactly what their rights and responsibilities are.
I feel a letter coming on on this one. There is quite a bit to cover about proportionality, deterrence and the different levels of penalty for different offences.
I am pleased that the noble Baroness will write a letter. It might be a long one, but that is good. In this debate we have swung between saying, “Most people are just doing it in the garden. They might have the drone under their bed. If it goes up, it does not fly far, it is not going high and it won’t hurt anyone much,” to the other extreme when it could bring down an aeroplane or worse. My noble friend and others commented on the number of drones that may be flying and wondered how many will be flying illegally—in other words, without notification, without a licence or whatever. The question of proportionality is therefore quite serious; for some offences confiscation may be too strong a penalty and for others nothing like enough. In her letter, will the Minister give us some idea of how many constables or whatever we are to call them—the enforcement agency—will be trained to do this work and how many offences might they have to follow up each year? I have not a clue. You can think of every policeman in the country being able to do this—which is stupid—or of it all being done centrally. However, it would be good to have some idea of how enforcement might take place so that people like me, who have no great experience of this, can compare it to what happens on the roads or anywhere else. I will be glad to hear the Minister’s comments.
I thank the noble Lord for that intervention. I hope he will be able to stick around until we get on to later amendments dealing with police resourcing and how the training will work.
Let me go back to first principles. The Bill is about giving the police the powers they need to put in place the penalties that already exist. It is very much about filling in that gap. We are working closely with the police and this is what they have asked us to do to give them the powers to clamp down on illegal drone use. The situation is in flux as people register but, for people who have not registered and are flying illegally, the police now have these powers. Without the Bill, they would not have the powers. With that, I hope the noble Lord will feel able to withdraw his amendment.
My Lords, will my noble friend please include me in the list of addressees for the important letter she is going to write?
I shall say just one or two words. The Minister has offered to write us a letter. It is not a letter we want. We want it in the law. The letter will interpret the law in a way that she believes will satisfy the concerns we have expressed. I am worried about the guy out there with a drone. He is not going to read the law. He wants very simple principles established that he can understand. In the light of the interpretation that the Minister has put on the law during the interventions, I do not understand the law, and the other day I spent more than an hour going through these clauses to try to work out what was applicable in what circumstances. I put it to the Minister that the law is badly drafted. I have never said that in this House before. It is badly drafted, and we need far greater clarity in the clauses that Parliament is required to clear.
I predict that in the Commons, when MPs with airports in their constituency get their hands on the Bill, they will rubbish this clause because they will be dissatisfied with the provisions as explained to us. I say to the civil servants now that they should think in advance, before the Bill gets to the Commons, about how they will deal with the objections that will inevitably arise.
The Minister says that the role of government is to be proportionate. I agree. However, a small drone of 250 grams within a restricted zone can bring down a jumbo jet, with hundreds of lives lost. I think I am being proportionate and the Government are not in not understanding that that is the danger we are considering. The Minister has laid words on the record today that, in the event of a disaster, people will pore over and wonder what the hell she was talking about. I shall no doubt come back to this on Report, but I beg leave to withdraw the amendment.
My Lords, there are three matters in this group: Amendments 28 and 29 and whether Schedule 8 should stand part of the Bill. I shall address Amendment 28. Paragraph 2 of Schedule 8 sets out the powers of a constable to stop and search persons or vehicles and includes the conditions that have to be met in order to do so. This amendment would require the Secretary of State to publish details of the demographics of those who are stopped and searched. The purpose of the amendment is to find out what the Government intend in this regard.
The amendment refers to the Equality Act 2010 and the nine protected characteristics: age; disability; gender reassignment; marriage and civil partnership; pregnancy and maternity; race; religion or belief; sex; and sexual orientation. At Second Reading the Government said that stop and search demographics would be available for those subject to a stop and search under these powers, and that
“they will be published by the Home Office in the usual way.”—[Official Report, 27/1/20; col. 1295.]
What does “published by the Home Office in the usual way” mean in relation to this amendment and the nine protected characteristics under the Equality Act 2010?
How did the Government come to the decision to enact these stop and search powers under Schedule 8? In autumn 2018, the Home Office ran a public consultation on
“Stop and Search: Extending police powers to cover offences relating to unmanned aircraft … laser pointers and corrosive substances.”
The Government indicated in the Explanatory Notes to the Bill, if I have read them correctly, that responses to the consultation were broadly unsupportive of proposals relating to unmanned aircraft, with many respondents feeling that the intrusive nature of stop and search powers would be disproportionate to the likely threat.
Since that consultation, we have had the incident at Gatwick Airport at the end of 2018. Following that incident, in response to the consultation, the Home Office committed itself to developing a stop and search power for offences related to flying an unmanned aircraft in the flight restriction zone of a protected aerodrome. The Home Office also indicated its intention to keep the further expansion of stop and search powers in relation to other unmanned aircraft offences under review.
The Bill now provides the police with the power to stop and search any person or vehicle, subject to certain conditions. At Second Reading, in response to the point that the Home Office consultation was completed before the Gatwick incident, the Minister said:
“I reassure noble Lords that we have of course been in touch with members of the police force around Gatwick and, indeed, all over the country to make sure they are content with the powers in the Bill. We believe that they are. We have a close relationship with them, so they have been involved since Gatwick in making sure these powers are appropriate. Of course, we still meet with the police and other stakeholders to discuss these matters in general.”—[Official Report, 27/1/20; cols. 1291-92.]
Bearing in mind that, in the public consultation prior to Gatwick, responses were broadly unsupportive of proposals on stop and search powers in relation to unmanned aircraft, were there any meetings or other forms of contact with those who had been broadly unsupportive of the proposals, to check whether their views had changed since the Gatwick Airport incident? Did the Government make an assumption that views would have changed, or did they not intend anyway to take any notice of the broadly unsupportive responses to the stop and search proposals, so that it did not really matter whether views had changed as a result of the Gatwick incident? A government response on this would be helpful.
The second item in this group relates to Schedule 8 standing part. I want to talk about paragraph 5(11) of Schedule 8, which inserts a power at new subsection (4B) into Section 93 of the Police Act 1997. This enables the Secretary of State, by regulations, to add or remove an offence from the definition of “relevant offence” set out in subsection (4A), also inserted by this Bill. Paragraph 5 of Schedule 8 deals with
“authorisations to interfere with property”
or interference with wireless telegraphy. This is a Henry VIII power. In their memorandum to the Delegated Powers and Regulatory Reform Committee, the Government said that it was necessary to ensure that the list of relevant offences remained up to date
“if the evolution of technology results in unmanned aircraft being used in new or different types of offence.”
I note that they used the word “if”, not “as”, in relation to the evolution of technology; clearly the Government do not actually know whether they will need this power to add, by regulations, additional or even completely new offences.
In the same memorandum, the Government say:
“The power to interfere with property or wireless telegraphy is a significant power which entails the possibility of interferences with, for example, people’s property rights.”
Further on, the Government refer to
“any expansion of the power to interfere with property and wireless telegraphy”.
Yet the Government want to have this “significant power” and this “expansion of the power to interfere” with “people’s property rights” by adding additional new offences that they do not know they will need and appear unable to describe, and to do so not by primary legislation but by regulations that cannot be amended.
My Lords, I support the noble Lord’s comments, particularly in relation to Amendments 28 and 29. Our experience of the use of stop and search powers over the years has revealed that the police have to perform a very careful balancing act in their use of those powers. The idea of ensuring that they are looked at carefully after a period of time would therefore certainly assist in avoiding the misuse of powers.
This is particularly complex because the leisure use of drones is about a lot more than a group of people standing in a field and having a little fun. There are a lot of brilliant commercial uses of drones, along with some very important uses by the military and in our emergency services generally. But there is a complex, unofficial use of drones nowadays and it is not all innocent fun. They are widely used in the drugs trade. It is therefore important that the use of stop and search powers is exercised with a view to looking at potential criminality, beyond whether a drone is being used in the wrong place or flown too high and so on. However, that has to be done proportionately and carefully. Our experience over many years in this country is that there is nothing quite like a little transparency in the way in which a power is exercised, to ensure that it is done properly and fairly.
I support Amendment 29, too, because of the obvious fact that the Prison Service is greatly overstretched. It can be argued logically that if you used these resources to control the misuse of drugs in prisons, you would actually make the life of the Prison Service rather easier. Unfortunately, when a service of any kind—we have had this all the time with the NHS—is as badly stretched as the Prison Service, it has a hand-to-mouth existence. It is very important that the impact of this additional responsibility is looked at carefully in the months following the introduction of these powers.
We will investigate a lot of other issues in debating the next group of amendments, which emphasise the complexity of the situation now with drones. However, the two amendments in this group draw out two important threads.
I thank the noble Lord, Lord Rosser, for introducing this group of amendments, which gives us the opportunity to discuss the stop and search powers and the resourcing of police, and to dip our first toe in the water on delegated powers.
We recognise that stop and search is a significant power and that it is essential that we use it appropriately and proportionately. The noble Lord, Lord Rosser, rightly recognised that the consultation on the use of stop and search for drones reported before Gatwick. Therefore, the powers in this Bill were included as a result of a significant amount of consultation after Gatwick to make sure that we got it right. Since that consultation concluded, officials have had various meetings with stakeholders to discuss the consultation response both within and outside government. Those consulted include the Ministry of Defence, the Ministry of Justice and BEIS, as well as the National Police Chiefs’ Council and CT Policing. The Department for Transport has also met groups such as BALPA and the Guild of Air Traffic Control Officers, who in general support the police powers proposed in the Bill.
It is important that the powers be used only where proportionate, so there are a number of limits in the Bill. In the first instance, a constable must have reasonable grounds for suspecting that they will find an unmanned aircraft or something associated with an unmanned aircraft, such as a controller, and that the unmanned aircraft or article has been involved in the commission of one of the offences specified in the Bill. I shall send the schedule to noble Lords.
The Minister referred to BALPA. Is she saying that BALPA has expressed no reservations whatever about the police powers?
I am not aware that BALPA has any reservations about the stop and search powers under discussion.
I am afraid that I cannot recall exactly what BALPA’s reservations are—whether it has reservations about other police powers—but it was certainly one of the stakeholders that we spoke to regarding stop and search. As a consequence of the conversations that we had, we believe that introducing the powers in this Bill is proportionate and appropriate.
The more serious offences that could be liable to stop and search go towards the higher end of the penalty range and might involve transferring articles into or out of prisons et cetera. The Bill also sets out further conditions that need to be met. For offences that could be considered less serious, the conditions are more stringent. For example, in relation to Article 95 of the Air Navigation Order, flying a small unmanned surveillance aircraft too close to people, or Article 239(4), flying within a prohibited area, where it is more likely that somebody has committed an offence unintentionally—which again goes back to proportionality—stop and search can be used only where there are reasonable grounds to suspect that the commission of an offence using an unmanned aircraft or associated article was for one of the following purposes: endangering an aircraft, which I think noble Lords would all agree should be top of the list; causing any person harm, harassment, alarm or distress; undermining security, good order or discipline in any prison or institution where people are lawfully detained; damaging property; or threatening national security. So there are many offences where stop and search does not apply—for example, Article 94, including flying beyond visual line of sight without permission and flying commercially without permission. Here, stop and search would not be applicable.
We also recognise that it is very important to minimise the potential for discrimination in the exercise of police powers. In addition to the limitations written into the Bill, the conduct and recording of the Bill powers will be subject to Sections 2 and 3 of the Police and Criminal Evidence Act 1984 (PACE), for which there is already guidance for police in Code A, the code of practice for police in the exercise of statutory powers of stop and search. Code A will apply to the Bill powers to ensure that they are being exercised appropriately.
I thank the Minister for her response to the amendments on which I and others have spoken. I will of course withdraw my amendment, but am not entirely convinced on her point about police resources. I asked some fairly specific questions about the percentage of police officers who would be required to have the training; I still do not know whether it is envisaged that all police officers will have this knowledge or whether it will be a much smaller grouping. I also asked about the tactical advisers; I suspect on that one it will be a case of waiting to see what happens—whether the Government’s view of the extent to which it will involve an additional responsibility or duty on the police materialises or whether it will prove somewhat greater than the Government anticipated. But for now I beg leave to withdraw my amendment.
I have written down an item about Schedule 10 and I will speak in particular to paragraph 6 of Schedule 10, which allows for supplementary provision to be made by regulations with respect to fixed penalty notices, including to the extent of amending or repealing provisions by an Act of Parliament. Paragraph 6(1)(b) of Schedule 10 also states that the Secretary of State may by regulations make
“provision about the consequences of providing false statements in connection with fixed penalty notices, including provision creating criminal offences.”
In their memorandum to the Delegated Powers and Regulatory Reform Committee, the Government’s justification for this power to create criminal offences through regulations, which cannot be subject to amendment, appears to be at least in part that there is a precedent in Section 54 of the Space Industry Act 2018. That Act was in essence a skeleton Act, which the Government told us was needed on the statute book before it could be properly fleshed out—hence so much being left to subsequent regulations—to provide assurance or comfort to the emerging UK space industry that the Government were prepared to give it the legislative backing and certainty it required. I suggest that the same consideration hardly applies here in relation to fixed penalty notices and the creation of criminal offences.
The Government say that the powers in paragraph 6(1)(b) to create criminal offences are needed to ensure that provision can be made for the consequences of providing false statements in connection with fixed penalty notices. But what kind of criminal offences are we talking about which are apparently so unique that the Government cannot formulate them now and put them in the Bill? Alternatively, since the Government refer only to the
“possibility of creating criminal offences in relation to false statements,”
why not first determine what those new criminal offences are that need creating and then include them in the next suitable Bill, where they can be fully debated and amended?
The Government clearly regard this Henry VIII power to be of some significance, since they state in their memorandum to the DPRR Committee that
“the regulations may create criminal offences and make provision about the process around appeals, and there is therefore the potential for significant impact to the public, police and judicial system.”
However, despite that potential for significant impact, the Government think it appropriate to use Henry VIII powers and regulations rather than primary legislation, which is invariably more fully debated and which, unlike regulations, can be amended. So can the Government give a somewhat fuller explanation of why having the powers to which I have referred in Schedule 10 is so crucial and, in their view, unavoidable, as opposed to them being powers, frankly, of administrative convenience to the Government?
I thank the noble Lord, Lord Rosser, for introducing a specific part of Schedule 10: notably, paragraph 6, which gives the Secretary of State the power to make regulations for the provisions about fixed penalty notices, the form of and the information included, and the consequences of providing false statements in connection with fixed penalty notices, including the provision of creating criminal offences, as the noble Lord noted. It is important to note that within all this there is the affirmative resolution, and the consequences need to be proportionate and appropriate to the fixed penalty notices themselves. So proportionality will certainly come into this.
Should the regulations be used in future, the key consideration will be whether they are proportionate. The noble Lord mentioned that the consequences could be put in other legislation, but there may be no other suitable legislation coming down the track. As he noted, there is precedent for making regulations in the manner set out in the Bill. This would be a perfectly reasonable way to provide the flexibility that the Government need in this area as the entire sector develops. We need that flexibility not only for the information required in fixed penalty notices but also for the consequences of providing false statements in relation to them. That is why we have taken this power in the Bill.
I hope that, with that explanation, the noble Lord will feel able not to oppose the schedule.
My Lords, when I became Lords Transport spokesman in 2015, the first major piece of work I participated in related to drones. Work had already been done on that by one of the European Union sub-committees and a good report published. Then, and ever since, I have urged the Government to grasp this issue. Despite many opportunities, they have refused stubbornly to do so. They have refused to be hurried. Above all, they have refused to look ahead at rapidly developing technology.
Since 2015, a range of Ministers has been sitting opposite us answering on transport issues, but from one after another we have heard the phrase, “We lead the world in drone technology.” They have failed to grasp that if you are going to lead the world in the technology, you need to lead it in its regulation too. In preparation for today, I looked again at briefings we had a couple of years ago on legislation on drones. Then, a couple of weeks ago, we received a briefing from DJI, a leading drone manufacturer, which specified what its drones can now do. I compared that with what we were told drones could do a couple of years ago. In that short period of time, there has been a leap in technological capability. Here we have a Bill to update the law, yet the government response is limited to falling back on a few long-established police powers.
I cannot emphasise enough that that is a huge missed legislative opportunity. The Government should be looking at what drones can do now and indeed be anticipating what they will be able to do in a few months’ time, not even in a few years’ time, because it takes that long to get legislation on to the statute book and in that time there will be another step forward in drone technology. I argue that we owe it to pilots and passengers, whose safety is at risk. We owe it to airport operators who, at great cost, have to deal with the threats from drones, and we owe it to drone manufacturers and users to provide the framework for safe drone usage. I take issue with what the Minister said earlier about being proportionate, not overreacting and so on. Rather, drones need a good reputation. To achieve that, they need a good, modern and strong legal framework, which this Bill does not provide. Nothing could be worse for the drone manufacturing industry and for our technological base in it than to suffer disasters associated with drones which happened as a result of the fact that we have inadequate legislation.
Modern, adequate legislation does not have to be draconian, it just has to look at the ways in which drones operate and to take them into account. Amendment 31 is designed to open up the discussion and to encourage the Minister to go back to her department and press for firm measures to be incorporated in the Bill on Report. We are asking for a review, which is the very least that is needed. I would prefer some action now. I would like a much tighter legal framework, but to help the Minister I have specified some of the key issues that those in the industry— whether BALPA representing pilots, those in the drone manufacturing industry or those in the aviation industry—believe need to be addressed urgently.
For example, a recent opinion poll showed that 60% of people are concerned about the privacy implications of drones. Earlier, the noble Viscount referred to the issue of drones being flown over gardens, and there are other issues associated with drones being used to spy on neighbours in a very unpleasant manner. Is the current legislation comprehensive enough to deal with the invasion of privacy implications of drone use? I doubt it.
The issue of the minimum age also needs to be addressed. In the wrong hands, a drone can bring down a plane, so it is only sensible to set a minimum age for flying them. They are not children’s toys, although they are often bought as such by badly informed parents. Last Christmas I noticed that one or two retailers stated that they were ceasing to sell drones because they realised the level of responsibility that goes with them.
The technology now exists for the remote identification of drones, something the Minister referred to earlier. Setting aside the question of which drones it should cover—some would go further than others—all should now have remote identification, and it is reasonable to expect that it should always be switched on. It was explained to me that it should work like registering a car. I am registered as a driver and my car is registered as my property. If I drive badly, the police can take note of that, take the number plate, trace the car to me and rightly approach me to ask whether I was driving that car on that day and, if not, who was.
The same principle should apply with drones. Remote identification is an inexpensive way for the police and airport authorities to monitor drone usage. If a drone is flying too low or too close to an airfield and it has remote ID, the authorities can identify who owns it, find the owner and stop it flying there. If the drone’s ID is switched off, they know immediately that the incident is much more serious. They know that it is not a case of a youngster, or even a middle-aged person, behaving carelessly, but someone is deliberately intending to avoid being caught, leading to a potentially serious incident.
It should be an offence to switch off the remote identification of drones. There must of course be exceptions, which should be allowed as part of a regular process by the CAA. There are organisations and people who have very good reasons not to obey this identification process. Obviously, it should also be an offence to modify or to weaponise—that is, to arm—a drone. I do not know whether the current legislation would cover that. It was put to me that it would not.
Geofencing also needs to be widely rolled out. That would involve updating drone software regularly. It could be done with the annual registration process, just as with an electric or an automated vehicle in years to come, when software will need to be regularly updated. It also needs to be done for drones.
I have been talking about airports but all of this applies to prison authorities as well. If it were to be applied to drones through legislation at this time, it would help prison authorities considerably, as well as assisting in the safety of airports. I beg to move.
My Lords, the noble Baroness’s words were tempting in some ways because ever since drones first appeared, we have been way behind the game in dealing with their potential dangers. They should never have been made available for the general public to buy and should have required a licence from the very beginning. All those things should have been done early on. So there is a temptation to support the noble Baroness in what she said, but when you think about it a little more, you realise that if we legislated in the way she asks for we would almost certainly be behind the game again. It is better that we leave things as they are drafted in the Bill so that we can take action much more easily in those circumstances when we see what is happening.
We cannot go back and undo those mistakes made at the very beginning because most people thought that they were toys. I remember that I in particular warned against the dangers of them being used, for example, as weapons launched at this building from a boat going up or down the river outside this place on to the Terraces, where people sit outside. That danger is still there. We need, all the time, to make sure that our powers are as flexible as possible—in the Bill now, I believe that they are—to deal with those threats as they arise.
My Lords, I only wish that the noble Lord, Lord Tebbit, had been here during earlier proceedings on the Bill because we dealt with the issues that he referred to.
I wonder whether Ministers have considered the 22nd report of the Commons Select Committee, entitled Commercial and Recreational Drone Use in the UK, and its recommendation. I want to read that recommendation out because it is at the heart of the amendment moved by the noble Baroness from the Liberal Democrat Benches. The committee said that it is
“concerned that there are differing accounts within the aviation community about the likely severity of damage of a drone collision with an airplane. Furthermore, there are differing accounts of the number of near misses and the reliability of airprox reports has been disputed. The Committee is concerned that there is no agreed position on the likely consequences of a drone-airplane impact. The Government should complete a substantive risk assessment”—
exactly what the noble Baroness said—
“by the end of 2020.”
That is the end of this year. The report went on:
“If it is not possible to publish the result of this assessment due to security concerns, the Government must provide this Committee with evidential assurances that this work has been done.”
Well, it has not been done. The Select Committee recommendation has been ignored.
To go back further in the committee’s evidence, the CAA said that
“It is considered unlikely that a small drone would cause significant damage to a modern turbo-fan jet engine”.
I am sure that the noble Lord, Lord Tebbit, will be interested in what the report then states because he was a BALPA airline pilot, if I recall correctly:
“Captain Tim Pottage, representing BALPA, voiced caution about the CAA’s position. Captain Pottage said that he was … ‘Concerned that the CAA had that view. There has been no testing of a drone against a large commercial high bypass jet engine—none at all. Anecdotal evidence suggests that it would cause a catastrophic failure, causing a blade to shed and not to be contained within the engine cell.’”
That is what is worrying us in the House. We will have a lot of people telling us not to worry about it and that it will not happen, but if it does happen, who will be held to blame? I believe that it will be this Government.
My Lords, the House should thank the noble Baroness, Lady Randerson, for introducing her amendment and enabling a discussion about, essentially, attempting to future-proof this legislation, which is extremely difficult to do.
I am afraid that I follow my noble friend Lord Tebbit’s analysis of the situation. We have to draw the line somewhere. It is important to move ahead with the legislation more or less as drafted—that is, as it appears before the Committee. It is difficult to legislate for future technical solutions, such as geofencing and reliable, low-cost, low-weight but high-power transponders that would have to be developed to be included in every single drone. Lightweight transponders exist at the moment—light enough to be put into gliders, for example—but they have relatively high power requirements. There is also the requirement for them to have very high integrity. If these drones are carrying a transponder and giving false information because the transponder costs £5, for example, air traffic control could be disrupted perhaps worse than by the original offence relating to where the device is being flown.
While I welcome the debate that the noble Baroness has facilitated through her amendment, I am sympathetic with my noble friend the Minister in trying to produce legislation that, as far as technologically we can, tackles the situation as it prevails at the moment while attempting to future-proof—often through the use of Henry VIII powers, which was the subject of the previous debate on Schedule 10. We need that flexibility. Some compromise is required to achieve that, and I suggest that that compromise is the use of delegated powers. It seems entirely clear that we will have to revisit this in the not too distant future, even after this Bill becomes law.
My Lords, I too am most grateful to the noble Baroness for introducing this amendment. Even taking on board the reservations that two of my noble friends on this side have expressed, proposed new subsection (1)—a review every six months—certainly ought to be incorporated somewhere in this Bill. I do not know whether this is the right place, but that is for the Minister—not to respond to tonight, but certainly to take on board and come back to us on Report.
I see absolutely nothing wrong in having a minimum age. For heaven’s sake, it was done for motorcycles and other vehicles on the highway, and this is no different—it just happens to be in the air—so it seems absolutely right to have a minimum age.
I have worked with my noble friend on the Opposition Benches on many things. Having flown light aircraft in Pakistan and Canada and in the Royal Air Force, I am deeply worried that something will happen. I see a responsibility to say to my noble friend on the Front Bench, who I do not think has had the privilege of doing either of those things, that we need to forestall a potentially huge accident. I very much hope that the department takes that on board in this legislation.
My Lords, I am not without sympathy for the thoughts behind the amendment proposed by the noble Baroness, but there are some important complications, which were referred to by my noble friends Lord Tebbit and Lord Goschen. For example, electronic identification for each and every drone would be a considerable undertaking. It may in the end prove necessary, but it is not straightforward.
My Lords, unnecessary conflict has developed in this debate. I declare that I am the vice-president of BALPA, whose position, broadly speaking, is to support this Bill as far as it goes and strengthen it where we can, but also to recognise that there will be subsequent information and knowledge, and that regulation will be required as the impact of the technology changes. The noble Baroness’s amendment—building into this legislation the fact that we continuously review the specifics that she outlines, but also any other changes in technology—is the most sensible way to do it. We are not going to complete in the next few days a Bill that will last very long in its totality.
The noble Baroness, Lady Randerson, with whom I sat on the same committee, knows that five years ago the technology was very different. Some of the concerns were the same; some have been overcome. Hopefully, we can develop a situation in which we have a continuous review, but the request that that should be built into this Bill does not seem to me unreasonable. For the reasons that the noble Lord, Lord Naseby, and my noble friend Lord Campbell-Savours spelled out, and as I spoke about at Second Reading, we already know about the lack of testing on the effect of drones going into jet engines. We need that testing before we can effectively legislate. It is a potentially serious issue. We need a next stage built into the legislation. If the noble Baroness’s amendment is not accepted in total, I hope that its spirit will be taken on board by the Government.
My Amendment 35 in this group is on much the same theme as the amendment moved by the noble Baroness, Lady Randerson, except that it calls for the Secretary of State to,
“prepare a strategy for reviewing legislation relating to unmanned aircraft.”
At Second Reading, my noble friend Lord Tunnicliffe, referring to the rate at which technology surrounding drones has developed, said:
“It is possible that this legislation already falls behind recent developments. It seems to ignore the dangers that could arise from drones that fly beyond lines of sight. Ultimately, this legislation must be prepared to deal with the drone technology of the future”.—[Official Report, 27/1/20; col. 1270.]
My noble friend Lord Whitty referred at Second Reading to the Select Committee report from 2015 on drones—or, as I think they were known then, remotely piloted aircraft systems—and said that a range of issues raised in the report had “not been fully addressed” and were not really addressed in the Bill. Some related to the safety of other users in the air and on the ground, but there were also issues of insurance, licensing, privacy and liability and the question of how far the multiple operation of drones by one programme and one operator is compatible with our current regulations. He also spoke about changes in the air traffic control regulations to ensure adequate separation; strengthening the enforcement and checking system; removing built-in safety features from drones; the deliberate weaponisation of drones; and licensing of individual machines. The Airport Operators Association has called for mandatory geofencing software in drones and the mandatory identification of drones to help airports to identify genuine threats to safety.
I am sure that the Government recognise the need to keep reviewing legislation relating to unmanned aircraft. The incident at Gatwick Airport in December 2018, other incidents and the subsequent emergence of the Bill suggest that somebody had not kept their eye fully on the ball in ensuring that legislation continues to reflect current realities and technological developments. It is not unreasonable to suggest that a strategy should be drawn up for reviewing legislation to ensure that that does not happen again. At Second Reading, the Minister, speaking for the Government, said:
“Of course, the world of drones and airspace change never stops, so we will continue to review the legislation to ensure it remains fit for purpose, particularly for drones.”—[Official Report, 27/1/20; col. 1292.]
As I said, I am not sure that that has been the case in the light of the Gatwick incident in the sense of updating the legislation in time.
Will the Government’s strategy for reviewing legislation relating to unmanned aircraft be conducted in a piecemeal manner, responding to problems and issues as they come to light, or will we have a comprehensive review of all aspects of legislation relating to unmanned aircraft, as some have called for? The Airport Operators Association says in its briefing—which I am sure a number of noble Lords will have received—on Part 3 of the Bill on unmanned aircraft: “We are, in addition, disappointed that the Government have not taken the opportunity to include other elements called for by the majority of the industry and achieve one comprehensive piece of legislation on drone safety and usage.”
The piecemeal approach would appear to be in vogue at the moment. Even with this Bill, the Government have taken the line—and it has been repeated today—that this is about police enforcement powers and that, in their view, it is inappropriate to use this Bill for further unmanned aircraft regulation. There are also the Henry VIII powers in the Bill, which we have discussed. They provide for the creation of new offences by the Secretary of State, by regulation on an ad hoc basis. That again suggests a piecemeal approach by the Government to their continuous review of the legislation on unmanned aircraft to ensure that it remains fit for purpose. If legislation affecting unmanned aircraft is reviewed on a piecemeal basis, then when a problem or deficiency is exposed, we risk the equivalent of a second Gatwick incident.
This amendment calls for a strategy for reviewing legislation relating to unmanned aircraft—a strategy which, based on the evidence, frankly, is needed—and for that strategy to be prepared by the Secretary of State. I await the Government’s response.
My Lords, I thank all noble Lords who have contributed to what has been a very interesting debate. It has been more wide-ranging than I anticipated.
The Government are listening to everybody contributing to this debate—within this Chamber and beyond—about what they should be doing. Something needs to be done but, standing here now, I can absolutely say that there is no silver bullet and no single solution. We cannot legislate our way out of the issue facing us unless we completely ban drones. There was mention that perhaps we should have had a registration system at the outset, but we have had model aircraft for years. They have never had anything of the sort, and they too have been involved in incursions over airports. We cannot lull ourselves into a false sense of security. We cannot say that the Government are not doing enough, that something must be done and that this is all so terrible, because what in this Bill would have prevented Gatwick, for example?
Potentially, a transponder, but we knew where the drones were. We could see them flashing above the runway. What could we do about it? All the legislation in the world could not have done anything about that. It comes down to technology, and the work that we are doing with the CPNI to develop the counter-UAV technology. That is what we need to spend money on, and we intend to. The legislation before us is a series of things that have already been put in place under the air navigation order. The noble Baroness may criticise the approach as piecemeal, but essentially, it is keeping up with technology.
Does the Minister accept that Gatwick was an outlier in a range of events, and that it would have been caught by noticing that, “They’ve switched off their electronic ID, so we have a real problem here”? That would not have caught the drones but it would have alerted the authorities. Does she accept that most of these potentially dangerous incursions are accidental or careless, and that having some form of compulsory electronic ID would enable the authorities to act quickly and easily? We are not talking about new technology that is way over the horizon. It is here now.
The noble Baroness makes a couple of very interesting points, including that in many cases, people do not intend to commit these offences and if given a slap on the wrist and a fixed-penalty notice, they probably would not do it again. When the noble Baroness asked if I wanted to make an intervention, I was listening intently because I want to hear ideas about what we should be doing that we have not done already, and where the deficiencies are.
Let me address some of the ideas of noble Lords; others we will take away and look at further. My noble friend Lord Naseby said that there must be a minimum age. There is a minimum age: you must be over 18 to operate a drone. You must also pass a competency test to be a remote pilot, but the operator of the drone is the person responsible. I think we can agree that the minimum age issue has been dealt with.
On remote ID and electronic conspicuity, the delegated Act is in UK law. The noble Baroness suggested demanding that every drone has electronic conspicuity. We do not want to favour one drone manufacturer over another. We want to ensure that the technology we receive can develop naturally. It was agreed among EU members that a three-year transition period would be appropriate, but electronic conspicuity is in British law. It will be coming over the transition period, as we agreed with our colleagues in the EU.
The noble Baroness also asked why the process is not like car registration. It already is. One must register a drone, and it has a number on it, like a car number plate. So we already have registration and competency testing; these things are already part of UK law. I am therefore still looking for what it is we should be doing better. Geo-awareness and geo-fencing, like electronic conspicuity, are in the EU delegated Act, so they are in UK law.
Forgive me—I cannot recall which noble Lord mentioned BVLOS, but we already have drones that can fly beyond the visual line of sight. It is illegal to do so without permission; that is already within our legislation.
I am slightly at a loss as to where we can take this further. Noble Lords mentioned areas that stray into other parts of the law. On privacy, for example, which the Government take extremely seriously, we want to stop invasions of people’s privacy, but we consider the existing legislation sufficient. Article 95 of the air navigation order specifies that equipment must not be flown over or within 150 metres of a congested area or an organised open-air assembly of more than 1,000 people, within 50 metres of any third person, or within 30 metres during take-off and landing. The 50-metre limit also applies to structures, including houses. Capturing an image from over 50 metres away is possible, I suppose, but then the GDPR regulations and the Data Protection Act come in to protect people’s privacy. Other criminal legislation, which noble Lords considered more recently in relation to voyeurism, includes the Sexual Offences Act 2003. So there is existing legislation which protects privacy. Again, I am happy to listen to opinions on where the legislation is deficient and how it specifically relates to drones, rather than to general privacy concerns.
Perhaps I can answer the Minister’s question. She asked what can be done. Very simply—if she has listened to the debate she will know—confiscate any drone that enters one of these zones.
I am aware that that is the noble Lord’s position, but I am not sure that evidence exists that if confiscation becomes part of the Bill, it automatically means that nothing bad will ever happen to drones—or that it will make any difference at all—given that the penalties are already far higher than the cost of a drone.
I come back to the point that the purpose is its deterrent value; it would also have a public relations value. Rather than simply telling the owner of a drone that he or she may not fly it in a particular way, confiscation would encourage good behaviour and serve as a public relations exercise to show that the Government are taking seriously the possibility of a catastrophic accident if a drone were to hit a civilian airliner.
I agree with the noble and gallant Lord. The Government obviously take seriously the potential of a catastrophic accident. For those kinds of offences, the deterrent is far greater than having one’s drone taken away: it is a lengthy prison sentence and an unlimited fine. I remain unconvinced at this time that the confiscation or forfeiture of a drone is an additional means of deterrent.
I am trying to think of an example of an item being forfeited purely to provide that kind of deterrent effect. I will ask my officials to look at the issue and perhaps that will produce more convincing evidence.
One can think of the example of the seizure and destruction of untaxed vehicles by public authorities. The specific deterrent is the loss of the vehicle in addition to any financial penalty.
I thank my noble friend for that good example. I am not against this; I just wonder what the evidence is. I shall ask my officials to look for more examples and to see whether it is likely to be proportionate and a deterrent, and whether the existing penalty system is sufficient to deter not only minor offences but the most serious.
The noble Viscount referred to seizure as against confiscation. Perhaps we should simply substitute confiscation for seizure.
Perhaps I may be of help. It was pointed out to me that if I did not re-license my shotgun within the statutory time limit I was given, the gun would be taken away from my premises. I do not know whether that would be for ever, but it would certainly be taken away for a long time.
I thank my noble friend for his additional data, to be added to the information I will be collecting before too long.
It is a sobering thought that, as I understand it, the Government have said that no legislation could prevent what happened at Gatwick happening again or even reduce its likelihood. That seems to be the Government’s stance. I apologise for my ignorance in advance, but can the Minister confirm that there is a report into the incident at Gatwick Airport in December 2018, and can it be made available?
What I said about Gatwick is that there is no silver bullet; there was not one piece of legislation that would have stopped Gatwick.
As a result of what happened at Gatwick, steps have been taken. So, it is not a case of just legislation stopping or not stopping it. Additional measures have been taken which make it less likely that the problems at Gatwick will arise again. At least, I hope that is the intention of the steps that have been taken.
The noble and gallant Lord is right; a number of steps have been taken. On the legislative side, we have looked carefully at what we can include. One of the steps taken as a result of Gatwick is that we asked CPNI to step up its work on counter-UAV technology and it has been carrying out tests. It did a call-out to industry; industry sent it whatever it had in detect, track and identify technology; and CPNI has been methodically working its way through it to see whether the technology works. Some of it does not.
We are looking carefully at providing a catalogue for airports to say to them, “This is the technology that works. We at CPNI, since Gatwick, have checked this technology and it works.” Those are the kinds of things we have been doing.
Looking at what would make us safer, when the Minister has had the opportunity to read the record, will she write to us to clarify the position? I believe she said to us categorically that you have to be 18 to operate a drone. The CAA has pages and pages about how to register as the flyer of a drone if you are under 13. An operator of a drone has to be 18-plus, but it is quite clear that an operator of a drone is not a flyer. The CAA states that you are an operator if
“you’re the adult responsible for an under 18 who owns a drone”—
under-18s cannot just fly a drone or a model aircraft, they can own them too—
“you’re responsible for a drone that someone else will fly”
or
“you already have a flyer ID, or an exemption, and you only need an operator ID at the moment.”
It is very lax. The point I am making is that there are things the Government can do—with all due respect, my amendment asks only for a review—without breaking new ground. The idea of registration is pretty straightforward and well established in other situations.
The noble Baroness, Lady Randerson, has just repeated back to me what I have already said. There is a registration system. It is in existence and it is very straightforward. There are two types of people who can use the registration system. The first is a person who is over 18 and is the operator of the drone. That person is responsible. The second person might be, but does not have to be, a remote pilot. Why did we do this? Why does the remote pilot thing exist? It is to make sure that people aged under 18 can fly drones. How are we going to get our young people interested in aviation and in flying model aircraft? This is not just about drones.
I am sometimes struck by the Liberal Democrats: on some points they come across as being very illiberal and on others as being very liberal indeed. I am slightly confused because the noble Baroness has literally just said back to me what I said to her earlier: that is already in place. The operator of a drone is the person who is responsible for it. That person has to register that drone, just like a car, with the CAA. I do not want to stop young people who are competent. Every young person has to take the test—I took it myself. At that point, they can fly a drone.
I do not want to prolong the discussion today, but perhaps afterwards the noble Baroness will describe to me exactly what she thinks is missing from that system, because it comes from the EU regulations. I believe the Liberal Democrats like the EU. Those are the EU regulations. They are agreed with the EU and therefore they are consistent across Europe. They make sure that there is responsibility for the drone and that young people can fly if somebody else is responsible. The noble Baroness shakes her head and says no, but I really do not want to detain the Committee any longer on something which is not wholly relevant to this amendment. We can perhaps discuss it in later groups.
I believe that I have gone into some of the details, and I hope I have been able to demonstrate that we are listening. We want to hear about what specifically we can do to make things better. The noble Baroness mentioned DJI. We, too, have been in touch with DJI and I believe it has sent a briefing to several noble Lords. It is very clear that the Bill should remain a means of ensuring safety and compliance with existing regulation because that regulation includes the EU’s implementing and delegated regulations, which UK officials helped shape. These have come into force and are in UK law.
The Government will continue to review the effectiveness of all the legislation on unmanned aircraft. It is critical to us. We will always listen to new ideas from noble Lords and stakeholders. It is important.
The Science and Technology Committee’s report Commercial and Recreational Drone Use in the UK was mentioned. I note for the record that my department stands ready to provide a response to the report—we have not yet responded—which will include references to the applicability of legislation. We will do that once the committee is reappointed.
On the basis of that explanation I hope that the noble Baroness feels able to withdraw her amendment.
My Lords, I emphasise that my amendment simply asks for a review of the current situation. While the debate has been going on, I have looked through the specifications of modern drones; they include geofencing, altitude limits, return to home, sensor-avoid technology and ADSB in all drones weighing more than 250 grams. There are various ways of controlling them, including not just an app or traditional remote controllers but even hand gestures. We are at a very important point in the development of drones.
On the analogy with registering a car, which I initiated and the Minister took me up on, looking through the CAA’s pages there does not appear to be a requirement for the registered operator to be present when a drone is flown by a child. With all due respect, larger drones, as the noble Lord said earlier, are not toys and have a huge potential impact. I think the Government are guilty of some complacency; they are certainly guilty of being behind the curve. A review would provide a good opportunity for them to come up to speed. However, I beg leave to withdraw my amendment.
My Lords, Amendment 32 follows similar lines to Amendment 31 but is much more specific. It amends the Air Navigation Order 2016 to introduce an obligation for geofencing equipment to be up to date and working. It provides that persons in charge who have electronic identification must not switch it off, and must have that identification on a register linked to their name. Currently, we still have drone users without registered drones. As I said earlier, there are good reasons why some people do not, and should not, have to register; the amendment allows for exceptions.
Basically, I have selected some simple steps that can be taken now. They do not anticipate future technological developments; they deal with what exists now. I accept that one might debate many things about how we control and use drones in the most sensible way, but these are simple, basic improvements to the control of drones by government legislation which benefit the whole of society, as I stated in my previous amendment. I do not wish to repeat what I said then. I beg to move.
My Lords, I have an almost identical amendment to that moved by the noble Baroness, Lady Randerson. I am sure that nobody wishes to hear me deliver virtually the same speech as the one delivered by the noble Baroness. I support what she has said and hope we will find that the Government do too.
I am very pleased that this group came immediately after the previous one because I too will probably be saying pretty much what I said before. Obviously, geo-awareness and electronic conspicuity are important parts of the delegated regulation. Even though the noble Baroness would perhaps like these to be introduced sooner, I am sure she would accept that, while we are in our transition period, we have to follow EU law. The two items identified in this amendment are already in UK law; there is a three-year transition period in which they will come into effect. The noble Baroness mentioned that new drones can be purchased with all these things. There are people in the model aircraft community who will be very quick to write to all noble Lords to tell them why the transition period of three years is required. I have been at the receiving end of one of their campaigns; it involved a lot of letters.
There are many reasons for the three-year transition period. While we were a member of the EU we could not change it, as the noble Baroness, being a Liberal Democrat, well knows. Those two requirements are already there so, from the point of view of the amendments, we can put them to one side. I have been through the registration issue several times: there is an operator and there is a remote pilot; the remote pilot is under the responsibility of the operator and can be under 18. It is in nobody’s interest to stop people under 18—a 16 year-old, for example—flying these vehicles.
On remote identification, once electronic conspicuity is ubiquitous, we will be able to link the identifier to the registration system. At the moment, there is literally a physical number on a drone; that will change over to electronic conspicuity once the transition period is over. The model aircraft people will have put electronic conspicuity into all their aircraft by then and the entire system should be ready to go. I hope that, given this explanation, the noble Baroness will feel able to withdraw her amendment.
My Lords, in moving Amendment 33A, I will speak also to Amendments 33B and 33C in this group. I declared earlier that I am vice-president of BALPA. It will come as no great surprise that these amendments emanate largely from BALPA. It supports the general direction of the Bill and wishes to see it enacted. It recognises that there are additional issues that will have to be addressed by the Government subsequently, but in the immediate term some of the enforcement measures, and the description of what needs to be enforced, can be clearer and more effective, and these three amendments pick out three of those. They do so in a way that does not leave to secondary legislation a description or an invention of another criminal offence. They would put it in primary legislation in the context of this Bill and of the various Acts, such as the Air Navigation Order and the police powers Acts, that already exist.
My Lords, I offer my apologies as I was not able to be here for Second Reading, which I know traditionally one is before one speaks. I draw noble Lords’ attention to my entry in the register, which lists me as the president of BALPA, an office that I am very pleased to fulfil.
I support the points made by the noble Lord, Lord Whitty. These are basically safety amendments. We are looking for a positive statement from the Government, which I am sure will be forthcoming. Amendment 33A, as the noble Lord has said, is about the safety features being inoperable. We are particularly concerned if they are disabled deliberately. Of course, sometimes they are inoperable because they just do not work but on other occasions they can be deliberately disabled, and clearly that should not be allowed.
Amendment 33B says a single person can operate only a single drone at any one time. That we see as a matter of basic safety, and we hope it will find favour. On Amendment 33C, as the noble Lord has said, regulations concerning drugs and alcohol are fairly common in industry and in all these situations. I hope the Minister will feel able to give a positive response to the amendments and read into the record the Government’s support for at least the intention of what we are seeking to do.
My Lords, I too support the thrust of these three amendments. On the first of them, though, I would need to be quite clear whether the particular safety features are a legal requirement. If they are not, I believe that they should be; but I assume that they are, which is why they are mentioned in this way. I also note in passing that the phrase
“in charge of a small unmanned aircraft”
is used. We have been talking about various ways in which those aircraft are managed. Is there somebody controlling them or are they being operated? For the sake of clarity, if we are going to use a word such as “controlling”—or any other word—it should be part of the legislation to define what is meant by the phrase or phrases that are used in it.
The amendment regarding one single unmanned aircraft could be restrictive but, to start with, that is perhaps the right way to go—not immediately to talk about allowing two or more, or even a swarm, of small unmanned aircraft to be flown. In passing, if such an arrangement were allowed, would the collective weight of the swarm be taken into account, rather than just the weight of an item within that swarm? That could affect matters, bearing in mind the weight limitations that are already in legislation.
On the third amendment, concerning alcohol, I know that the Minister talked about alcohol in the letter that she wrote. She said that if it were necessary, it would be a matter for an air navigation order, because alcohol and drugs are of such significance in the safety of aviation. The Explanatory Notes refer to anybody fulfilling an aviation function, but surely the operator or controller—the man, woman or child in charge of a small unmanned aircraft—is performing an aviation function. The Railways and Transport Safety Act 2003 seems a very appropriate place for alcohol and drugs to be covered, rather than leaving it to an air navigation order.
My Lords, I add my support for these amendments, particularly Amendment 33C. Perhaps my noble friend the Minister needs to go no further than to look at the provisions and requirements in the armed services for those who are engaged in the use of drones. Although the rules here will presumably apply to civilians, those provisions are sensible in regard to the questions of alcohol and drugs, and of control. Maybe she could find the precedent that she needs if she looks at the service agreements for those involved with operating drones in the services.
My Lords, I certainly support the thrust of what the noble Lord, Lord Whitty, seeks to achieve with his series of amendments but there are perhaps dangers in them as well, considering how these aircraft might be utilised in the future. We are back to the central difficulty with the Bill: how to future-proof it. There could be circumstances in the future where a system of small, unmanned aerial vehicles is used for inspecting pipelines, patrolling beaches—looking for those who are smuggling or bringing in illegal immigrants—or monitoring weather conditions. All sorts of things could require a system of small UAs to be operated. It is entirely conceivable and technologically possible that they could be operated at the moment by computer systems: by algorithms with a single, nominated person in charge of a system of multiple vehicles. That might be much safer than having someone with little experience looking out of the window and trying to control a single aircraft. While I sympathise with the thrust of the amendments, when my noble friend comes to her response perhaps she might care to address that point. The noble Lord, Lord Whitty, might think about it as well.
My Lords, I support these amendments. There is a contradiction at the heart of all the discussion here. Where the Minister sees youngsters having fun and flying a modern version of a model aircraft, others across the House see drones as highly technologically advanced and hugely important to our economy. We see all sorts of aspects of safety and security for the country, as drones are already misused on a fairly wide scale in certain circles. The clue is in the name. The Government call them “small unmanned aircraft”—I would rather they had used “uncrewed aircraft” as going back to the concept of “manning”, which we got out of legislation some years ago, is rather depressing, but that is beside the point. The point here is that the Government are calling them “small unmanned aircraft” and, therefore, the rules associated with aircraft need to apply. That you might have had too much to drink or might be high is now considered totally unacceptable in respect of other functions, so the noble Lord is drawing attention to some basic, sensible rules about how drones should be used. That is not to be overly onerous, because one person’s risk is another’s terrible danger. We have to be sensible about the implications for safety in this field.
I thank the noble Lord, Lord Whitty, for tabling these important safety amendments. I will take a moment to rebut the noble Baroness, Lady Randerson, whose remarks continually seem to imply, “Well, we see the danger and the Government do not.” The Government do see the danger and are looking at all ways to mitigate it, while not crushing an industry that could be incredibly important to our nation and its future.
I shall address in detail the three amendments tabled by the noble Lord, but I want to reassure him and noble friends on the Benches behind me that the Government feel that maintaining the highest standards of safety is a top priority, in relation to both manned and unmanned aircraft. That is why failing to meet requirements, such as being reasonably satisfied that a flight can safely be made, is already an offence under the Air Navigation Order. More serious offences, such as endangering the safety of an aircraft, could also apply.
For example, Amendment 33A refers to “inbuilt safety features”. They are not necessarily defined, but I take it that we should talk about the thrust of the amendment rather than the detail. As has been covered several times today, the EU regulations being transposed into UK law cover much of what is in the noble Lord’s first amendment. The inbuilt safety features to which I think he is referring, such as electronic conspicuity, are within that. The noble Lord said that they should not be capable of being turned off—indeed they cannot lawfully be turned off, because the devices would then no longer have electronic conspicuity. Under the regulations in place—we are in the transition period—those features have to be on and functioning. Turning them off is not an option, because that would be illegal.
On being under the influence of drugs or alcohol, again, this is a really important area. Under the Air Navigation Order, for any remote pilot—that is, the person flying it rather than the person who takes responsibility for it or owns it—who flies a small unmanned aircraft without being reasonably satisfied that the flight can safely be made, perhaps because they are under the influence of drugs or alcohol, there is a potential fine on conviction of up to £2,500. For further, more serious cases of unsafe flying, a pilot found guilty of recklessly or negligently causing an aircraft to endanger a property or person could be sentenced to up to two years in prison, which is quite a significant sentence for being over the limit.
However, I want to bring to noble Lords’ attention a more specific regulation: the implementing regulation. I have talked a lot today about the delegated regulation; there is also the implementing regulation, which also comes from the EU. That states specifically that a remote pilot must not fly an unmanned aircraft when under the influence of psychoactive substances or alcohol.
Therefore, while I accept that the noble Lord’s intention is to make safety changes—and safety is our highest priority—I hope that I have been able to convince the noble Lord, at least for the time being, that we already cover the issues that he hoped to raise.
My Lords, I thank the Minister for her support for the intention of the amendments. On the third amendment, on alcohol and drugs, whether or not the matter is covered by EU regulations in one sense, it is important that operators of drones understand that they should be under the same degree of discipline and self-control as pilots. It is therefore important that it appears in the same place in primary legislation. I am grateful to the Minister for spelling out that there is implementing legislation as well as the initial transposed EU legislation, which may make that clearer—but, even so, it is important that people on the ground do not regard themselves as being in a different category from those in control of aircraft in the air. I do not therefore completely accept that the matter is already covered.
On the first amendment, I say in reply to the noble and gallant Lord, Lord Craig, that, clearly, we are talking about the legally required safety regulations. Again, I hope that the Minister’s assurance that this matter is already covered stands up and I would welcome that being spelled out in letters that I could share with my colleagues. We will see whether we need to come back on that.
On single operatives, I accept, as I said in opening, that technology may get us to a situation where, for certain specific purposes, there is a single controller of a number of machines. I think that that should be dealt with as an exception, however, so that if an inspection company for a pipeline or a navigation, or for land management purposes, wants to use a single controller for several drones that are all doing the same task, or different aspects of the same task, that should probably be dealt with under an exceptional licence.
The principle should be that there should be one pilot for one machine, which is what this would require. The Minister did not comment in great detail on that: no doubt she can have another look at it. I am pleased that there seems to be general support for the principle, even if some of it may already be indirectly on the statute book through European legislation. I am very grateful, of course, for the Government’s endorsement of retaining that European legislation, in this field at least. For the moment, however, I beg leave to withdraw the amendment.
This amendment is primarily to ascertain whether the Government believe that there is a risk arising from unmanned aircraft operated from overseas and, if they do, what their strategy is for dealing with it.
At Second Reading, I referred to the power, which we know is in the Bill, allowing a police officer to require a person to ground an unmanned aircraft if they have reasonable grounds for believing that the person is controlling the unmanned aircraft. I asked if there were powers available if the unmanned aircraft were being controlled by a person operating it from outside the United Kingdom or, indeed, from within our coastal waters. It would be helpful if the Government would say whether there is a strategy for managing risks arising from unmanned aircraft operated from overseas. Do they consider there is a risk from this source at all?
I thank the noble Lord, Lord Rosser, for raising this very important point. Certainly, the Government are well aware of a wide range of risks relating to unmanned aircraft and the fact that they may, in due course, be operated from overseas. That is one of the risks we are considering.
The Government published the UK Counter-Unmanned Aircraft Strategy in October 2019. That strategy aims to safeguard the potential benefits of unmanned aircraft—because they can bring substantial benefits to the UK—by setting out our approach for countering the threat posed by their malicious or negligent use. I stress that this is very much work in progress. As all noble Lords have commented today, this technology moves very quickly, but the focus of this strategy is on keeping the UK public safe and protecting our critical national infrastructure, prisons and crowded places, irrespective of where the threat originates, in the UK or externally. It is therefore not necessary to prepare and publish an additional strategy specifically for managing a threat from overseas; that threat was considered as we prepared the strategy and remains under consideration.
As I have said many times today, the strategy recognises that there is no silver bullet: we must look at all the threats and at mitigating them all, both through the Bill before your Lordships today and through more practical elements, such as training the police, making sure that airports have access to the technology, as I explained earlier, and making sure that everybody using the technology or putting these powers in place has the training and guidance needed to respond effectively to the threat. I hope that, based on that explanation, the noble Lord will feel able to withdraw his amendment.
I thank the Minister for her response, and I beg leave to withdraw the amendment.
The two amendments in this group would require the Secretary of State to consult those involved in or affected by the incident at Gatwick Airport in December 2018 and to report on the consultation to both Houses of Parliament. What has driven these amendments more than anything else is that I am still not clear about the extent to which the Government went back to consult those who took part in the original consultation, to see whether they had anything useful to add in light of their experience of what happened at Gatwick in December 2018 that might have had relevance for what appears in the Bill we are considering today. As we know, two public consultations took place prior to this Bill and, indeed, prior to the incident in December 2018.
My noble friend Lord Tunnicliffe referred to this at Second Reading, when he asked whether there had been any consultation on the legislation with those involved in the Gatwick incident. The Government’s response was less than explicit. They said only that there had been contact with the police force
“around Gatwick and … all over the country”
and meetings with
“other stakeholders to discuss these matters in general.”
The Government also said that
“a cross-government working group … looked at stop and search powers”
and
“agreed that the focus of the powers should not only be directed towards aviation and airports but be applicable to other areas such as prisons”.—[Official Report, 27/1/20; cols. 1291-2.]
In conclusion, they said they could not “delay any longer”. One might draw an inference from that comment that few of the organisations or individuals involved in or affected by the Gatwick incident were consulted so that their potentially useful recent information or experience could be taken into consideration when determining the provisions that should be in this Bill, or what non-legislative measures might be taken.
I thank the noble Lord, Lord Rosser, for giving me the opportunity to share as much information as I have with him. I will certainly share more if he remains to be convinced. As to whether there is a report on Gatwick—my apologies for not covering this earlier—I do not know, but I will investigate and return to it in a letter to him.
This amendment is on consultation. Ministers and officials from the Department for Transport and the Home Office have engaged with a range of stakeholders throughout the development of this Bill, including but not exclusively those listed in the amendment, and will continue to do so to make sure that our legislation remains fit for purpose, ensuring that lessons learned from those directly involved in responding to unmanned aircraft incidents, whether Gatwick or others, are considered and acted upon.
In the aftermath of the Gatwick incident, the Government worked with the police, the airport and other relevant organisations to learn lessons from the response. There were debriefs, workshops and future planning meetings so that we could look at and extrapolate from the event. Since Gatwick, the counter-drone community has moved forward at pace. We have a broader understanding of the threat posed by drones—hence our work with the CPNI on detecting, tracking and identifying equipment and how that might be deployed. We also continue to consult widely. For example, the UK Counter-Unmanned Aircraft Strategy, our main focus following Gatwick and prior to this Bill, was published in October 2019 and followed ongoing engagement with both those on and not on the list because we wanted the widest input we could get.
I turn to some of the specific bodies: first, the police. For the first few months after the Gatwick incident, the counter-drone unit in the Home Office, which worked jointly with my department on this Bill, had an embed in its team from Sussex Police who was involved with Gatwick. That was extremely helpful. Since May 2019, a chief inspector from the National Police Chiefs’ Council has been embedded in this team with the national police lead for counter-drone systems, providing operational advice on how the provisions in the Bill will be put to use on the ground.
We see Gatwick Airport regularly and seek regular input from all airports because it is often the case that the larger airports will be able to react in a very different way to the smaller airports—something we have not really touched on today.
At the time, a key issue revealed by Gatwick was the question of who was responsible for the operation of equipment. That has been clarified, as the Minister has indicated, in relation to the larger airports. Have the Government yet reached agreement with smaller airports, police services and the Army throughout Britain on who is responsible for ensuring that appropriate equipment will be deployed at smaller airports if such an incident happens there?
The noble Baroness has hit a particular nail on the head. That is why the catalogue of equipment is being developed by the CPNI. It is encouraging the leasing of equipment. Airports are responsible for safety and security within their boundaries, so they are being encouraged, where they feel it is appropriate, to lease appropriate equipment. Not all airports are the same, because of different sized sites and all sorts of different reasons. There is always ongoing engagement with the Ministry of Defence and the police. Every incident is dealt with on a case-by-case basis because, interestingly, no two incursions are the same. Some can be dealt with extremely easily and others require a different approach. We are well aware of the difference.
It is not just the different sizes of airports. There are various other bits of critical national infrastructure that fall under this entire threat picture. We are cognisant of that; it is part of the work on the strategy to make sure that we have the appropriately flexible response to make sure that we can deploy resources in the best way.
We have also been engaging with the Ministry of Defence. Along with the Home Office, my department works closely with the Ministry of Defence to share learning from its military work overseas and how best to work with the counter-drone industry. We work closely with the Civil Aviation Authority, including on the development of the drone code and drone registration scheme. Since Gatwick, the code has been reviewed and the drone registration scheme has come into existence.
We have regular meetings with BALPA, which is always a pleasure, and we are very interested in what it has to say. We also see a wide range of other bodies, either regularly or on an ad hoc basis, which includes the drone and counter-drone industries, regulatory bodies, airports and other critical national infrastructure sites, academia, and in particular international partners—this is not just a UK issue, and we speak to our international colleagues about it. I had a meeting with people from the States just a couple of weeks ago; they are facing the same problems, and we should not think that we are behind the curve, because we are certainly not.
I hope that, based on that explanation, the noble Lord will feel able to withdraw his amendment.
I thank the Minister for her response, and I beg leave to withdraw the amendment.
(4 years, 9 months ago)
Lords ChamberTo ask Her Majesty’s Government what steps they have taken to assess the full implications of decision-making and prediction by algorithm in the public sector.
My Lords, first, a big thank you to all noble Lords who are taking part in the debate this evening.
Over the past few years we have seen a substantial increase in the adoption of algorithmic decision-making—ADM—and prediction across central and local government. An investigation by the Guardian last year showed that some 140 of 408 councils in the UK are using privately developed algorithmic “risk assessment” tools, particularly to determine eligibility for benefits and to calculate entitlements. Data Justice Lab research in late 2018 showed that 53 out of 96 local authorities and about a quarter of police authorities are now using algorithms for prediction, risk assessment and assistance in decision-making. In particular, we have the Harm Assessment Risk Tool—HART—system used by Durham police to predict reoffending, which was shown by Big Brother Watch to have serious flaws in the way its use of profiling data introduces bias, discrimination and dubious predictions.
Central government use is more opaque, but HMRC, the Ministry of Justice and the DWP are the highest spenders on digital, data and algorithmic services. A key example of ADM use in central government is the DWP’s much-criticised universal credit system, which was designed to be digital by default from the beginning. The Child Poverty Action Group, in its study, Computer Says “No!”, shows that those accessing their online account are not being given adequate explanation as to how their entitlement is calculated.
The UN special rapporteur on extreme poverty and human rights, Philip Alston, looked at our universal credit system a year ago and said in a statement afterwards:
“Government is increasingly automating itself with the use of data and new technology tools, including AI. Evidence shows that the human rights of the poorest and most vulnerable are especially at risk in such contexts. A major issue with the development of new technologies by the UK government is a lack of transparency.”
These issues have been highlighted by Liberty and Big Brother Watch in particular.
Even when not using ADM solely, the impact of an automated decision-making system across an entire population can be immense in terms of potential discrimination, breach of privacy, access to justice and other rights. Last March, the Committee on Standards in Public Life decided to carry out a review of AI in the public sector to understand its implications for the Nolan principles and to examine whether government policy is up to the task of upholding standards as AI is rolled out across our public services. The committee chair, the noble Lord, Lord Evans of Weardale, said on publishing the report this week:
“Demonstrating high standards will help realise the huge potential benefits of AI in public service delivery. However, it is clear that the public need greater reassurance about the use of AI in the public sector. Public sector organisations are not sufficiently transparent about their use of AI and it is too difficult to find out where machine learning is currently being used in government.”
It found that despite the GDPR, the data ethics framework, the OECD principles and the guidelines for using artificial intelligence in the public sector, the Nolan principles of openness, accountability and objectivity are not embedded in AI governance in the public sector, and should be.
The committee’s report presents a number of recommendations to mitigate these risks, including greater transparency by public bodies in the use of algorithms, new guidance to ensure that algorithmic decision-making abides by equalities law, the creation of a single coherent regulatory framework to govern this area, the formation of a body to advise existing regulators on relevant issues, and proper routes of redress for citizens who feel decisions are unfair.
It was clear from the evidence taken by our own AI Select Committee that Article 22 of the GDPR, which deals with automated individual decision-making, including profiling, does not provide sufficient protection for those subject to ADM. It contains a right to explanation provision when an individual has been subject to fully automated decision-making, but few highly significant decisions are fully automated. Often it is used as a decision support; for example, in detecting child abuse. The law should also cover systems where AI is only part of the final decision.
The May 2018 Science and Technology Select Committee report, Algorithms in Decision-Making, made extensive recommendations. It urged the adoption of a legally enforceable right to explanation that would allow citizens to find out how machine learning programs reach decisions that affect them and potentially challenge the results. It also called for algorithms to be added to a ministerial brief and for departments to publicly declare where and how they use them. Subsequently, a report by the Law Society published last June about the use of AI in the criminal justice system expressed concern and recommended measures for oversight, registration and mitigation of risks in the justice system.
Last year, Ministers commissioned the AI adoption review, which was designed to assess the ways that artificial intelligence could be deployed across Whitehall and the wider public sector. Yet the Government are now blocking the full publication of the report and have provided only a heavily redacted version. How, if at all, does the Government’s adoption strategy fit with the publication last June by the Government Digital Service and the Office for Artificial Intelligence of guidance for using artificial intelligence in the public sector, and then in October further guidance on AI procurement derived from work by the World Economic Forum?
We need much greater transparency about current deployment, plans for adoption and compliance mechanisms. In its report last year entitled Decision-making in the Age of the Algorithm, NESTA set out a comprehensive set of principles to inform human/machine interaction for public sector use of algorithmic decision-making which go well beyond the government guidelines. Is it not high time that a Minister was appointed, as was also recommended by the Commons Science and Technology Select Committee, with responsibility for making sure that the Nolan standards are observed for algorithm use in local authorities and the public sector and that those standards are set in terms of design, mandatory bias testing and audit, together with a register for algorithmic systems in use—
Could the noble Lord extend what he has just asked for by saying that the Minister should also cover those areas where algorithms defeat government policy and the laws of Parliament? I point by way of example to how dating agencies make sure that Hindus of different castes are never brought together. The algorithms make sure that that does not happen. That is wholly contrary to the rules and regulations we have and it is rather important.
My Lords, I take entirely the noble Lord’s point, but there is a big distinction between what the Government can do about the use of algorithms in the public sector and what the private sector should be regulated by. I think that he is calling for regulation in that respect.
All the aspects that I have mentioned are particularly important for algorithms used by the police and the criminal justice system in decision-making processes. The Centre for Data Ethics and Innovation should have an important advisory role in all of this. If we do not act, the Legal Education Foundation advises that we will find ourselves in the same position as the Netherlands, where there was a recent decision that an algorithmic risk assessment tool called SyRI, which was used to detect welfare fraud, breached Article 8 of the European Convention on Human Rights.
There is a problem with double standards here. Government behaviour is in stark contrast to the approach of the ICO’s draft guidance, Explaining Decisions Made with AI, which may meet the point just made by the noble Lord. Last March, when I asked an Oral Question on this subject, the noble Lord, Lord Ashton of Hyde, ended by saying
“Work is going on, but I take the noble Lord’s point that it has to be looked at fairly urgently”.—[Official Report, 14/3/19; col. 1132.]
Where is that urgency? What are we waiting for? Who has to make a decision to act? Where does the accountability lie for getting this right?
My Lords, I congratulate the noble Lord, Lord Clement-Jones, on securing this important debate. It is a topic that I know is close to his heart. I had the privilege of serving on the Select Committee on Artificial Intelligence which he so elegantly and eloquently chaired.
Algorithmic decision-making has enormous potential benefits in the public sector and it is therefore good that we are seeing growing efforts to make use of this technology. Indeed, only last month, research was published showing how AI may be useful in making screening for breast cancer more efficient. The health sector has many such examples, but algorithmic decision-making is showing potential in other sectors too.
However, the growing use of public sector algorithmic decision-making also brings challenges. When an algorithm is being used to support a decision, it can be unclear who is accountable for the outcome. Is it the front-line decision-maker, the administrator in charge of the introduction of the AI tool, or perhaps the private sector developer? We must make sure that the lines of accountability are always clear. With more complex algorithmic decision-making, it can be unclear why a decision has been made. Indeed, even the public body making the decision may be unable to interrogate the algorithm being used to support it. This threatens to undermine good administration, procedural justice and the right of individuals to redress and challenge. Finally, using past data to drive recommendations and decisions can lead to the replication, entrenchment and even the exacerbation of unfair bias in decision-making against particular groups.
What is at stake? Algorithmic decision-making is a general-purpose technology which can be used in almost every sector. The challenges it brings are diverse and the stakes involved can be very high indeed. At an individual level, algorithms may be used to make decisions about medical diagnosis and treatment, criminal justice, benefits entitlement or immigration. No less important, algorithmic decision-making in the public sector can make a difference to resource allocation and policy decisions, with widespread impacts across society.
I declare an interest as a board member of the Centre for Data Ethics and Innovation. We have spent the last year conducting an in-depth review into the specific issue of bias in algorithmic decision-making. We have looked at this issue in policing and in local government, working with civil society, central government, local authorities and police forces in England and Wales. We found that there is indeed the potential for bias to creep in where algorithmic decision-making is introduced, but we also found a great deal of willingness to identify and address these issues.
The assessment of consequences starts with the public bodies using algorithmic decision-making. They want to use new technology responsibly, but they need the tools and frameworks to do so. The centre developed specific guidance for police forces to help them trial data analytics in a way that considers the potential for bias—as well as other risks—from the outset. The centre is now working with individual forces and the Home Office to refine and trial this guidance, and will be making broader recommendations to the Government at the end of March.
However, self-assessment tools and a focus on algorithmic bias are only part of the answer. There is currently insufficient transparency and centralised knowledge about where high-stakes algorithmic decision-making is taking place across the public sector. This fuels misconceptions, undermines public trust and creates difficulties for central government in setting and implementing standards for the use of data-driven technology, making it more likely that the technology may be used in unethical ways.
The CDEI was pleased to contribute to the recently published report from the Committee on Standards in Public Life’s AI review, which calls for greater openness in the use of algorithmic decision-making in the public sector. The report is also right to call for a consistent approach to formal assessment of the consequences of introducing algorithmic decision-making, and for independent mechanisms of accountability. Developments elsewhere, such as work being done in Canada, show how this may be done.
The CDEI’s new work programme commences on 1 April. It will be proposing a programme of work exploring transparency standards and impact assessment approaches for public sector algorithmic decision-making. This is a complex area. The centre would not recommend new obligations for public bodies lightly. We will work with a range of public bodies to explore possible solutions that will allow us to know where important decisions are being algorithmically supported in the public sector, and consistently and clearly assess the impact of those algorithms.
There is a lot of good work on these issues going on across government. It is important that we all work together to ensure that these efforts deliver the right solutions.
My Lords, as we have six minutes, let me also congratulate the noble Lord, Lord Clement-Jones, on having introduced this debate so ably and say what an excellent and, if I might say so, affable chairman he was of the AI committee.
AI and machine learning are on the front line of our lives wherever we look. The Centre for Disease Control in Zhejiang province in China is deploying AI to analyse the genetic composition of the coronavirus. It has shortened a process that used to take many days to 30 minutes. Yet we—human beings—do not know how exactly that outcome was achieved. The same is true of AlphaGo Zero, which famously trained itself to beat the world champion at Go, with no direct human input whatever. That borders on what the noble Baroness, Lady Rock, said. Demis Hassabis, who created the system, said that AlphaGo Zero was so powerful because it was
“no longer constrained by the limits of human knowledge.”
That is a pretty awesome statement.
How, therefore, do we achieve accountability, as the Commons report on algorithms puts it, for systems whose reasoning is opaque to us but that are now massively entwined in our lives? This is a huge dilemma of our times, which goes a long way beyond correcting a few faulty or biased algorithms.
I welcome the Government’s document on AI and the public sector, which recognises the impact of deep learning and the huge issues it raises. California led the world into the digital revolution and looks to be doing the same with regulatory responses. One proposal is for the setting up of public data banks—data utilities—which would set standards for public data and, interestingly, integrate private data accumulated by the digital corporations with public data and create incentives for private companies to transfer private data to public uses. There is an interesting parallel experiment going on in Toronto, with Google’s direct involvement. How far are the Government tracking and seeking to learn from such innovations in different parts of the world? This is a global, ongoing revolution.
Will the Government pay active and detailed attention to the regulation of facial recognition technology and, again, look to what is happening elsewhere? The EU, for example—with which I believe we used to have some connection—is looking with some urgency at ways of imposing clear limits on such technology to protect the privacy of citizens. There is a variety of cases about this where the Information Commissioner, Elizabeth Denham, has expressed deep concern.
On a more parochial level, noble Lords will probably know about the furore around the use of facial recognition at the King’s Cross development. The cameras installed by the developer at the site incorporated facial recognition technology. Although limited in nature, it had apparently been in use for some while.
The surveillance camera code of practice states:
“There must be as much transparency in the use of a surveillance camera system as possible”.
That is not the world’s most earth-shattering statement, but it is important. The code continues by saying that clear justification must be offered. What procedures are in place across the country for that? I suspect that they are pretty minimal, but this is an awesome new technology. If you look across the world, you can see that authoritarian states have an enormous amount of day-to-day data on everybody. We do not want that situation reproduced here.
The new Centre for Data Ethics and Innovation appears to have a pivotal role in the Government’s thinking. However, there seems to be rather little detail about it so far. What is the timetable? How long will the consultation period last? Will it have regulatory powers? That is pretty important. After all, the digital world moves at a massively fast pace. How will we keep up?
Quite a range of bodies are now concerned with the impact of the digital revolution. I congratulate the Government on that, because it is an achievement. The Turing Institute seems well out in front in terms of coherence and international reputation. What is the Minister’s view of its achievements so far and how do the Government see it meshing with this diversity of other bodies that—quite rightly—have been established?
My Lords, I thank my noble friend for bringing this subject to our attention. The noble Lord, Lord Giddens, went for the big picture; I will, rather unashamedly, go back to a very small part of it.
Bias in an algorithm is quite clearly there because it is supposed to be there, from what I can make out. When I first thought about the debate, I suddenly thought of a bit of work I did about three years ago with a group called AchieveAbility. It was about recruitment for people in the neurodiverse categories—that is, those with dyslexia, dyspraxia, autism and other conditions of that nature. These people had problems with recruitment. We went through things and discovered that they were having the most problems with the big recruitment processes and the big employers, because they had psychometric tests and computers and things and these people did not fit there. The fact is that they processed information differently; for example, they might not want to do something when it came round. This was especially true of junior-level employment. When asked, “Can you do everything at the drop of a hat at a low level?”, these people, if they are being truthful, might say, “No”, or, “I’ll do it badly or slowly.”
The minute you put that down, you are excluded. There may be somewhere smaller where they could explain it. For instance, when asked, “Can you take notes in a meeting?”, they may say, “Not really, because I use a voice-operated computer and if I talk after you talk, it’s going to get a bit confusing.” But somebody else may say, “Oh no, I’m quite happy doing the tea.” In that case, how often will they have to take notes? Probably never. That was the subtext. The minute you dump this series of things in the way of what the person can do, you exclude them. An algorithm—this sort of artificial learning—does not have that input and will potentially compound this problem.
This issue undoubtedly comes under the heading of “reasonable adjustment”, but if people do not know that they have to change the process, they will not do it. People do not know because they do not understand the problem and, probably, do not understand the law. Anybody who has had any form of disability interaction will have, over time, come across this many times. People do it not through wilful acts of discrimination but through ignorance. If you are to use recruitment and selection processes, you have to look at this and build it in. You have to check. What is the Government’s process for so doing? It is a new field and I understand that it is running very fast, but tonight, we are effectively saying, “Put the brakes on. Think about how you use it correctly to achieve the things we have decided we want.”
There is positive stuff here. I am sure that the systems will be clever enough to build in this—or something that addresses this—in future, but not if you do not decide that you have to do it. Since algorithms reinforce themselves, as I understand it, it is quite possible that you will get a barrage of good practice in recruitment that gives you nice answers but does not take this issue into account. You will suddenly have people saying, “Well, we don’t need you for this position, then.” That is 20% of the population you can ignore, or 20% who will have to go round the sides. We really should be looking at this. As we are looking at the public sector here, surely the Government, in their recruitment practices at least, should have something in place to deal with this issue.
I should declare my interests. I am dyslexic. I am the president of the British Dyslexia Association and chairman of a technology company that provides assistive technology, so I have interests here but I also have some knowledge. If you are going to do this and get the best out of it, you do not let it run free. You intervene and you look at things. The noble Lord, Lord Deben, pointed out another area where intervention is needed to stop something happening that you do not want to happen. Surely we can hear about the processes in place that will mean that we do not allow the technology simply to go off and create its own logic through not interfering with it. We have to put the brakes on and create some form of direction on this issue. If we do not, we will probably undo the good work we have done in other fields.
My Lords, I declare an interest as a board member of the CDEI and a member of the Ada Lovelace Institute’s new Rethinking Data project. I am also a graduate of the AI Select Committee. I am grateful to the noble Lord, Lord Clement-Jones, for this important debate.
Almost all those involved in this sector are aware that there is an urgent need for creative regulation that realises the benefits of artificial intelligence while minimising the risks of harm. I was recently struck by a new book by Brad Smith, the president of Microsoft, entitled Tools and Weapons—that says it all in one phrase. His final sentence is a plea for exactly this kind of creative regulation. He writes:
“Technology innovation is not going to slow down. The work to manage it needs to speed up.”
Noble Lords are right to draw attention to the dangers of unregulated and untested algorithms in public sector decision-making. As we have heard, information on how and where algorithms are used in the public sector is relatively scant. We know that their use is being encouraged by government and that such use is increasing. Some practice is exemplary, while some sectors have the feel of the wild west about them: entrepreneurial, unregulated and unaccountable.
The CDEI is the Government’s own advisory body on AI and ethics, and is committed to addressing and advising on these questions. A significant first task has been to develop an approach founded on clear, high-level ethical principles to which we can all subscribe. The Select Committee called for this principle-centred approach in our call for an AI code, and at the time we suggested five clear principles. The Committee on Standards in Public Life has now affirmed the need for this high-level ethical work and has called for greater clarity on these core principles. I support this call. Only a principled approach can ensure consistency across a broad and diverse range of applications. The debate about those principles takes us to the heart of what it means to be human and of human flourishing in the machine age. But which principles should undergird our work?
Last May the UK Government signed up to the OECD principles on artificial intelligence, along with all other member countries. The CDEI has informally adopted these principles in our own work. They are very powerful and, I believe, need to become our reference point in every piece of work. They are: AI should benefit people and the planet by driving inclusive growth, sustainable development and well-being; AI systems should be designed in a way that respects the rule of law, human rights, democratic values and diversity; AI should be transparent so that people understand AI-based outcomes and can challenge them; AI systems must function in a robust, secure and safe way; and organisations and individuals developing, deploying or operating AI systems should be held accountable for their proper functioning.
In our recent recommendations to the Government on online targeting, the CDEI used the OECD principles as a lens to identify the nature and scale of the ethical problems with how AI is used to shape people’s online experiences. The same principles will flow through our second major report on bias in algorithmic decision-making, as the noble Baroness, Lady Rock, described.
Different parts of the public sector have codes of ethics distinctive to them. Developing patterns of regulation for different sectors will demand the integration of these five central principles with existing ethical codes and statements in, for example, policing, social work or recruitment.
The application of algorithms in the public sector is too wide a set of issues for a single regulator or to be left unregulated. We need core values to be translated into effective regulation, standards and codes of practice. I join others in urging the Government to work with the CDEI and others to clarify and deploy the crucial principles against which the public-centred use of AI is to be assessed, and to expand the efforts to hold public bodies and the Government themselves to account.
My Lords, I also thank the noble Lord, Lord Clement-Jones, for securing this timely and important debate. It is over only the last 20 years that we have seen the meteoric growth of artificial intelligence. When I was discussing this with a friend of mine, his response was: “What, only 20 years? I’ve got socks older than that.” That is probably too much information—I accept that—but there is no doubt that the use of this kind of AI-driven data is still very new.
The use of such technologies was still the stuff of science fiction when I was first elected as a district councillor in the West Midlands. When I was chancellor of Bournemouth University, the impact of data analytics was very apparent to me. It was my privilege in 1996 to present the Bill that established the use of the UK’s first ever DNA database. As vice-president of the British film board for 10 years, I saw the way in which AI simply transformed what we all see on our computer and cinema screens.
I was recently honoured to chair the Westminster Media Forum conference looking at online data regulation. A major theme of the conference was the need to balance—it is a difficult balance—the opportunities provided by these new technologies and the risks of harming the very people this is supposed to help.
The next decade will be like a “Strictly Come Dancing” waltz between democracy and technocracy. There has to be a partnership between government leaders and the tech company executives, with ethics at the centre. As the noble Lord, Lord Clement-Jones, said, one in three councils uses this AI-driven data to make welfare decisions, and at least a quarter of police authorities now use it to make predictions and risk assessments.
There are examples of good practice. I was born and raised in a part of the world universally regarded as paradise. It is called Birmingham—just off the M6 motorway by the gasworks.
I see there is a consensus there, and I am grateful.
I am pleased that all seven local authorities in the West Midlands Combined Authority have appointed a digital champion and co-ordinator, but in other areas evidence is emerging that some of the systems used by councils are unreliable. This is very serious, because these procedures are used to process benefit claims, prevent child abuse and even allocate school places.
Concerns have been raised by campaign groups such as Big Brother Watch about privacy and data security, but I am most worried about the Law Society’s concerns. It has highlighted the problems caused by biased data-based profiling of whole inner-city communities when trying to predict reoffending rates and anti-social behaviour. This can cause bias against black and ethnic minority communities. The potential for unconscious bias has to be taken very seriously.
As far as the National Health Service is concerned, accurate data analysis is clearly a valuable tool in serving the needs of patients, but according to a Health Foundation report of only last year, we are not investing in enough NHS data analysts. That surely is counterproductive.
I would like the Minister to answer some questions. Who exactly is responsible for making sure that standards are set and regulated for AI data use in local authorities and the public sector? Will it be Ofcom, as the new internet regulator, the Biometrics Commissioner or the Information Commissioner’s Office? Who will take responsibility? What protection is there in particular to safeguard the data of children and other groups, such as black and ethnic minorities? What are the Government planning to do about facial recognition systems, which are basically inaccurate? That is really quite frightening when you think about it.
AI and data technology are advancing so fast that the Government are essentially reactive, not proactive. Let us face it: Parliament still uses procedures set down in the 18th century. It took the Government three and a half years to pass the Brexit Bill, whereas it can take less than three and a half seconds for somebody to give consent, by the click of a mouse, to their personal data being stored and shared on the world wide web.
I do not think we should be in awe of AI, because ai is also the name of a small three-toed sloth that inhabits the forests of South America. The ai eats tree leaves and makes a high-pitched cry when disturbed.
Seriously, it is vital that there is co-ordination between national government, local authorities, academic research, industry and the media. At the heart of government data policy must be ethics. Regulation must not stifle innovation, but support it. We are at the start of an exciting new decade of 2020 vision, where democracy and technocracy must be in partnership. You cannot shake hands with a clenched fist.
My Lords, it is a pleasure to follow the noble Lord. At the heart of his speech he made a point that I violently agree with: the pace of science and technology is utterly outstripping the ability to develop public policy to engage with it. We are constantly catching up. This is not a specific point for this debate, but it is a general conclusion that I have come to. We need to reform the way in which we make public policy to allow the flexibility, within the boxes of what is permitted, for advances to be made, but to remain within a regulated framework. But perhaps that is a more general debate for another day.
I am not a graduate of the Artificial Intelligence Select Committee. I wish I had been a member of it. When its very significant and widely acclaimed report was debated in your Lordships’ House, I put my name down to speak. I found myself in a very small minority of people who had not been a member of the committee, but I did it out of interest rather than knowledge. It was an extraordinary experience. I learned an immense amount in a very short time in preparing a speech that I thought would keep my end up among all the people who had spent all this time involved in the subject. I did the same when I saw that the noble Lord, Lord Clement-Jones, had secured this debate, because I knew I was guaranteed to learn something. I did, and I thank him for his consistent tutoring of me through my following his contributions in your Lordships’ House. I am extremely grateful to him that he secured this debate, as the House should be.
I honestly was stunned to see the extensive use of artificial intelligence technology in the public services. There is no point in my trying to compete with the list of examples the noble Lord gave in opening the debate so well. It is being used to automate decision processes and to make recommendations and predictions in support of human decisions—or, more likely in many cases, human decisions are required in support of its decisions. A remarkable number of these decisions rely on potentially controversial data usage.
That leads me to my first question for the Minister. To what extent are the Government—who are responsible for all of this public service in accountability terms—aware of the extent to which potentially controversial value judgments are being made by machines? More importantly, to what degree are they certain that there is human oversight of these decisions? Another element of this is transparency, which I will return to in a moment, but in the actual decision-making process, we should not allow automated value judgments where there is no human oversight. We should insist that there is a minimum understanding on the part of the humans of what has prompted that value judgment from the data.
I constantly read examples of decisions being made by artificial intelligence machine learning where the professionals who are following them are unable to explain them to the people whose lives are being affected by them. When they are asked the second question, “Why?”, they are unable to give an explanation because the machine can see something in the data which they cannot, and they are at a loss to understand what it is. In a medical situation, there are lots of black holes in the decisions that are made, including in the use of drugs. Perhaps we should rely on the outcomes rather than always understanding. We probably would not give people drugs if we knew exactly how they all worked.
So I am not saying that all these decisions are bad, but there should be an overarching rule about these controversial issues. It is the Government’s duty at least to know how many of these decisions are being made. I want to hear an assurance that the Government are aware of where this is happening and are happy about the balanced judgments that are being made, because they will have to be made.
I push unashamedly for increased openness, transparency and accountability on algorithmic decision-making. That is the essence of the speech that opened this debate, and I agree 100% with all noble Lords who made speeches of that form. I draw on those speeches and ask the Government to ensure that where algorithms are used, this openness and transparency are required and not just permitted, because, unless it is required, people will not know why decisions about them have been made. Most of those people have no idea how to ask for the openness that they should expect.
My Lords, it is a pleasure to contribute to this debate. Unlike many noble Lords who have spoken, I am not a member of the Select Committee. However, I am a member of the Committee on Standards in Public Life. On Monday, it published its report, Artificial Intelligence and Public Standards. The committee is independent of government. I commend the report to the noble Lord, Lord Browne; he would find many of the questions he posed formulated in it, with recommendations on what should be done next.
The implications of algorithmic decision-making in the public sector for public standards, which is what the Committee has oversight of, are quite challenging. We found that there were clearly problems in the use of AI in delivering public services and in maintaining the Nolan principles of openness, accountability and objectivity. The committee, the Law Society and the Bureau of Investigative Journalism concluded that it is difficult to find out the extent of AI use in the public sector. There is a key role for the Government—I hope the Minister is picking this point up—to facilitate greater transparency in the use of algorithmic decision-making in the public sector.
The problem outlined by the noble Lord, Lord Browne, and others is what happens when the computer says no. There is a strong temptation for the person who is manipulating the computer to say, “The computer made me do it.” So, how do decision-making and accountability survive when artificial intelligence is delivering the outcome? The report of the Committee on Standards in Public Life makes it clear that public officials must retain responsibility for any final decisions and senior leadership must be prepared to be held accountable for algorithmic systems. It should never be acceptable to say, “The computer says no and that is it.” There must always be accountability and, if necessary, an appeals system.
In taking evidence, the committee also discovered that some commercially developed AI systems cannot give explanations for their decisions; they are black box systems. However, we also found that you can make significant progress in making things explainable through AI systems if the public sector which is purchasing those systems from private providers uses its market power to require that.
Several previous speakers have mentioned the problems of data bias, which is a serious concern. Certainly, our committee saw a number of worrying illustrations of that. It is worth understanding that artificial intelligence develops by looking at the data it is presented with. It learns to beat everyone in the world at Go by examining every game that has ever been played and working out what the winning combinations are.
The noble Lord, Lord Taylor, made an important point about facial recognition systems. They are very much better at recognising white faces correctly than black faces, which such systems frequently fail to distinguish, because the system is simply storing the information it has been given and using it to apply to the future. The example which came to the attention of the committee was job applications. If you give 100 job applications to an AI system and say, “Can you choose suitable ones for us to draw up an interview list?”, it will take account of who you previously appointed. It will work out that you normally appoint men and therefore the shortlist, or the long list, that the AI system delivers will mostly consist of men because it recognises that if it puts women forward, they are not likely to be successful. So, you have to have not only an absence of bias but a clear understanding of what your data will do to the system, and that means you have to have knowledge and accountability. That pertains to the point made by my noble friend Lord Addington about people with vulnerabilities—people who are, let us say, out of the normal but still highly employable, but do not happen to fit the match you have.
So, one of our key recommendations is new guidance on how the Equality Act will apply to algorithmic systems. I am pleased to say that the Equality and Human Rights Commission has offered direct support for our committee’s recommendation. I hope to hear from the Minister that that guidance is in her in-tray for completion.
The question was asked: how will anyone regulate this? Our committee’s solution to that problem is to impose that responsibility on all the current regulatory bodies. We did not think that it would be very functional to set up a separate, independent AI regulator which tried to overarch the other regulators. The key is in sensitising, informing and equipping the existing regulators in the sector to deliver. We say there is plenty of scope for some oversight of the whole process, and we very much support the view that the Centre for Data Ethics and Innovation should be that body. There is plenty of scope for more debate, but I hope the Minister will grab hold of the recommendations we have made and push forward with implementing them.
My Lords, I too thank the noble Lord, Lord Clement-Jones, for introducing this topical and very important debate, and I am delighted that we have been given six minutes rather than the previously allotted three.
As the Science and Technology Committee in the other place reported in Algorithms in Decision-making, algorithms have been used for many years to aid decision-making, but the recent huge growth of big data and machine learning has substantially increased their use in decision-making in a number of sectors, not just in the public sector but in finance, the legal system, the criminal justice system, the education system and healthcare. I shall not give examples because of the lack of time.
As every speaker has mentioned, the use of these technologies has proven controversial on grounds of bias, largely because of the algorithm developers’ selection of datasets. The question and challenge is how to recognise bias and neutralise it. In deciding upon the relevance of algorithmic output to a decision by a public sector body, the decision-maker should have the discretion to assess unthought-of relevant factors and whether the decision is one for which the algorithm was designed. Clearly there is a need for a defined code of standards for public sector algorithmic decision-making. In this regard, I refer to the recommendations of NESTA, which was mentioned by the noble Lord, Lord Clement-Jones. It recommended that every algorithm used by a public sector organisation should be accompanied by a description of its function, objectives and intended impact. If we are to ask public sector staff to use algorithms responsibly to complement or replace some aspects of their decision-making, it is vital that they have a clear understanding of what they are intended to do and in what context they might be applied.
Given the rising use of algorithms by the public sector, only a small number can be reasonably audited. In this regard, there is a recommendation that every algorithm should have an identical sand-box version for auditors to test the impact of different input conditions. As almost all noble Lords have mentioned, there is a need for more transparency about what data was used to train an algorithm, identifying whether there is discrimination on the grounds of a person’s ethnicity, religion or other factors, a point most poignantly made by the noble Lord, Lord Taylor. By way of example, if someone is denied council housing or a prisoner is denied probation, they need to know whether an algorithm was involved in that decision. If it is proven that an individual was negatively impacted by a mistaken decision made by an algorithm, a recommendation has been made by NESTA that an insurance scheme should be established by public sector bodies to ensure that citizens can receive appropriate compensation.
I shall keep it brief. In conclusion, I do not want to give the impression that I am opposed to the use of algorithms in the decision-making processes of the public sector. The report on AI by our Select Committee, which was so ably chaired by the noble Lord, Lord Clement-Jones—I was lucky enough to be a member—highlighted the huge benefits that artificial intelligence can provide to the public and private sectors. Can the Minister elaborate on the Government’s adoption strategy? With the vast majority of investments in AI coming from the United States as well as from Japan, I believe the UK should focus its efforts to lead the way in developing ethical and responsible AI.
My Lords, I am glad of the opportunity to take part in this debate. I declare my interests as set out in the register and congratulate my friend, the noble Lord, Lord Clement-Jones, on securing the debate. The only difficulty in speaking at this stage is that we are rightly and rapidly running out of superlatives for him. I shall merely describe him as the lugubrious, fully committed, credible and convivial noble Lord, Lord Clement-Jones.
AI has such potential and it is absolutely right that it is held to a higher standard. In this country—somewhat oddly, I believe—we currently allow thousands of human driver-related deaths on our roads. It is right that any autonomous vehicle is held to a kill rate of zero. But what does this mean in the public sector, in areas such as health, welfare and defence? As the noble Lord, Lord Clement-Jones, set out, over a third of our local authorities are already deploying AI. This is not something for the future. It is absolutely for the now. None of us can afford to be bystanders, no matter how innocent. Everybody has a stake, and everybody needs to have a say.
I believe the technology has such potential for the good, not least for the public good—but it is a potential, not an inevitability. This is why I was delighted to see the report by the Committee on Standards in Public Life published only two days ago, to which the noble Lord, Lord Stunell, referred. I support everything set out in that report, not least its reference to the three critical Nolan principles. I restrict my comments to what the report said about bias and discrimination. Echoing the words of the noble Lord, Lord Stunell, I agree that there is an important role for the Equality and Human Rights Commission, alongside the Alan Turing Institute and the CDEI, in getting to grips with how public bodies need to approach algorithmic intelligence.
When it comes to fairness, what do we mean—republican, democratic, libertarian or otherwise, equality of opportunity, equality of outcomes? On the technical conception of fairness there are at least 21 different definitions which computer scientists have come up with, as well as mathematical concepts within this world. What about individual, group or utility fairness and their trade-offs? If we end up with a merely utilitarian conclusion, that will be so desperately disappointing and so dangerous. I wish I could channel my inner noble Baroness, Lady O’Neill of Bengarve, who speaks far more eloquently on this than me.
The concepts and definitions are slippery but the consequences, as we have heard, are absolutely critical—in health, in education, in recruitment, in criminal profiling. We know how to make a success of this. It will come down to the recommendations of the committee’s report. It will come down to the recommendations—and not least the five principles—set out by the Artificial Intelligence Select Committee. Yes, mea culpa, I was a member of that committee, so excellently chaired, I say again, by the noble Lord, Lord Clement-Jones.
We need to consider the approach taken by the EHRC to reasonable adjustments for public bodies and the public sector equality duty; this is really about “CAGE”—“clear, applicable guidance: essential”. The prize is extraordinary. I shall give your Lordships just one example: in health, DNA analysis alone—not even diagnostics—is currently costing the NHS £1 billion. A simple algorithmic solution would mean £1 billion saved and therefore £1 billion that could go into care.
I am neither a bishop nor a boffin but I believe this: if we can harness all the positivity and all the potential of algorithms, of all the elements of the fourth industrial revolution, not only will we be able to make an incredible impact on the public good but I truly believe that we will be able to unite sceptics and evangelists behind ethical AI.
My Lords, I first came into this whole area when I was a Lords Minister in the Cabinet Office seven years ago, when we were struggling with the beginnings of Whitehall going through the digital transformation. I am struck by just how much things have moved since then, mostly in a highly desirable direction, but we are all concerned that we continue to move with the right safeguards and regulations.
I am not an expert, but I have learned a lot from one son-in-law, a financial quantitative analyst looking for when patterns do not hold as well as when they do, and from another son-in-law, a systems biologist working on mutations in RNA and DNA, not that far away from the current Chinese virus. So I follow what the experts do without being an expert myself.
I am also struck by how very little the public are aware, and how little Parliament has been involved so far. The noble and learned Lord, Lord Keen, referred the other week to us returning to the “normal relationship” between Parliament and government, by which I think he meant Parliament shutting up more and allowing the Government to get on with things. I hope that is not what will happen in this area, because it is vital that the Government carry Parliament and then the public with them as they go forward.
A study for the Centre for Data Ethics and Innovation by MORI showed very little public awareness of what is going on in the sector. As the public learned, so they got more sceptical; I think the word used was “shocked”. We know that there are major benefits in the public sector from the greater use of artificial intelligence, if introduced with appropriate safeguards and regulations. This is evidence-based policy-making, which is what we are all interested in, so we need to make sure that we get it right and carry the public with the expansion of artificial intelligence.
There is a real danger of provoking a tabloid press campaign against the expansion of AI. We have seen what happened with the campaign against the MMR vaccine and how much credibility that got among the popular media, so transparency, regulation, education and explanation are important.
We need a clear legal framework. In 2012, one of the problems was that different Whitehall departments had different legal frameworks for how they used their data and how far they could share it with other departments. We need a flexible legal framework because, as we manage to do more things with mass data and mass data sharing, we shall need to adapt the framework—another reason why Parliament needs to be actively engaged.
We need ethics training for those in the public sector—and in the private sector interacting with the public sector—using artificial intelligence, so that they are aware of the limitations and potential biases and aware also that human interaction with the data and the algorithms is essential. One of the things that worries me at present, as an avid reader of Dominic Cummings’ blog, is the extent to which he believes that scientists and mathematicians should be allowed to get on with things without anthropologists, sociologists and others saying, “Hang on a minute. It’s not always as simple as you think. Humans often react in illogical ways, and that has to be built into your system.”
My noble friend Lord Stunell talked about public/private interaction. I think we understand that, while we are concentrating here on the proper public sector, one cannot disentangle private contractors and data managers from what goes on in the public sector, so we also need to extend regulation and education to the many bright private suppliers. I had a young man come to see me this afternoon who works for one of these small companies, and I was extremely impressed by how well he understood the issues.
We also need to engage civil society. Having spent a few weeks talking to university research centres, I am very impressed by how on top of this they are. There are some very impressive centres, which we also need to encourage. The richness of the developing expertise within the UK is something which the Government certainly need to encourage and lead.
My noble friend Lord Addington suggested that we may need to put the brakes on. We have to recognise that the pace of change is not going to slow, so we have to adapt and make sure that our regulatory framework adapts. I was pleased to listen to a talk by the director of the Centre for Data Ethics and Innovation hosted by the All-Party Parliamentary Group on Data Analytics last week. It is a very good innovation, but it needs to expand and to have a statutory framework. Is the Minister able to tell us what progress is being made in providing the CDEI with a statutory framework?
There are alternative approaches for the Government to take. One, the Dominic Cummings approach, would be to use speed and impatience in pushing innovation through and dismissing criticism. The second would be to go at all deliberate speed, with careful explanation, clear rules and maximum transparency, carrying Parliament and the public with it. The young man who came to see me this afternoon talked about having digital liberalism or digital authoritarianism—that is the choice.
My Lords, I am only too glad to add my word of thanks to the humble, ordinary, flesh-and-blood noble Lord, Lord Clement-Jones, for our debate this evening. So many points have been raised, many of them the object of concern of more than one contributor to the debate. I am reminded a little of what happened when we had the big bang in the 1980s: finance went global and clever young people knew how to construct products within the financial sector that their superiors and elders had no clue about. Something like that is happening now, which makes it even more important for us to be aware and ready to deal with it.
I take up the point raised by the noble Lord, Lord Browne of Ladyton, about legislation. He said that it had to be flexible; I would add “nimble”. We must have the general principles of what we want to do to regulate this area available to us, but be ready to act immediately—as and when circumstances require it—instead of taking cumbersome pieces of legislation through all stages in both Houses. The movement is much faster than that in the real world. I recognise what has been said about the exponential scale in the advance of all these methodologies and approaches. We heard ample mention of the Nolan principles; I am glad about that.
On the right of explanation, I picked up an example that it is worth reminding ourselves of when we ask what it means to have an explanation of what is happening. It comes from Italy; perhaps other Members will be aware of it too. An algorithm was used to decide into which schools to send state schoolteachers. After some dubious decision-making by the algorithm, teachers had to fight through the courts to get some sort of transparency regarding the instructions that the algorithm had originally been given. Although the teachers wanted access to the source code of the algorithm—the building blocks, with all the instructions—the Italian Supreme Court ruled that appropriate transparency constituted only an explanation of its function and the underlying legal rules. In other words, it did not give the way in which the method was evolved or the algorithm formed; it was just descriptive rather than analytical. I believe that, if we want transparency, we have to make available the kind of nuts-and-bolts detail that goes into the algorithms that are then the object of our concern.
On accountability, who can call the shots? The noble Baroness, Lady Rock, was one of those who mentioned that. I have been reading, because it is coming up, the Government’s online harms response and the report of the House of Commons Science and Technology Committee. I am really in double-Dutch land with it all as I look at how they interleave with each other. Each says things separately and yet together. In the report that I think we will be looking at tomorrow, it is recommended that we should continue to use the good offices of the ICO to cover the way in which the online harms process is taken forward. We have also heard that that may be the appropriate body to oversee all the things that we have been discussing. While the Information Commissioner’s Office is undoubtedly brilliant and experienced, is it really the only regulator that can handle this multiplicity of tasks? Is there a need now to look at perhaps adding something in to recognise the speed at which these things are developing—to say nothing of appointing, as the report suggests, a Minister with responsibility for this area?
I am so glad to see the noble Lord, Lord Ashton, arrive in his new guise as Chief Whip, because, in a previous incarnation, we were eyeball to eyeball like this. He reminds me of course that it was on the Data Protection Bill, as it then was—an enormous, composite, huge thing—that I cut my teeth, swimming against the tide and wondering whether I would drown. It was said then that the Centre for Data Ethics and Innovation was something we should aim at. It needs to happen. Here we are, two years later, and it still has not happened; it is still an aspiration. We must move forward to a competent body that can look at the ethical dimensions of these developments. It must have teeth and it must make recommendations, and it must do so speedily. On that, I am simply repeating what others have said.
Let me finish with one word—it will go into Hansard; it will go against my reputation and I will be a finished man after saying it. When I put my computer on with certain of the things that I do—for example, the Guardian quick crossword, which is part of my morning devotions—the advertising that comes up presumably has been put there by an algorithm. But it suggests that I want to buy women’s underwear. I promise noble Lords that I have no experience in that area at all, and I want to know, as a matter of transparency, what building blocks have gone into the algorithm that has told my computer to interest me in these rather recondite aspects of womenswear.
My Lords, I am lost for words. I am really not sure how one follows that disclosure.
I echo other noble Lords in thanking the noble Lord, Lord Clement-Jones, for securing this important and interesting debate. I think that the noble Lord, Lord Browne of Ladyton, and I are the only outcasts who were not on any of the committees—the noble Lord, Lord Griffiths, indicates that he was not either—so we are an elite club.
The noble Lord, Lord Clement-Jones, rightly highlighted the widespread and rapidly growing use of algorithms, which underlines the importance of this debate. As noble Lords are aware, the UK is a world leader in relation to artificial intelligence, in terms of attracting investment, attracting talent and, crucially, in thinking through the practical and ethical challenges that the technology presents.
While driving forward innovation, we need to ensure that we maintain the public’s trust in how decisions are made about them and how their data is used, thus ensuring fairness, avoiding bias and offering transparency and accountability—which are all aspirations that noble Lords have expressed.
We want to maximise the potential that artificial intelligence offers, while ensuring that any negative implications of its use are mitigated. The Government have introduced a number of measures and interventions to ensure that we maintain public trust—something underlined by my noble friend Lady Rock—in the use of these technologies in the public sector. These include setting up the Centre for Data Ethics and Innovation; developing a data ethics framework and a guide to using artificial intelligence; and creating a draft set of guidelines for AI procurement. To be successful, we need practice to become standardised, consistent and accountable. If it does, public services have the potential, as my noble friend Lord Holmes pointed out, to become much fairer than they have been historically. I think it was the noble Lord, Lord Wallace, who said—forgive me if I have got this wrong—that we have to realise that potential.
Several noble Lords talked about the report from the Committee on Standards in Public Life, Artificial Intelligence and Public Standards. The Government have noted the recommendations on greater transparency by public bodies in the use of algorithms; new guidance to ensure that algorithmic decision-making abides by equalities law, which obviously applies in just the same way as in any other context; the creation of a single, coherent regulatory framework to govern this area; the formation of a statutory body to advise existing regulators on relevant issues; and proper routes of redress for citizens who feel that decisions are unfair. The Government will respond to these recommendations in due course, and that may offer another opportunity to reflect on these issues.
We also welcome the committee’s recommendation relating to the Centre for Data Ethics and Innovation. We were very pleased to see the committee’s endorsement of the centre’s important role in identifying gaps in the regulatory landscape. We are discussing with the centre the statutory powers it thinks it will need—a point made by the noble Lord, Lord Giddens—to deliver against those terms of reference. The right reverend Prelate the Bishop of Oxford expressed the need for a set of principles and an ethical basis for all our work. Noble Lords will be aware of the development of the data ethics framework, which includes a number of those principles. We are currently working on refreshing that framework to make it as up to date as possible for public servants who work with data.
The Committee on Standards in Public Life report, and others, have raised the issue of multiple frameworks. The Government are currently looking into developing a landing page on GOV.UK to enable users to assess the different frameworks and direct them to the one that is most appropriate and relevant to their needs. A number of noble Lords raised the importance of any framework staying agile and nimble. That is absolutely right. There is a lot more work to do on this, including looking at defining high-stakes algorithms and thinking through the mechanisms to ensure that decisions are made in an objective way. In that agility, I think all noble Lords would agree that we want to stay anchored to those key ethical principles, including, of course, the Nolan principles.
One of the foundations of our approach is the work being done on having a clear ethical framework, but we also need sound ways of implementing in practice the principles expressed in the framework. Part of our work in trying to increase transparency and accountability in the use of algorithms in AI has been the collaboration between the Office for Artificial Intelligence and the World Economic Forum’s Centre for the Fourth Industrial Revolution to codesign guidelines for AI procurement to unlock AI adoption in the public sector.
We published the draft guidelines for consultation in September 2019. The Office for Artificial Intelligence is now collaborating with other government departments to test those findings and proposals and has launched a series of pilots of the guidelines, including with four or five major government departments. Following the pilot and consultation phase, we will update the guidelines and work to design what only government could call an “AI procurement in a box” toolkit to provide other Governments and our public sector agencies with the tools they need to have the highest standards of procurement.
In an effort to bring coherence across central government departments, my honourable friend the Minister for Digital and Broadband and my right honourable friend the Minister for Universities, Science, Research and Innovation wrote a letter earlier this week to all Secretaries of State reminding them of and highlighting the work of the AI Council and the support it can give government departments.
The noble Lord, Lord Giddens, asked about the Alan Turing Institute. The Government value its work greatly, particularly some of the work being done around skills development, which is so critical in this field.
I think every noble Lord spoke about algorithmic bias. My noble friend Lord Taylor spoke about facial recognition and issues particularly among police forces. Other noble Lords referred to the work of DWP and child protection agencies. It is important that our work in trying to avoid bias—I think all noble Lords recognise that bias exists potentially within algorithms but also in more traditional decision-making—is guided by independent and expert advice. Again, we are grateful to the Centre for Data Ethics and Innovation, which as part of its current work programme is conducting a review into the potential for bias, looking particularly at policing, financial services, recruitment—this was referred to by the noble Lord, Lord Addington; I note how lucky it is that my noble friends Lady Rock and Lady Chisholm and I managed to beat the recruitment algorithm to get here—and local government. These sectors were all selected because they involve significant decisions being made about individuals. The report will be published in March and we very much look forward to its recommendations, which will inform our work in future.
I fear that I will have to write on some of the points raised, but I will do my best to cover as many as I can in the remaining time. The noble Lord, Lord St John, asked about having a duty on public bodies to declare where they are using algorithms. We hope the Centre will be looking at all of these things in the transparency aspect of its work. We are also currently reviewing the future work plan with the Centre, and obviously a number of the issues around accountability will be discussed as part of that.
In closing, I will go back to two points. One is on the potential of the use of artificial intelligence, which PricewaterhouseCoopers has estimated could contribute almost $16 trillion to the global economy; obviously the UK is one of the top three countries providing that, so that would be a huge boost to our economy. However, I also go back to what the right reverend Prelate the Bishop of Oxford said about what it means to be human. We can harness that potential in a way that enhances, rather than erodes, our humanity.