Commons Chamber
I thank my hon. Friend for the tireless work that he put into the Committee that scrutinised the Domestic Abuse Bill. I am delighted to confirm that all tier 1 local authorities have set up domestic abuse local partnership boards, in line with the Act, to provide them with advice on the provision of the specialist services that are such an important part of that landmark Act. I genuinely encourage all Members across the House to engage with those boards to see what they are doing for their local communities and how they are helping their constituents.
I welcome the Secretary of State’s defence of free speech earlier today, but the truth is that free speech is under attack in our courts. Tom Burgis is appearing in court today against oligarchs who are seeking to silence him. When will the Secretary of State bring forward a defence against strategic lawsuits against public participation—SLAPPs? If we want to live in truth, we need SLAPP-back laws now.
Public Bill Committees
Thank you, Mr Streeter, and what a wonderful birthday present it is to be serving on the Committee.
It is a joy, actually, to be able to agree with the Opposition on the principle that equality applies not only to decisions made by human beings or with human input, but to decisions made solely by computers and algorithms. On that, we are very much agreed. The reason that we do not support the new clauses is that we believe that the Equality Act already protects workers against direct or indirect discrimination by computer or algorithm-based decisions. As the right hon. Member for Birmingham, Hodge Hill rightly said, the Act was passed with cross-party consensus.
The Act is clear that in all cases, the employer is liable for the outcome of any of their actions, or those of their managers or supervisors, or those that are the result of a computer, algorithm or mechanical process. If, during a recruitment process, applications from people with names that suggest a particular ethnicity were rejected for that reason by an algorithm, the employer would be liable for race discrimination, whether or not they designed the algorithm with that intention in mind.
The right hon. Gentleman placed a great deal of emphasis on advertising and, again, we share his concerns that employers could seek to treat potential employees unfairly and unequally. The Equality and Human Rights Commission publishes guidance for employers to ensure that there is no discriminatory conduct and that fair and open access to employment opportunities is made clear in the way that employers advertise posts.
The same principle applies in the provision of services. An automated process that intentionally or unintentionally denies a service to someone because of a protected characteristic will lay the service provider open to a claim under the Act, subject to any exceptions.
I am grateful to the Minister for giving way, not least because it gives me the opportunity to wish her a happy birthday. Could she remind the Committee how many prosecutions there have been for discriminatory advertising because employers chose to target their adverts?
If I may, I will write to the right hon. Gentleman with that precise number, but I know that the Equality and Human Rights Commission is very clear in its guidance that employers must act within the law. The law is very clear that there are to be no direct or indirect forms of discrimination.
The hon. Member for Cambridge raised the GDPR, and talked about looking forwards not backwards. Article 5(1)(a) requires processing of any kind to be fair and transparent. Recital 71 draws a link between ensuring that processing is fair and minimising discriminatory effects. Article 35 of the GDPR requires controllers to undertake data protection impact assessments for all high-risk activities, and article 36 requires a subset of those impact assessments to be sent to the Information Commissioner for consultation prior to the processing taking place. The GDPR also gives data subjects the tools to understand the way in which their data has been processed. Processing must be transparent, details of that processing must be provided to every data subject, whether or not the data was collected directly from them, and data subjects are entitled to a copy of the data held about them.
When automated decision-making is engaged there are yet more safeguards. Controllers must tell the data subject, at the point of collecting the data, whether they intend to make such decisions and, if they do, provide meaningful information about the logic involved, as well as the significance and the envisaged consequences for the data subject of such processing. Once a significant decision has been made, that must be communicated to the data subject, and they must be given the opportunity to object to that decision so that it is re-taken by a human being.
We would say that the existing equality law and data protection law are remarkably technologically agnostic. Controllers cannot hide behind algorithms, but equally they should not be prevented from making use of them when they can do so in a sensible, fair and productive way.
I would be guided by the view of the Equality and Human Rights Commission, which oversees conduct in this area. I have no doubt that the Information Commissioner and the Equality and Human Rights Commission are in regular contact. If they are not, I very much hope that this will ensure that they are.
We are clear in law that there cannot be such discrimination as has been discussed. We believe that the framework of the law is there, and that the Information Commissioner’s Office and the Equality and Human Rights Commission, with their respective responsibilities, can help, advise and cajole, and, at times, enforce the law accordingly. I suspect that we will have some interesting times ahead of us with the release of the gender pay gap information. I will do a plug now, and say that any company employing more than 250 employees should abide by the law by 4 April. I look forward to reviewing the evidence from that exercise next month.
We are concerned that new clauses 7 and 8 are already dealt with in law, and that new clauses 9 to 11 would create an entirely new regulatory structure just for computer-assisted decision-making in the workplace, layered on top of the existing requirements of both employment and data protection law. We want the message to be clear to employers that there is no distinction between the types of decision-making. They are responsible for it, whether a human being was involved or not, and they must ensure that their decisions comply with the law.
Having explained our belief that the existing law meets the concerns raised by the right hon. Member for Birmingham, Hodge Hill, I hope he will withdraw the new clause.
I think it was in “Candide” that Voltaire introduced us to the word “Panglossian”, and we have heard a rather elegant and Panglossian description of a perfect world in which all is fine in the labour market. I am much more sceptical than the Minister. I do not think the current law is sufficiently sharp, and I am concerned that the consequence of that will be injustice for our constituents.
The Minister raised a line of argument that it is important for us to consider. The ultimate test of whether the law is good enough must be what is actually happening out there in the labour market. I do not think it is good enough; she thinks it is fine. On the nub of the argument, a few more facts might be needed on both sides, so we reserve the right to come back to the issue on Report. This has been a useful debate. I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
New Clause 13
Review of Electronic Commerce (EC Directive) Regulations
“(1) The Secretary of State shall lay before both Houses of Parliament a review of the application and operation of the Electronic Commerce (EC Directive) Regulations 2002 in relation to the processing of personal data.
(2) A review under subsection (1) shall be laid before Parliament by 31 January 2019.”—(Liam Byrne.)
This new clause would order the Secretary of State to review the application and operation of the Electronic Commerce (EC Directive) Regulations 2002 in relation to the processing of data and lay that review before Parliament before 31 January 2019.
Brought up, and read the First time.
Yes. My hon. Friend has done an extraordinary job of exposing that minor scandal. I am surprised that it has not had more attention in the House, but hopefully once the Bill has passed it is exactly the kind of behaviour that we can begin to police rather more effectively.
I am sure that Ministers will recognise that there is a need for this. No doubt their colleagues in the Department for Education are absolutely all over it. I was talking to a headteacher in the Minister’s own constituency recently—an excellent headteacher, in an excellent school, who is a personal friend. The horror with which headteachers regard the arrival of the GDPR is something to behold. Heaven knows, our school leaders and our teachers have enough to do. I call on Ministers to make their task, their lives, and their mission that bit easier by accepting the new clause.
Our schools handle large volumes of sensitive data about the children they educate. Anyone who has any involvement with the education system, either personally through their families, on their mobile phone apps, or in a professional capacity as constituency MPs, is very conscious of the huge responsibilities that school leaders have in handling that data properly and well, and in accordance with the law. As data controllers in their own right, schools and other organisations in the education system will need to ensure that they have adequate data-handling policies in place to comply with their legal obligations under the new law.
Work is going on already. The Department for Education has a programme of advice and education for school-leaders, which covers everything from blogs, a guidance video, speaking engagements, and work to encourage system suppliers to be proactive in helping schools to become GDPR-compliant. Research is also being undertaken with parents about model privacy notices that will help schools to make parents and pupils more aware of the data about children used in the sector. The Department for Education is also shaping a toolkit that will bring together various pieces of guidance and best practice to address the specific needs of those who process education data. In parallel, the Information Commissioner has consulted on guidance specifically addressing issues about the fair and lawful processing of children’s data. Everyone is very alive to the issue of protecting children and their data.
At this point, the Government want to support the work that is ongoing—already taking place—and the provisions on guidance that are already in the Bill. Our concern is that legislating for a code now could be seen as a reason for schools to wait and see, rather than continuing their preparations for the new law. But it may be that in due course the weight of argument swings in favour of a sector-specific code of practice. That can happen. It does not have to be in the Bill. It can happen because clause 128 provides that the Secretary of State may require the Information Commissioner to prepare additional codes of practice for the processing of personal data, and the commissioner can issue further guidance under her own steam, using her powers under article 57 of the GDPR, without needing any direction from the Secretary of State.
I hope that the ongoing work reassures the right hon. Gentleman and that he will withdraw the new clause at this stage.
I am reassured by that and I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
New Clause 17
Personal data ethics advisory board and ethics code of practice
‘(1) The Secretary of State must appoint an independent Personal Data Ethics Advisory Board (“the board”).
(2) The board’s functions, in relation to the processing of personal data to which the GDPR and this Act applies, are—
(a) to monitor further technical advances in the use and management of personal data and their implications for the rights of data subjects;
(b) to monitor the protection of the individual and collective rights and interests of data subjects in relation to their personal data;
(c) to ensure that trade-offs between the rights of data subjects and the use and management of personal data are made transparently, inclusively, and with accountability;
(d) to seek out good practices and learn from successes and failures in the use and management of personal data;
(e) to enhance the skills of data subjects and controllers in the use and management of personal data.
(3) The board must work with the Commissioner to prepare a data ethics code of practice for data controllers, which must—
(a) include a duty of care on the data controller and the processor to the data subject;
(b) provide best practice for data controllers and processors on measures, which in relation to the processing of personal data—
(i) reduce vulnerabilities and inequalities;
(ii) protect human rights;
(iii) increase the security of personal data; and
(iv) ensure that the access, use and sharing of personal data are transparent, and the purposes of personal data processing are communicated clearly and accessibly to data subjects.
(4) The code must also include guidance in relation to the processing of personal data in the public interest and the substantial public interest.
(5) Where a data controller or processor does not follow the code under this section, the data controller or processor is subject to a fine to be determined by the Commissioner.
(6) The board must report annually to the Secretary of State.
(7) The report in subsection (6) may contain recommendations to the Secretary of State and the Commissioner relating to how they can improve the processing of personal data and the protection of data subjects’ rights by improving methods of—
(a) monitoring and evaluating the use and management of personal data;
(b) sharing best practice and setting standards for data controllers; and
(c) clarifying and enforcing data protection rules.
(8) The Secretary of State must lay the report made under subsection (6) before both Houses of Parliament.
(9) The Secretary of State must, no later than one year after the day on which this Act receives Royal Assent, lay before both Houses of Parliament draft regulations in relation to the functions of the Personal Data Ethics Advisory Board as listed in subsections (2), (3), (4), (6) and (7) of this section.
(10) Regulations under this section are subject to the affirmative resolution procedure.’—(Darren Jones.)
This new clause would establish a statutory basis for a Data Ethics Advisory Board.
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
I will touch on this new clause only very briefly, because I hope the Minister will put my mind at rest with a simple answer. For some time, there has been concern that data collected by the police through automatic number plate recognition technology is not adequately ordered, organised or policed by a code of practice. A code of practice is probably required to put the police well and truly within the boundaries of the Police and Criminal Evidence Act 1984, the Data Protection Act 1998 and the Bill.
With this new clause, we are basically asking the Secretary of State to issue a code of practice in connection with the operation by the police of ANPR systems under subsection (1), and we ask that it conform to section 67 of the Police and Criminal Evidence Act 1984. I hope the Minister will just say that a code of practice is on the way so we can safely withdraw the new clause.
I hope Committee members have had the chance to see my response to the questions of the hon. Member for Sheffield, Heeley on Tuesday about ANPR, other aspects of surveillance and other types of law enforcement activity.
I assure the right hon. Member for Birmingham, Hodge Hill that ANPR data is personal data and is therefore caught by the provisions of the GDPR and the Bill. We recognise the need to ensure the use of ANPR is properly regulated. Indeed, ANPR systems are governed by not one but two existing codes of practice. The first is the code issued by the Information Commissioner, exercising her powers under section 51 of the Data Protection Act 1998. It is entitled “In the picture: A data protection code of practice for surveillance cameras and personal information”, and was published in June 2017. It is clear that it covers ANPR. It also refers to data protection impact assessments, which we debated last week. It clearly states that where the police and others use or intend to use an ANPR system, it is important that they
“undertake a privacy impact assessment to justify its use and show that its introduction is proportionate and necessary.”
The second code is brought under section 29 of the Protection of Freedoms Act 2012, which required the Secretary of State to issue a code of practice containing guidance about surveillance camera systems. The “Surveillance camera code of practice”, published in June 2013, already covers the use of ANPR systems by the police and others. It sets out 12 guiding principles for system operators. Privacy is very much a part of that. The Protection of Freedoms Act established the office of the Surveillance Camera Commissioner, who has a number of statutory functions in relation to the code, including keeping its operation under review.
In addition, a published memorandum of understanding between the Surveillance Camera Commissioner and the Information Commissioner sets out how they will work together. We also have the general public law principles of the Human Rights Act 1998 and the European convention on human rights. I hope that the two codes I have outlined, the Protection of Freedoms Act and the Human Rights Act reassure the right hon. Gentleman, and that he will withdraw his new clause.
I am indeed mollified. I beg to ask leave to withdraw the clause.
Clause, by leave, withdrawn.
New Clause 21
Targeted dissemination disclosure notice for third parties and others (No. 2)
“In Schedule 19B of the Political Parties, Elections and Referendums Act 2000 (Power to require disclosure), after paragraph 10 (documents in electronic form) insert—
10A (1) This paragraph applies to the following organisations and individuals—
(a) a recognised third party (within the meaning of Part 6);
(b) a permitted participant (within the meaning of Part 7);
(c) a regulated donee (within the meaning of Schedule 7);
(d) a regulated participant (within the meaning of Schedule 7A);
(e) a candidate at an election (other than a local government election in Scotland);
(f) the election agent for such a candidate;
(g) an organisation or a person notified under subsection 2 of this section;
(h) an organisation or individual formerly falling within any of paragraphs (a) to (g); or
(i) the treasurer, director, or another officer of an organisation to which this paragraph applies, or has been at any time in the period of five years ending with the day on which the notice is given.
(2) The Commission may under this paragraph issue at any time a targeted dissemination disclosure notice, requiring disclosure of any settings used to disseminate material which it believes were intended to have the effect, or were likely to have the effect, of influencing public opinion in any part of the United Kingdom, ahead of a specific election or referendum, where the platform for dissemination allows for targeting based on demographic or other information about individuals, including information gathered by information society services.
(3) This power shall not be available in respect of registered parties or their officers, save where they separately and independently fall into one or more of categories (a) to (i) of sub-paragraph (1).
(4) A person or organisation to whom such a targeted dissemination disclosure notice is given shall comply with it within such time as is specified in the notice.”
This new clause would amend the Political Parties, Elections and Referendums Act 2000 to allow the Electoral Commission to require disclosure of settings used to disseminate material where the platform for dissemination allows for targeting based on demographic or other information about individuals.—(Liam Byrne.)
Brought up, and read the First time.
Public Bill Committees
We are rattling through the Bill this morning and will soon reach clause 109, to which we have tabled some amendments. Clause 96, within chapter 3 of part 4, on intelligence services processing, touches on the right not to be subject to automated decision making. I do not want to rehearse the debate that we shall have later, but I think that this is the appropriate point for an explanation from the Minister. Perhaps she will say something about the kind of administration that the clause covers, and its relationship, if any—there may not be one, but it is important to test that question—to automated data-gathering by our intelligence services abroad, and the processing and use of that data.
The specific instance that I want to take up concerns the fact that about 700 British citizens have gone to fight in foreign conflicts—for ISIS in particular. The battery of intelligence-gathering facilities that we have allows us to use remote data-sensing to detect, track and monitor them, and to assemble pictures of their patterns of life and behaviour. It is then possible for our intelligence services to do stuff with those data and patterns, such as transfer them to the military or to foreign militaries in coalitions of which we are a member. For the benefit of the Committee, will the Minister spell out whether the clause, and potentially clause 97, will bite on that kind of capability? If not, where are they aimed?
An intelligence services example under clause 96 would be a case where the intelligence services wanted to identify a subject of interest who might have travelled to Syria in a certain time window and where the initial selector was age, because there was reliable reporting that the person being sought was a certain age. The application of the age selector would produce a pool of results, and a decision may be taken to select that pool for further processing operations, including the application of other selectors. That processing would be the result of a decision taken solely on the basis of automated processing.
I do not think the clause actually says anything about age selection. How do we set boundaries around the clause? Let us say that minors—people under the age of 18—want to travel to Syria or some other war zone. Is the Minister basically saying that the clause will bite on that kind of information and lead to a decision chain that results in action to intervene? If that is the case, will she say a little more about the boundaries around the use of the clause?
The right hon. Gentleman asked me for an example and I provided one. Age is not in the clause because the Government do not seek in any way to create burdens for the security services when they are trying to use data to protect this country. Given his considerable experience in the Home Office, he knows that it would be very peculiar, frankly, for age to be listed specifically in the clause. The clause is drafted as it is, and I remind him that it complies with Council of Europe convention 108, which is an international agreement.
The point is that the clause does create a burden. It does not detract from a burden; it creates an obligation on intelligence services to ensure that there is not automatic decision making. We seek not to add burdens, but to question why the Minister is creating them.
The clause complies with Council of Europe convention 108. I do not know whether I can say any more.
I think we have come to a natural conclusion.
Question put and agreed to.
Clause 96 accordingly ordered to stand part of the Bill.
Clause 97
Right to intervene in automated decision-making
Amendments made: 41, in clause 97, page 56, line 34, leave out “21 days” and insert “1 month”.
Clause 97(4) provides that where a controller notifies a data subject under Clause 97(3) that the controller has taken a decision falling under Clause 97(1) (automated decisions required or authorised by law), the data subject has 21 days to request the controller to reconsider or take a new decision not based solely on automated processing. This amendment extends that period to one month.
Amendment 42, in clause 97, page 56, line 39, leave out “21 days” and insert “1 month”.—(Victoria Atkins.)
Clause 97(5) provides that where a data subject makes a request to a controller under Clause 97(4) to reconsider or retake a decision based solely on automated processing, the controller has 21 days to respond. This amendment extends that period to one month.
Clause 97, as amended, ordered to stand part of the Bill.
Clause 98
Right to information about decision-making
Question proposed, That the clause stand part of the Bill.
This is a vexed and difficult area. The subject of the clause is the right to information about decision making, which is very difficult when it comes to the intelligence services, and I have had experiences, as have others I am sure, of constituents who come along to an advice bureau and claim to have been subject either to intelligence services investigation or, in some cases, to intelligence services trying to recruit them. Sometimes—this is not unknown—an individual’s immigration status might be suspect. I had one of these cases about five or six years ago, where the allegation was that the intelligence services were conspiring with the UK Border Agency and what at that time was the Identity and Passport Service to withhold immigration documents to encourage the individual to become a source. The challenge for Members of Parliament trying to represent such individuals is that they will get a one-line response when they write to the relevant officials to say, “I am seeking to represent my constituent on this point.”
A right to information about decision-making will be created under clause 98. I ask the Minister, therefore, when dealing with very sensitive information, how is this right going to be exercised and who is going to be the judge of whether that right has been fulfilled satisfactorily? There is no point approving legislation that is superfluous because it will have no effect in the real world. The clause creates what looks like a powerful new right for individuals to request information about decisions taken by the intelligence agencies, which might have a bearing on all sorts of things in their lives. Will the Minister explain how, in practice, this right is to become a reality?
If I may give an example, where a terrorist suspect is arrested and believes he is the subject of MI5 surveillance, revealing to them whether they were under surveillance and the process by which the suspect was identified as a potential terrorist would clearly aid other terrorists in avoiding detection. The exercise of the right is subject to the operation of the national security exemption, which was debated at length last week. It might be that, in an individual case, the intelligence services need to operate the “neither confirm nor deny” principle, and that is why the clause is drafted as it is.
The clause is drafted in the opposite way. Subsection (1)(b) says that
“the data subject is entitled to obtain from the controller, on request, knowledge of the reasoning underlying the processing.”
In other words, the data subject—in this case, the individual under surveillance—has the right to obtain from the controller, in the hon. Lady’s example of the intelligence agencies, knowledge of the reasoning underlying the way their data was processed.
Let us take, for example, a situation where CCTV footage was being captured at an airport or a border crossing and that footage was being run through facial recognition software, enabling special branch officers to intervene and intercept that individual before they crossed the border. That is an example of where information is captured and processed, and action then results in an individual, in this case, being prevented from coming into the country.
I have often had cases of constituents who have come back from Pakistan or who might have transitioned through the middle east, perhaps Dubai, and they have been stopped at Birmingham airport because special branch officers have said their name is on a watch list. Watch lists are imperfect—that is probably a fairly good description. They are not necessarily based on the most reliable and up-to-date information, but advances in technology allow a much broader and more wide-ranging kind of interception to take place at the border. If we are relying not on swiping someone’s passport and getting a red flag on a watch list but on processing data coming in through CCTV and running it through facial recognition software, that is a powerful new tool in the hands of the intelligence agencies. Subsection (1)(b) will give one of my constituents the right to file a request with the data controller—presumably, the security services—and say, “Look, I think your records are wrong here. You have stopped me on the basis of facial recognition software at Birmingham airport; I want to know the reasoning behind the processing of the data.”
If, as the Minister says, the response from the data controller is, “We can neither confirm nor deny what happened in this case,” then, frankly, the clause is pretty nugatory. Will the Minister give an example of how the right is going to be made a reality? What are the scenarios in which a constituent might be able to exercise this right? I am not interested in the conventions and international agreements this happy clause tends to agree with, but I would like to hear a case study of how a constituent could exercise this right successfully.
The right hon. Gentleman says he is not interested in conventions and so on, but I am afraid that is the legal framework within which Parliament and this country have to act. The clause confers—as do the other clauses in chapter 3—rights upon citizens, but those rights are subject, as they must be, to the national security exemption set out in chapter 6, clause 110.
I am slightly at a loss as to where the right hon. Gentleman wishes to go with this. I am not going to stand here and dream up scenarios that may apply. The rights and the national security exemption are set out in the Bill; that is the framework we are looking at, and that is the framework within which the security services must operate. Of course one has a duty to one’s constituents, but that is balanced with a duty to one’s country. This is precisely the section of the Bill that is about the balance between the rights of our citizens and the absolute necessity for our security services to protect us and act in our interests when they are required to do so.
I am not asking the Minister to dream up a scenario in Committee. All good Ministers understand every single dimension of a clause they are required to take through the House before they come anywhere near a Committee, because they are the Bill Minister.
We are not debating here whether the security services have sufficient power; we had that debate earlier. We are talking about a power and a right that are conferred on data subjects under subsection (1)(b). I am slightly concerned that the Minister, who is responsible for this Bill and this matter of policy, has not been able to give us a well-rehearsed scenario, which presumably she and her officials will have considered before the Bill came anywhere near to being drafted. How will this right actually be exercised by our constituents? It could be that the Committee decides, for example, that the rights we are conferring on the data subject are too sweeping. We might be concerned that there are insufficient safeguards in place for the intelligence agencies to do their jobs. This is a specific question about how data subjects, under the clause, are going to exercise their power in a way that allows the security services to do their job. That is not a complicated request; it is a basic question.
As I say, the framework is set out in the Bill, and the exemption exists in the Bill itself. I have already given an example about a terror suspect. With respect, I am not going to enter into this debate about the right hon. Gentleman’s constituent—what he or she might have requested, and so on. The framework is there; the right is there, balanced with the national security exemption. I am not sure there is much more I can add.
The Minister says she does not want to enter into a debate. I kindly remind her that she is in a debate. The debate is called—
I am grateful, Mr Hanson, for that complete clarity. This is the debate that we are having today: how will clause 98(1)(b) become a reality? It creates quite powerful rights for a data subject to seek information from the intelligence agencies. I gave an example from my constituency experience of how the exercise of this right could run into problems.
All I ask of the Minister responsible for the Bill and this area of policy, who has thought through the Bill with her officials and is asking the Committee to agree the power she is seeking to confer on our constituents, and who will have to operate the policy in the real world after the Bill receives Royal Assent, is that she give us a scenario of how the rights she is conferring on a data subject will function in the real world.
However, Mr Hanson, I think we might have exhausted this debate. It is disappointing that the Minister has not been able to come up with a scenario. Perhaps she would like to intervene now to give me an example.
Part 4 sets out a number of rights of data subjects, clause 98 being just one of them. This part of the Bill reflects the provisions of draft modernised convention 108, which is an international agreement, and the Bill faithfully gives effect to those provisions. A data subject wishing to exercise the right under clause 98 may write to that effect to the Security Service, which will then either respond in accordance with clause 98 or exercise the national security exemption in clause 110. That is the framework.
That is probably about as much reassurance as the Committee is going to get this afternoon. It is not especially satisfactory or illuminating, but we will not stand in the way and we will leave the debate there, Mr Hanson.
Before I start, I want to clarify what the hon. Gentleman has just said about adequacy decisions. Canada does have an adequacy decision from the EU for transfers to commercial organisations that are subject to the Canadian Personal Information Protection and Electronic Documents Act. I am not sure that security services are covered in that adequacy decision, but it may be that we will get assistance elsewhere.
As the right hon. Member for Birmingham, Hodge Hill is aware, amendments 159, 160 and new clause 14 were proposed by a campaigning organisation called Reprieve in its recent briefing on the Bill. They relate to concerns about the sharing of personal data with the US and seek to apply the data sharing protections designed specifically for law enforcement data processing, provided for in part 3 of the Bill, to processing by the intelligence services, provided for in part 4. That is, they seek to transpose all the law enforcement safeguards onto the intelligence services. However, such safeguards are clearly not designed for, and do not provide, an appropriate or proportionate basis for the unique nature of intelligence services processing, which we are clear is outside the scope of EU law.
Before I get into the detail of these amendments, it is important to put on record that the international transfer of personal data is vital to the intelligence services’ ability to counter threats to national security. Provision of data to international partners bolsters their ability to counter threats to their security and that of the UK. In a globalised world, threats are not necessarily contained within one country, and the UK cannot work in isolation. As terrorists do not view national borders as a limit to their activities, the intelligence services must be in a position to operate across borders and share information quickly—for example, about the nature of the threat that an individual poses—to protect the UK.
In the vast majority of cases, intelligence sharing takes place with countries with which the intelligence services have long-standing and well-established relationships. In all cases, however, the intelligence services apply robust necessity and proportionality tests before sharing any information. The inherent risk of sharing information must be balanced against the risk to national security of not sharing such information.
Will the Minister tell us more about the oversight and scrutiny for the tests that she has just set out that the intelligence services operate? Perhaps she will come on to that.
I am coming on to that.
Any cross-border sharing of personal data must be consistent with our international obligations and be subject to appropriate safeguards. On the first point, the provisions in clause 109 are entirely consistent with the requirements of the draft modernised Council of Europe data protection convention—convention 108—on which the provisions of part 4 are based, and which is pending international agreement.
The provisions in the convention are designed to provide the necessary protection for personal data in the context of national security. The Bill already provides that the intelligence services can make transfers outside the UK only when necessary and proportionate for the limited purposes of the services’ statutory functions, which include the protection of national security; for the purpose of preventing or detecting serious crime; or for the purpose of criminal proceedings.
In addition, on the point the right hon. Gentleman just raised, the intelligence services are already under statutory obligations in the Security Service Act 1989 and the Intelligence Services Act 1994 to ensure that no information is disclosed except so far as is necessary for those functions or purposes. All actions by the intelligence services, as with all other UK public authorities, must comply with international law.
It is absolutely vital. What is more, not only is there a framework in the Bill for overseeing the work of the intelligence services, but we have the added safeguards of the other legislation that I set out. The burden on the security services and the thresholds they have to meet are very clear, and they are set out not just in the Bill but in other statutes.
I hope that I have provided reassurance that international transfers of personal data by the intelligence services are appropriately regulated both by the Bill, which, as I said, is entirely consistent with draft modernised convention 108 of the Council of Europe—that is important, because it is the international agreement that will potentially underpin the Bill and agreements with our partners and sets out agreed international standards in this area—and by other legislation, including the 2016 Act. We and the intelligence services are absolutely clear that to attempt to impose, through these amendments, a regime that was specifically not designed to apply to processing by the intelligence services would be disproportionate and may critically damage national security.
I am sure that it is not the intention of the right hon. Member for Birmingham, Hodge Hill to place unnecessary and burdensome obstacles in the way of the intelligence services in performing their crucial function of safeguarding national security, but, sadly, that is what his amendments would do. I therefore invite him to withdraw them.
I am grateful to the Minister for that explanation and for setting out with such clarity the regime of oversight and scrutiny that is currently in place. However, I have a couple of challenges.
I was slightly surprised that the Minister said nothing about the additional risks created by the change in rules of engagement by the United States. She rested some of her argument on the Security Service Act 1989 and the Intelligence Services Act 1994, which, as she said, require that any transfers of information are lawful and proportionate. That creates a complicated set of ambiguities for serving frontline intelligence officers, who have to make fine judgments and, in drafting codes of practice, often look at debates such as this one and at the law. However, the law is what we are debating. Where the Bill changed the law to create a degree of flexibility, it would create a new risk, and that risk would be heightened by the change in the rules of engagement by one of our allies.
The Minister may therefore want to reflect on a couple of points. First, what debate has there been about codes of practice? Have they changed given the increased surveillance capacity that we have because of the development of our capabilities? How have they changed in the light of the new rules of engagement issued by President Trump?
Yes, and it is not just me—the Court of Appeal is arguing that. The Court of Appeal's summary in 2013 was that there was a risky legal ambiguity. Its conclusion, that it is certainly not clear that UK personnel are immune from criminal liability for their involvement in these programmes, is a concern for us all. The Joint Committee on Human Rights reflected on that in 2016, and it concluded pretty much the same thing:
“In our view, we owe it to all those involved in the chain of command for such uses of lethal force…to provide them with absolute clarity about the circumstances in which they will have a defence against any possible future criminal prosecution, including those which might originate from outside the UK.”
This is not a theoretical legal threat to our armed forces and intelligence agencies; this is something that the Court of Appeal and the Joint Committee on Human Rights have expressed worries about.
The new powers and capabilities of our intelligence agencies arguably create the need for greater levels of oversight. This is a pressing need because of the operational policy of one of our allies. We owe it to our armed forces and intelligence agencies to ensure a regime in which they can take clear, unambiguous judgments where possible, and where they are, beyond doubt, safe from future legal challenge. It is not clear to me that the safeguards that the Minister has set out meet those tests.
Perhaps the Minister will clarify one outstanding matter, about convention 108, on which she rested much of her argument. Convention 108 is important. It was written in 1981. The Minister told the Committee that it had been modernised, but also said that that was in draft. I should be grateful for clarification of whether the United Kingdom has signed and is therefore bound by a modernised convention that is currently draft.
I am happy to clarify that. Convention 108 is in the process of being modernised by international partners. I have made it clear, last week and this week, that the version in question is modernised, and is a draft version; but it is the one to which we are committed, not least because the Bill reflects its provisions. Convention 108 is an international agreement and sets the international standards, which is precisely why we are incorporating those standards into the Bill.
I know that the Leader of Her Majesty’s Opposition appears to be stepping away from the international community, over the most recent matters to do with Russia, but the Bill and convention—[Interruption.] Well, he is. However, convention 108 is about stepping alongside our international partners, agreeing international standards and putting the thresholds into legislation. The right hon. Gentleman keeps talking about the need for legislation fit for the world we live in today; that is precisely what convention 108 is about.
Order. The right hon. Member for Birmingham, Hodge Hill indicates that this is an intervention. I thought he had sat down and wanted the Minister to respond. However, if it is an intervention, it is far too long.
I am grateful. Some of us in this House have been making the argument about the risk from Russia for months, and the permissive environment that has allowed the threats to multiply is, I am afraid, the product of much of the inattention of the past seven years.
On the specific point about convention 108, I am glad that the Minister has been able to clarify the fact that it is not operational.
I will give way to the Minister in a moment. The convention was written in 1981. Many people in the Government have argued in the past that we should withdraw not only from the European Union but from the European convention on human rights and therefore also the Council of Europe.
I did not say it was Government policy. I said that there are people within the Administration, including the Secretary of State for Environment, Food and Rural Affairs, who have made the argument for a British Bill of Rights that would remove Britain from the European convention on human rights and, therefore, the Council of Europe. I very much hope that that ambiguity has been settled and that the policy of the current Government will remain that of the Conservative party from now until kingdom come; but the key point for the Committee is that convention 108 is in draft. The modernisation is in draft and is not yet signed. We have heard an express commitment from the Minister to the signing of the thing when it is finalised. We hope that she will remain in her position, to ensure that that will continue to be Government policy; but the modernised version that has been drafted is not yet a convention.
(6 years, 8 months ago)
Public Bill Committees
It is a pleasure to serve under your chairmanship, Mr Streeter. Clause 26 creates an exemption from certain provisions of the Bill only if that exemption is required for the purpose of safeguarding national security or for defence purposes. Where processing does not meet those tests, the exemption cannot apply. It is possible to exempt from most, but not all, of the data protection principles, the rights of data subjects, certain obligations on data controllers and processors, and various enforcement provisions, where required to safeguard national security or for defence purposes. In relation to national security, the exemption mirrors the existing national security exemption provided for in section 28 of the 1998 Act. The statutory framework has long recognised that proportionate exemptions from the data protection principles and the rights of data subjects are necessary to protect national security. The Bill does not alter that position.
The exemption for defence purposes is intended to ensure the continued protection, security and capability of our armed forces and of the civilian staff who support them—not just their combat effectiveness, to use the outdated language of the 1998 Act. In drafting this legislation, we concluded that this existing exemption was too narrow and no longer adequately captured the wide range of vital activities that are undertaken by the Ministry of Defence and its partners. We have seen that all too obviously in the last two weeks.
If the right hon. Gentleman is going to disagree with me that combat effectiveness would be a very narrow term to describe the events in Salisbury, of course I will give way.
I actually wanted to ask about interpreters who support our armed forces. There is cross-party consensus that sometimes it is important to ensure that we grant leave to remain in this country to those very brave civilians who have supported our armed forces abroad as interpreters. Sometimes, those claims have been contested by the Ministry of Defence. Is the Minister confident and satisfied that the Ministry of Defence would not be able to rely on this exemption to keep information back from civilian staff employed as interpreters in support of our armed forces abroad when they seek leave to remain in this country?
I cannot possibly be drawn on individual applications for asylum. It would be wholly improper for me to make a sweeping generalisation on cases that are taken on a case-by-case basis. I refer back to the narrow definition that was in the 1998 Act and suggest that our enlarging the narrow definition of combat effectiveness would mean including the civilian staff who support our brave troops.
The term “defence purposes” is intended to be limited in both application and scope, and will not encompass all processing activities conducted by the Ministry of Defence. Only where a specific right or obligation is found to be incompatible with a specific processing activity being undertaken for defence purposes can that right or obligation be set aside. The Ministry of Defence will continue to process personal information relating to both military and civilian personnel in a secure and appropriate way, employing relevant safeguards and security in accordance with the principles of the applied GDPR. It is anticipated that standard human resources processing functions such as the recording of leave and the management of pay and pension information will not be covered by the exemption.
I am sorry to press the Minister on this point, and she may want to write to me as a follow-up, but I think Members on both sides of the House have a genuine interest in ensuring that interpreters who have supported our troops abroad are able to access important information, such as the terms of their service and the record of their employment, when making legitimate applications for leave to remain in this country—not asylum—or sometimes discretionary leave.
I am very happy to write to the right hon. Gentleman about that. The exemption does not cover all processing of personal data by the Ministry of Defence, but I am happy to write to him on that subject.
It may assist the Committee if I give a few examples of processing activities that might be considered to fall into the definition of defence purposes requiring the protection of the exemption. Such processing could include the collation of personal data to assist in assessing the capability and effectiveness of armed forces personnel, including the performance of troops; the collection and storage of information, including biometric data necessary to maintain the security of defence sites, supplies and services; and the sharing of data with coalition partners to support them in maintaining their security capability and the effectiveness of their armed forces. That is not an exhaustive list. The application of the exemption should be considered only in specific cases where the fulfilment of a specific data protection right or obligation is found to put at risk the security capability or effectiveness of UK defence activities.
The hon. Member for Sheffield, Heeley asked for a definition of national security. It has been the policy of successive Governments not to define national security in statute. Threats to national security are constantly evolving and difficult to predict, and it is vital that legislation does not constrain the security and intelligence agencies’ ability to protect the UK from new and emerging threats. For example, only a few years ago it would have been very difficult to predict the nature or scale of the threat to our national security from cyber-attacks.
Clause 26 does not provide for a blanket exemption. It can be applied only when it is required to safeguard national security or for defence purposes.
(6 years, 8 months ago)
Public Bill Committees
I will give an example first, because I think it is so important. I fear that a bit of misunderstanding has crept in. Let us take the example of a subject access request. Mr Smith asks an intelligence service whether it is processing personal data concerning him and, if so, for information about that data under clause 94. The intelligence service considers whether it is processing personal data, which it will have obtained under its other statutory powers, such as the Regulation of Investigatory Powers Act 2000 or the Investigatory Powers Act 2016.
If the agency determines that it is processing personal data relating to Mr Smith, it then considers whether it is able to disclose the data, or whether a relevant exemption is engaged. For the agency, the key consideration will be whether disclosing the data would damage national security, for example by disclosing sensitive capabilities or alerting Mr Smith to the fact that he is a subject of investigation. If disclosure does not undermine national security and no other exemption is relevant, the intelligence service must disclose the information. However, if national security would be undermined by disclosure, the agency will need to use the national security exemption in relation to processing any personal data relating to Mr Smith.
If the intelligence service does not process any personal data relating to Mr Smith, it will again have to consider whether disclosing that fact would undermine national security, for example by revealing a lack of capability, which could be exploited by subjects of investigation. That is why, on occasion, when such requests are made, a “neither confirm nor deny” response may be necessary, because either confirming or denying may in itself have ramifications, not only in relation to Mr Smith but in relation to other aspects of national security.
Mr Smith may complain to the Information Commissioner about the response to his request for information. The intelligence service may then be required to demonstrate to the commissioner that the processing of personal data complies with the requirements of part 4 of the Bill, as set out in clause 102, and that it has responded to the request for information appropriately.
If, in legal proceedings, Mr Smith sought to argue that the national security exemption had been improperly relied upon, a national security certificate could be used as conclusive evidence that the national security exemption was required to safeguard national security. Any person who believed they were directly affected by the certificate could of course appeal against it to the upper tribunal, as set out in clause 111.
The Minister is setting out the mechanics of the system with admirable clarity. The point in dispute, though, is not the mechanics of the process but whether the data controller is able—unilaterally, unchecked and unfettered—to seek a national security exemption. Anyone who has worked with the intelligence agencies, either as a Minister or not, knows that they take parliamentary oversight and the defence of parliamentary supremacy extremely seriously.
What we are seeking with this amendment is to ensure that a data controller does not issue a national security certificate unchecked, and that instead there is an element of judicial oversight. The rule of law is important. It should be defended, protected and enhanced, especially when the data collection powers of the intelligence services are so much greater than they were 30 years ago when data protection legislation was first written.
The Government fully accept that national security certificates should be capable of being subject to judicial oversight. Indeed, the current scheme—both under the 1998 Act and this Bill—provides for just that. However, the amendments would radically change the national security certificate regime, because they would replace the existing scheme with one that required a Minister of the Crown to apply to a judicial commissioner for a certificate if an exemption was sought for the purposes of safeguarding national security, and for a decision to issue a certificate to be approved by a judicial commissioner.
This, again, is the debate that we had when we were considering the Investigatory Powers Act 2016. There were some who would have preferred a judicial commissioner to make the decision about warrantry before the Secretary of State. However, Parliament decided that it was not comfortable with that, because it would have meant a great change. For a member of the judiciary to certify on national security issues, rather than a member of the Executive—namely the Prime Minister or a Secretary of State—would have great constitutional implications.
There were great debates about the issue and the House decided, in its wisdom, that it would maintain the constitutional tradition, which is that a member of the Executive has the ultimate responsibility for national security, with, of course, judicial oversight by judicial commissioners and by the various tribunals that all these powers are subject to. The House decided that the decision itself must be a matter for a Minister of the Crown, because in the event—God forbid—that there is a national security incident, the House will rightly and properly demand answers from the Government of the day. With the greatest respect, a judicial commissioner cannot come to the Dispatch Box to explain how the Government and those assisting them in national security matters have responded to that situation. That is why we have this fine constitutional balance, and why we have adopted in the Bill the regime that has been in place for 30 years.
No, because those who have drafted the Bill have sought, at all times, to comply with the law enforcement directive and with the modernised, draft Council of Europe convention 108. The Bill very much meets those standards, not just on law enforcement but across parts 3 and 4.
I have spoken to the outgoing Council of Europe information commissioner about the issue, and he has put on the record his grave reservations about the regime that we have in place, because we simply do not have the right kind of judicial oversight of the information gathering powers that are now available to our intelligence services. Our intelligence services are very good, and they need to be allowed to do their job, but they will be allowed to do that job more effectively—and without additional risks to our adequacy—if there is some kind of judicial oversight in the right timeframe of the decisions that are taken.
That is where the distinction between obtaining information and processing it is so important. The gathering that the right hon. Gentleman refers to falls under the Investigatory Powers Act 2016. Retaining it and processing it in the ways that the Bill seeks to provide for is the data protection element. The 2016 Act has all the extra judicial oversights that have been passed by the House.
Quite helpfully, we are coming to the nub of the question. It is now incumbent on the Minister to lay out for the Committee why the oversight regime for obtaining information should be so remarkably different from the regime for processing it.
The obtaining of information is potentially intrusive and often extremely time-sensitive. For the processing of information, particularly in the case of a subject access request, once we have met the criteria for obtaining it, separate judicial oversight through the upper tribunal is set out in the Bill, as well as ministerial oversight. They are two separate regimes.
There is extra oversight in the 2016 Act because obtaining information can be so intrusive. The right hon. Gentleman will appreciate that I cannot go into the methodology—I am not sure I am security-cleared enough to know, to be honest—but obtaining information has the potential to be particularly intrusive, in a way that processing information gathered by security service officials may not be.
I reassure the Minister that I went through the methodologies during my time at the Home Office. The justification that she still needs to lay out for the Committee—she is perhaps struggling to do so—is why there should be one set of judicial oversight arrangements for obtaining information and another for processing it. Why are they not the same?
There might be many reasons why we process information. The end result of processing might be for national security reasons or law enforcement reasons—my officials are scribbling away furiously, so I do not want to take away their glory when they provide me with the answer.
I have an answer on the Watson case, raised by the hon. Member for Sheffield, Heeley, which dealt with the retention of communications by communications service providers. Again, that is an entirely different scenario from the one we are talking about, where the material is held by the security services.
Amendment 161 goes further than the 2016 Act, because it places the decision to issue a certificate with the judicial commissioner. As I have said, national security certificates come into play only to serve in legal proceedings as conclusive evidence that an exemption from specified data protection requirements is necessary to protect national security—for example, to prevent disclosure of personal data to an individual under investigation, when such disclosure would damage national security. The certificate does not authorise the use of the national security exemption, which is properly a matter for the data controller to determine.
Amendments 163 and 164 relate to the form of a national security certificate. Amendment 163 would require a detailed rather than general description of the data identified on a national security certificate, but we believe this change to be unnecessary and unhelpful, given that much data can be adequately described in a general way. Amendment 164, which would prevent a certificate from having prospective effect, appears to be dependent on the prior judicial authorisation scheme proposed in amendments 161 and 162, and again contrasts with the prospective nature of certificates currently under the Data Protection Act 1998.
Prospective certificates of the type issued under the 1998 Act are the best way of ensuring that the use of the national security exemption by the intelligence services and others is both sufficiently foreseeable for the purposes of article 8 of the European convention on human rights, and accountable. The accountability is ensured by the power to challenge certificates when they are issued, and that is something that has real teeth. The accountability is strengthened by the provision in clause 130 for the publication of certificates. The documents we are discussing will therefore be in the public domain—indeed, many of them are already. But it will now be set out in statute that they should be in the public domain.
Amendments 166 to 168 relate to the appeals process. Amendment 166 would broaden the scope for appealing a national security certificate from a person “directly affected” by it to someone who
“believes they are directly or indirectly affected”
by it. I wonder whether the Opposition did any work on the scope of the provision when drafting it, because the words “indirectly affected” have the potential to cause an extraordinary number of claims. How on earth could that phrase be defined in a way that does not swamp the security services with applications from people who consider that they might be indirectly affected by a decision relating to a national security matter? I do not see how that can be considered practicable.
On the judicial review point, the test was debated at length in the Joint Committee, in the Public Bill Committee and on the Floor of the House. The House passed that Act with cross-party consensus, as my hon. Friend has said, so I do not understand why we are having the same debate.
Anyone who has spent time working with our intelligence agencies knows that they see their mission as the defence of parliamentary democracy. They believe in scrutiny and oversight, which is what we are trying to insert in the Bill. The reason the Investigatory Powers Bill was passed in that way was that we were successful in ensuring that there were stronger safeguards. The Minister has been unable to explain today why the safeguarding regime should be different for the processing of data as opposed to the obtaining of data. We have heard no convincing arguments on that front today. All that we are seeking to do is protect the ability of the intelligence agencies to do their job by ensuring that a guard against the misuse of their much broader powers is subject to effective judicial oversight, and not in public but in a court.
For the security services to have obtained data under the Investigatory Powers Act, they will have passed through the various safeguards that Parliament set out in that Act. Once that data is obtained, it follows that the permission that the judicial commissioner will have reviewed will still flow through to the processing of that information. Our concern here is with certain requirements of the data protection regime. The decision to disseminate information under that regime must rest with the intelligence agencies, with oversight. The Bill provides for those decisions to be appealed. That is as it should be. It should not be for a judicial commissioner to take over the decision of the data controller, who is processing applications and information in real time, often in situations that require them to act quickly. Likewise, whether to grant a certificate, which will be in the public domain, must be a decision for a member of the Executive, not the judiciary.
I assume that no work has been done to measure the scope of amendment 166, but allowing the clause to cover people indirectly affected could have enormous consequences for the security services, which already face great pressures and responsibilities.
Amendments 167 and 168 would remove the application of judicial review principles by the upper tribunal when considering an appeal against a certificate. They would replace the “reasonable grounds for issuing” test with a requirement to consider whether issuing a certificate was necessary and proportionate. Again, that would be an unnecessary departure from the existing scheme, which applies the judicial review test and has worked very well for the past 30 years.
In applying judicial review principles, the upper tribunal can consider a range of issues, including necessity, proportionality and lawfulness. As we set out in our response to the report of the House of Lords Constitution Committee, that enables the upper tribunal to consider matters such as whether the decision to issue the certificate was reasonable, having regard to the impact on the rights of the data subject and the need to safeguard national security. The Bill makes it clear that the upper tribunal has the power to quash the certificate if it concludes that the decision to issue it was unreasonable.
I hope that I have answered the concerns of the right hon. Member for Birmingham, Hodge Hill about how certificates are granted and about the review process when a subject access request is made and the certificate is applied. We must recognise that the Bill does not weaken a data subject’s rights or the requirements that must be met if an exemption is to be relied on; it reflects the past 30 years of law. Perhaps I missed it, but I do not think that any hon. Member has argued that the Data Protection Act 1998 has significant failings.
As the Minister well knows, the debate internationally is a result of the radical transformation of intelligence agencies’ ability to collect and process data. There is an argument, which has been well recognised in the Council of Europe and elsewhere, that where powers are greater, oversight should be stronger.
Yes, and that is precisely why Parliament passed the Investigatory Powers Act 2016.
The safeguards that apply once the information has been obtained—
Very briefly, subsection (1) includes the phrase
“must be lawful and fair”.
Could the Minister say a little more about the word “fair”? What definition is she resting on, and who is the judge of it?
“Lawful” means any processing necessary to carry out a particular task, where that task is authorised either by statute or under common law. It would cover, for example, the taking and retention of DNA and fingerprints under the Police and Criminal Evidence Act 1984, or the police’s common law powers to disclose information required for the operation of the domestic violence disclosure scheme.
The Government recognise the importance of safeguarding sensitive personal information about individuals. Subsections (3) to (5) therefore restrict the processing of sensitive data, the definition of which includes information about an individual’s race or ethnic origin, and biometric data such as their DNA profile and fingerprints.
Further safeguards for the protection of sensitive personal data are set out in clause 42. The processing of sensitive personal data is permitted in two circumstances. The first is where the data subject has given his or her consent. The second is where the processing is strictly necessary for a law enforcement purpose and one or more of the conditions in schedule 8 to the Bill has been met. Those conditions include, for example, that the processing is necessary to protect the individual concerned or another person, or is necessary for the administration of justice. In both cases, the controller is required to have an appropriate policy document in place. We will come on to the content of such policy documents when we debate clause 42.
I am grateful for the Minister’s extensive definition, given in response to a question I did not ask. I did not ask for the definition of “lawful” but for the definition of “fair”.
I am so sorry; I thought it was apparent from my answer. “Fair” is initially a matter for the data controller, but ultimately the Information Commissioner has oversight of these provisions and the commissioner will cover that in her guidance.
Question put and agreed to.
Clause 35 accordingly ordered to stand part of the Bill.
Schedule 8
Conditions for sensitive processing under Part 3
Amendment made: 116, in schedule 8, page 184, line 32, at end insert—
“Safeguarding of children and of individuals at risk
3A (1) This condition is met if—
(a) the processing is necessary for the purposes of—
(i) protecting an individual from neglect or physical, mental or emotional harm, or
(ii) protecting the physical, mental or emotional well-being of an individual,
(b) the individual is—
(i) aged under 18, or
(ii) aged 18 or over and at risk,
(c) the processing is carried out without the consent of the data subject for one of the reasons listed in sub-paragraph (2), and
(d) the processing is necessary for reasons of substantial public interest.
(2) The reasons mentioned in sub-paragraph (1)(c) are—
(a) in the circumstances, consent to the processing cannot be given by the data subject;
(b) in the circumstances, the controller cannot reasonably be expected to obtain the consent of the data subject to the processing;
(c) the processing must be carried out without the consent of the data subject because obtaining the consent of the data subject would prejudice the provision of the protection mentioned in sub-paragraph (1)(a).
(3) For the purposes of this paragraph, an individual aged 18 or over is “at risk” if the controller has reasonable cause to suspect that the individual—
(a) has needs for care and support,
(b) is experiencing, or at risk of, neglect or physical, mental or emotional harm, and
(c) as a result of those needs is unable to protect himself or herself against the neglect or harm or the risk of it.
(4) In sub-paragraph (1)(a), the reference to the protection of an individual or of the well-being of an individual includes both protection relating to a particular individual and protection relating to a type of individual.”—(Victoria Atkins.)
Schedule 8 makes provision about the circumstances in which the processing of special categories of personal data is permitted. This amendment adds to that Schedule certain processing of personal data which is necessary for the protection of children or of adults at risk. See also Amendments 85 and 117.
Schedule 8, as amended, agreed to.
Clauses 36 to 40 ordered to stand part of the Bill.
Clause 41
Safeguards: archiving
Amendment made: 20, in clause 41, page 23, line 34, leave out “an individual” and insert “a data subject”.—(Victoria Atkins.)
Clause 41 makes provision about the processing of personal data for archiving purposes, for scientific or historical research purposes or for statistical purposes. This amendment aligns Clause 41(2)(b) with similar provision in Clause 19(2).
Question proposed, That the clause, as amended, stand part of the Bill.
We had a good debate on what I think was a shared objective across the Committee: to ensure that those running our big national archives—whether they are large or small organisations—should not be jeopardised by frivolous claims or, indeed, a multiplicity of claims from individuals who might seek to change the records held there in one way or another. I mentioned to the Minister in an earlier debate that we were anxious, despite the reassurances she sought to give the Committee, that a number of organisations, including the BBC, were deeply concerned about the Bill’s impact on their work. They were not satisfied that the exemptions and safeguards in the Bill would quite do the job.
My only reason for speaking at this stage is to suggest to Ministers that if they were to have discussions with some of those organisations about possible Government amendments on Report to refine the language, and provide some of the reassurance people want, that would attract our support. We would want to have such conversations, but it would be better if the Government could find a way to come forward with refinements of their own on Report.
I am happy to explore that. The reason for the clause is to enable processing to be done to create an archive for scientific or historical research, or for statistical purposes. The reason law enforcement is mentioned is that it may be necessary where a law enforcement agency needs to review historic offences, such as allegations of child sexual exploitation. I would of course be happy to discuss that with the right hon. Gentleman to see whether there are further avenues down which we should proceed.
I am grateful to the Minister for that response. I am happy to write to her with the representations that we have received, and perhaps she could reflect on those and write back.
Question put and agreed to.
Clause 41, as amended, accordingly ordered to stand part of the Bill.
Clause 42
Safeguards: sensitive processing
Amendment made: 21, in clause 42, page 24, line 29, leave out “with the day” and insert “when”.—(Victoria Atkins.)
This amendment is consequential on Amendment 71.
Clause 42, as amended, ordered to stand part of the Bill.
Clauses 43 to 46 ordered to stand part of the Bill.
Clause 47
Right to erasure or restriction of processing
I beg to move amendment 22, in clause 47, page 28, line 20, leave out second “data”.
This amendment changes a reference to a “data controller” into a reference to a “controller” (as defined in Clauses 3 and 32).
I can be brief, because this drafting amendment simply ensures that clause 47, as with the rest of the Bill, refers to a “controller” rather than a “data controller”. For the purposes of part 3, a controller is defined in clause 32(1) so it is not necessary to refer elsewhere to a “data controller”.
Amendment 22 agreed to.
Clause 47, as amended, ordered to stand part of the Bill.
Clause 48 ordered to stand part of the Bill.
Clause 49
Right not to be subject to automated decision-making
Question proposed, That the clause stand part of the Bill.
We had a good debate earlier on possible amendments to the powers on automated decision making, and this is an important clause in that it creates a right not to be subject to automated decision making. Clause 49(1) states:
“A controller may not take a significant decision based solely on automated processing unless that decision is required or authorised by law.”
I hope Ministers recognise that
“required or authorised by law”
is an incredibly broad formulation. I would like to provoke the Minister into saying a little more about what safeguards she believes will be put in place to ensure that decisions are not taken that jeopardise somebody’s human rights and their right to appeal and to justice based on those human rights. It could be that the Minister decides to answer those questions in the debate on clause 50, but it would be useful for her to say a little more about her understanding of the phrase “significant decision”, and about what kind of safeguards will be needed to ensure that decisions cast in such a broad way do not have a negative impact on people.
Clause 49 establishes the right for individuals not to be subject to a decision based exclusively on automated processing, where that decision has an adverse impact on the individual. It is important to protect that right to enhance confidence in law enforcement processing and safeguard individuals against the risk that a potentially damaging decision is taken without human intervention. The right hon. Gentleman asked about the definition of a significant decision. It is set out in the Bill.
We are not aware of any examples of the police solely using automated decision-making methods, but there may be examples in other competent authorities. The law enforcement directive includes that requirement, so we want to transpose it faithfully into statute, and we believe we have captured the spirit of the requirement.
I wonder whether that is captured in the spirit of the Bill. Forgive me, Mr Hanson. This is my first Bill Committee as a Minister and I was not aware of that. Many apologies.
I am not familiar with that example. It would be a very interesting exercise under the PACE custody arrangements. I will look into it in due course. These protections transpose the law enforcement directive, and we are confident that they meet those requirements.
Question put and agreed to.
Clause 49 accordingly ordered to stand part of the Bill.
Clause 50
Automated decision-making authorised by law: safeguards
Amendments made: 23, in clause 50, page 30, line 11, leave out “21 days” and insert “1 month”.
Clause 50(2)(b) provides that where a controller notifies a data subject under Clause 50(2)(a) that the controller has taken a “qualifying significant decision” in relation to the data subject based solely on automated processing, the data subject has 21 days to request the controller to reconsider or take a new decision not based solely on automated processing. This amendment extends that period to one month.
Amendment 24, in clause 50, page 30, line 17, leave out “21 days” and insert “1 month”.—(Victoria Atkins.)
Clause 50(3) provides that where a data subject makes a request to a controller under Clause 50(2)(b) to reconsider or retake a decision based solely on automated processing, the controller has 21 days to respond. This amendment extends that period to one month.
Question proposed, That the clause, as amended, stand part of the Bill.
I remain concerned that the safeguards the Government have proposed to ensure people’s human rights are not jeopardised by the use of automated decision making are, frankly, not worth the paper they are written on. We know that prospective employers and their agents use algorithms and automated systems to analyse very large sets of data and, through the use of artificial intelligence and machine learning, make inferences about whether people are suitable to be hired or retained by a particular company. We have had a pretty lively debate in this country about the definition of a worker, and we are all very grateful to Matthew Taylor for his work on that question. Some differences emerged, and the Business, Energy and Industrial Strategy Committee has put its views on the record.
The challenge is that our current labour laws, which were often drafted decades ago, such as the Sex Discrimination Act 1975 and the Race Relations Act 1965, are no longer adequate to protect people in this new world, in which employers are able to use such large and powerful tools for gathering and analysing data, and making decisions.
We know that there are problems. We already know that recruiters use Facebook to seek candidates in a way that routinely discriminates against older workers by targeting job advertisements. That is not a trivial issue; it is being litigated in the United States. In the United Kingdom, research by Slater and Gordon, a group of employment lawyers, found that one in five bosses admits to unlawful discrimination when advertising jobs online. Women and people over 50 are most likely to be stopped from seeing an advert. Around 32% of company executives admitted to discriminating against those over 50; 23% discriminated against women; and 62% of executives who had access to profiling tools admitted to using them to actively seek out people based on criteria such as age, gender and race. Female Uber drivers earn 7% less than men when pay is determined by algorithms. A number of practices in the labour market are disturbing and worrying, and they should trouble all of us.
The challenge is that clause 50 needs to include a much more comprehensive set of rights and safeguards. It should clarify that the Equality Act 2010 and protection from discrimination apply to all new forms of decision making that engage core labour rights around recruitment, terms of work or dismissal. There should be new rights to algorithmic fairness at work to ensure equal treatment where an algorithm or automated system takes a decision that impinges on someone’s rights. There should be a right to explanation where significant decisions are taken based on an algorithm or an automated decision. There is also a strong case for creating a duty on large employers to undertake impact assessments to check whether they are, often unwittingly, discriminating against people in a way that we think is wrong.
Over the last couple of weeks, we have seen real progress in the debate about gender inequalities in pay. Many of us will have looked in horror at some of the news that emerged from the BBC and at some of the evidence that emerged from ITV and The Guardian. We have to contend with the reality that automated decision-making processes are under way in the labour market that could make inequality worse rather than better. The safeguards that we have in clause 50 do not seem up to the job.
I hope the Minister will say a bit more about the problems that she sees with future algorithmic decision making. I am slightly troubled that she is unaware of some live examples in the Home Office space in one of our most successful police forces, and there are other examples that we know about. Perhaps the Minister might say more about how she intends to improve the Bill with regard to that issue between now and Report.
I will pick up on the comments by the right hon. Gentleman, if I may.
In the Durham example given by the hon. Member for Sheffield, Heeley, I do not understand how a custody sergeant could sign a custody record without there being any human interaction in that decision-making process. A custody sergeant has to sign a custody record and to review the health of the detainee and whether they have had their PACE rights. I did not go into any details about it, because I was surprised that such a situation could emerge. I do not see how a custody sergeant could be discharging their duties under the Police and Criminal Evidence Act 1984 if their decision as to custody was based solely on algorithms, because a custody record has to be entered.
This has been a moment of genuine misunderstanding. Given how the hon. Lady presented that, to me it sounded as if she was saying that the custody record and the custody arrangements of a suspect—detaining people against their will in a police cell—were being handled completely by a computer. That was how it sounded. There was obviously an area of genuine misunderstanding, so I am grateful that she clarified it. She intervened on me when I said that we were not aware of any examples of the police solely using automated decision making—that is when she intervened, but that is not what she has described. A human being, a custody sergeant, still has to sign the record and review the risk assessment to which the hon. Lady referred. The police are using many such systems nowadays, but the fact is that a human being is still involved in the decision-making process, even in the issuing of penalties for speeding. Speeding penalties may be automated processes, but there is a meaningful element of human review and decision making, just as there is with the custody record example she gave.
There was a genuine misunderstanding there, but I am relieved, frankly, given that the right hon. Member for Birmingham, Hodge Hill was making points about my being unaware of what is going on in the Home Office. I am entirely aware of that, but I misunderstood what the hon. Lady meant and I thought she was presenting the custody record as something that is produced by a machine with no human interaction.
Line-by-line scrutiny, but I was acting in good faith on an intervention that the hon. Member for Sheffield, Heeley made when I was talking about any examples of the police solely using automated decision making.
I have lost track of which point the right hon. Gentleman wants me to give way on.
Let me remind the Minister. What we are concerned about on the question of law enforcement is whether safeguards that are in place will be removed under the Bill. That is part and parcel of a broader debate that we are having about whether the safeguards that are in the Bill will be adequate. So let me return to the point I made earlier to the Minister, which is that we would like her reflections on what additional safeguards can be drafted into clauses 50 and 51 before Report stage.
Clause 49 is clear that individuals should not be subject to a decision based solely on automated processing if that decision has a significant adverse impact on them, legally or otherwise, unless required by law. If that decision is required by law, clause 50 specifies the safeguards that controllers should apply to ensure that the impact on the individual is minimised. Critically, that includes informing the data subject that a decision has been taken and giving that individual one month in which to ask the controller to reconsider the decision, or to retake the decision with human intervention.
A point was made about the difference between automated processing and automated decision making. Automated processing is when an operation is carried out on personal data using predetermined fixed parameters that allow for no discretion by the system and do not involve further human intervention in the operation to produce a result or output. Such processing is used regularly in law enforcement to filter large datasets down to manageable amounts for a human operator to use. Automated decision making is a form of automated processing that allows the system to use discretion, potentially based on algorithms, and produces the final decision without human intervention. The Bill seeks to clarify that distinction, and the safeguards are set out in clause 50.
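[The distinction the Minister draws here can be illustrated with a short sketch. All names and data below are hypothetical and purely illustrative—they are not drawn from any real police system. Automated processing filters records by fixed, predetermined parameters for a human to review; automated decision-making lets the system itself produce the final outcome with no human in the loop.]

```python
# Illustrative sketch only: hypothetical records and rules, not a real system.

records = [
    {"name": "A", "flagged": True},
    {"name": "B", "flagged": False},
    {"name": "C", "flagged": True},
]

def automated_processing(records):
    """Fixed-parameter filtering: narrows a dataset down for a human
    operator to review; the system exercises no discretion."""
    return [r for r in records if r["flagged"]]

def automated_decision(record, risk_model):
    """Automated decision-making: the system itself produces the final
    outcome, with no human intervention in that decision."""
    return "refer" if risk_model(record) > 0.5 else "no action"

# In the first case a human reviews the filtered shortlist;
# in the second, the model's output *is* the decision.
shortlist = automated_processing(records)
```

[Under clause 50 as described above, only the second kind of operation—where the output is itself the decision—triggers the notification and reconsideration safeguards.]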
Question put and agreed to.
Clause 50, as amended, accordingly ordered to stand part of the Bill.
Clause 51
Exercise of rights through the Commissioner
These technical amendments are required to ensure that the provisions in clause 51 do not inadvertently undermine criminal investigations by the police or other competent authorities. Under the Bill, where a person makes a subject access request, it may be necessary for the police or other competent authority to give a “neither confirm nor deny” response, for example in order to avoid tipping someone off that they are under investigation for a criminal offence. In such a case, the data subject may exercise their rights under clause 51 to ask the Information Commissioner to check that the processing of their personal data complies with the provisions in part 3. It would clearly undermine a “neither confirm nor deny” response to a subject access request if a data subject could use the provisions in part 3 to secure confirmation that the police were indeed processing their information.
It is appropriate that the clause focuses on the restriction of a data subject’s rights, not on the underlying processing. The amendments therefore change the nature of the request that a data subject may make to the commissioner in cases where rights to information are restricted under clause 44(4) or clause 45(4). The effect of the amendments is that a data subject will be able to ask the commissioner to check that the restriction was lawful. The commissioner will then be able to respond to the data subject in a way that does not undermine the original “neither confirm nor deny” response.
This is a significant amendment—I understand the ambition behind the clause—so it is worth dwelling on it for a moment. I would like to check my understanding of what the Minister said. In a sense, if an investigation is under way and the individual under investigation makes a subject access request to the police and gets a “neither confirm nor deny” response, the data subject will be able to ask the Information Commissioner to investigate. Will the Minister say a little more about what message will go from the police to the Information Commissioner and the content of the message that will go from the Information Commissioner to the data subject? I have worked on such cases in my constituency. Often, there is an extraordinary spiral of inquiries and the case ultimately ends up in a judicial review in court. Will the Minister confirm that I have understood the mechanics accurately and say a little more about the content of the messages from the police to the Information Commissioner and from the Information Commissioner to the person who files the request?
I can help the right hon. Gentleman in one respect: he has understood the mechanics. I am afraid that I cannot give him examples, because it will depend on the type of criminal offence or the type of investigation that may be under way. I cannot possibly give him examples of the information that may be sent by the police to the Information Commissioner, because that will depend entirely on the case that the police are investigating.
Perhaps I can pose the question in a sharper way. I do not think that is entirely the case. It must be possible for the Minister to be a little more specific, and perhaps a little more knowledgeable, about the content of the message that will go from the Information Commissioner to the data subject. Will that be a standard message? Will it be in any way detailed? Will it reflect in any way on the information that the police provide? Or will it simply be a blank message such as “I, the Information Commissioner, am satisfied that your information has been processed lawfully”? I do not think the Information Commissioner is likely to ask for too much detail about the nature of the offence, but she will obviously ask whether data has been processed lawfully. She will want to make checks in that way. Unless the Information Commissioner is able to provide some kind of satisfactory response to the person who has made the original request, we will end up with an awful administrative muddle that will take up a lot of the courts’ time. Perhaps the Minister could put our minds at rest on that.
The Information Commissioner will get the information but, by definition, she does not give that information to the subject, because law enforcement will have decided that it meets the criteria for giving a “neither confirm nor deny” response from their perspective. The commissioner then looks at the lawfulness of that; if she considers it to be lawful, she will give the same response—that the processing meets part 3 obligations.
Amendment 25 agreed to.
Amendment made: 26, in clause 51, page 31, line 11, leave out from first “the” to end of line 12 and insert “restriction imposed by the controller was lawful;” —(Victoria Atkins.)
This amendment is consequential on Amendment 25.
Clause 51, as amended, ordered to stand part of the Bill.
Clause 52 ordered to stand part of the Bill.
Clause 53
Manifestly unfounded or excessive requests by the data subject
Amendments made: 27, in clause 53, page 31, line 39, leave out “or 47” and insert “, 47 or 50”.
Clause 53(1) provides that where a request from a data subject under Clause 45, 46 or 47 is manifestly unfounded or excessive, the controller may charge a reasonable fee for dealing with the request or refuse to act on the request. This amendment applies Clause 53(1) to requests under Clause 50 (automated decision making). See also Amendment 28.
Amendment 28, in clause 53, page 32, line 4, leave out “or 47” and insert “, 47 or 50”.—(Victoria Atkins.)
Clause 53(3) provides that where there is an issue as to whether a request under Clause 45, 46 or 47 is manifestly unfounded or excessive, it is for the controller to show that it is. This amendment applies Clause 53(3) to requests under Clause 50 (automated decision making). See also Amendment 27.
Question proposed, That the clause, as amended, stand part of the Bill.
We have just agreed a set of amendments that, on the face of it, look nice and reasonable. We can all recognise the sin that the Government are taking aim at, and that the workload of the Information Commissioner’s Office and of others has to be kept under control, so we all want to deter a flood of frivolous and meaningless requests. None the less, a lot of us have noticed that, for example, the introduction of fees for industrial tribunals made it a lot harder for our constituents to secure justice.
I wonder, having now moved the amendment successfully, whether the Minister might tell us a little more about what will constitute a reasonable fee and what will happen to those fees. Does she see any relationship between the fees being delivered to Her Majesty’s Government and the budget that is made available for the Information Commissioner? Many of us are frankly worried, given the new obligations of the Information Commissioner, about the budget she has to operate with and the resources at her disposal. Could she say a little more, to put our minds at rest, and reassure us that these fees will not be extortionate? Where sensible fees are levied, is there some kind of relationship with the budget that the Information Commissioner might enjoy?
Clause 35 establishes the principle that subject access requests should be provided free of charge in most cases; that will be the default position. As for the fees, that will not be a matter to place in statute; certainly, I can write to the right hon. Gentleman with my thoughts on how that may develop. The intention is that in the majority of cases, there will be no charge.
Question put and agreed to.
Clause 53, as amended, accordingly ordered to stand part of the Bill.
Clause 54
Meaning of “applicable time period”
Amendments made: 29, in clause 54, page 32, line 14, leave out “day” and insert “time”.
This amendment is consequential on Amendment 71.
Amendment 30, in clause 54, page 32, line 15, leave out “day” and insert “time”.—(Victoria Atkins.)
This amendment is consequential on Amendment 71.
Clause 54, as amended, ordered to stand part of the Bill.
Clauses 55 to 63 ordered to stand part of the Bill.
Clause 64
Data protection impact assessment
I am extremely grateful to the hon. Lady for clarifying her role. My answer is exactly as I said before. High risk includes processing where there is a particular likelihood of prejudice to the rights and freedoms of data subjects. That must be a matter for the data controller to assess. We cannot assess it here in Committee for the very good reason put forward by members of the Committee: we cannot foresee every eventuality. Time will move on, as will technology. That is why the Bill is worded as it is, to try to future-proof it but also, importantly, because the wording complies with our obligations under the law enforcement directive and under the modernised draft Council of Europe convention 108.
Does the Minister not have some sympathy with the poor individuals who end up being data controllers for our police forces around the country, given the extraordinary task that they have to do? She is asking those individuals to come up with their own frameworks of internal guidance for what is high, medium and low risk. The bureaucracy-manufacturing potential of the process she is proposing will be difficult for police forces. We are trying to help the police to do their job, and she is not making it much easier.
The Committee is looking for some guidance and reassurance from the Minister about how the clause will bite on data processors that do not happen to base their operations here in the United Kingdom. This morning we debated the several hundred well-known data breaches around the world and highlighted some of the more recent examples, such as Yahoo!—that was probably the biggest—and AOL. More recently, organisations such as Uber have operated their systems with such inadequacy that huge data leaks have occurred, directly infringing the data protection rights of citizens in this country. The Minister will correct me if I am wrong, but I am unaware of any compensation arrangements that Uber has made with its drivers in this country whose data was leaked.
Even one of the companies closest to the Government—Equifax, which signed a joint venture agreement with the Government not too long ago—has had a huge data breach. It took at least two goes to get a full account from Equifax of exactly what had happened, despite the fact that Her Majesty’s Government were its corporate partner and had employed it through the Department for Work and Pensions. All sorts of information sharing happened that never really came to light. I am not sure whether any compensation for Equifax data breaches has been paid to British citizens either.
My point is that most citizens of this country have a large amount of data banked with companies that operate from America under the protection of the First Amendment. There is a growing risk that in the years to come, more of the data and information service providers based in the UK will go somewhere safer, such as Ireland, because they are worried about the future of our adequacy agreement with the European Commission. We really need to understand in detail how the Information Commissioner, who is based here, will take action on behalf of British citizens against companies in the event of data breaches. For example, how will she ensure notification within 72 hours? How will she ensure the enforcement of clause 67(4), which sets out the information that customers and citizens must be told about the problem?
This morning we debated the Government’s ludicrous proposals for class action regimes, which are hopelessly inadequate and will not work in practice. We will not have many strong players in the UK who are able to take action in the courts, so we will be wholly reliant on the Information Commissioner to take action. I would therefore be grateful if the Minister reassured the Committee how the commissioner will ensure that clause 67 is enforced if the processor of the data is not on our shores.
The right hon. Gentleman refers to companies not on these shores, about which we had a good deal of discussion this morning. Clause 67 belongs to part 3 of the Bill, which is entitled “Law enforcement processing”, so I am not sure that the companies that he gives as examples would necessarily be considered under it. I suppose a part 3 controller could have a processor overseas, but that would be governed by clause 59. Enforcement action would, of course, be taken by the controller under part 3, but I am not sure that the right hon. Gentleman’s examples are relevant to clause 67.
I am grateful to the Minister for that helpful clarification. Let me phrase the question differently, with different examples. The Home Office and many police forces are outsourcing many of their activities, some of which are bound to involve data collected by global organisations such as G4S. Is she reassuring us that any and all data collected and processed for law enforcement activities will be held within the boundaries of the United Kingdom and therefore subject to easy implementation of clause 67?
The controller will be a law enforcement agency, to which part 3 will apply. I note that clause 200 provides details of the Bill’s territorial application should a processor be located overseas, but under part 3 it will be law enforcement agencies that are involved.
Where G4S, for example, is employed to help with deportations, the Minister is therefore reassuring us that the data controller would never be G4S. However, if there were an activity that was clearly a law enforcement activity, such as voluntary removal, would the data controller always be in Britain and therefore subject to clause 67, even where private sector partners are used? The Minister may outsource the contract, but we want to ensure that she does not outsource the role of data controller so that a law enforcement activity here can have a data controller abroad.
I appreciate the sentiment behind the amendment. If the Home Office outsources processing to an overseas company, any enforcement action would be taken against the Home Office as the controller. The right hon. Gentleman has raised the example of G4S in the immigration context, so I will reflect on that overnight and write to him to ensure that the answer I have provided also covers that situation.
Question put and agreed to.
Clause 67 accordingly ordered to stand part of the Bill.
Clauses 68 to 71 ordered to stand part of the Bill.
Clause 72
Overview and interpretation
Question proposed, That the clause stand part of the Bill.
I want to flag up an issue that we will stumble across in a couple of stand part debates: the safeguards that will be necessary for data sharing between this country and elsewhere. We will come on to the safeguards that will be necessary for the transfer of data between our intelligence agencies and foreign intelligence agencies. Within the context of this clause, which touches on the broad principle of data sharing from here and abroad, I want to rehearse one or two arguments on which Ministers should be well briefed and alert.
Our intelligence agencies do an extraordinary job in keeping this country safe, which sometimes involves the acquisition and use of data that results in the loss of life. All Committee members will be familiar with the drone strike that killed Reyaad Khan and Ruhul Amin, and many of us will have heard the Prime Minister’s assurances in the Liaison Committee about the robust legal process that was gone through to ensure that the strike was both proportionate and legal.
The challenge—the public policy issue that arises under chapter 5 of the Bill—is that there are a number of new risks. First, there is the legal risk flagged up by the Court of Appeal in 2013, when justices said that it was not clear that UK personnel would be immune from criminal liability for their involvement in a programme that involves the transfer of intelligence from an intelligence service here to an American partner, where that partner uses the information to conduct drone strikes that involve the loss of life. Confidence levels differ, but we in the Committee are pretty confident about the legal safeguards around those kinds of operations in this country. We can be less sure about the safeguards that some of our partners around the world have in place. The Court of Appeal has expressed its view, which was reinforced in 2016 by the Joint Committee on Human Rights. The Committee echoed the finding that
“front-line personnel…should be entitled to more legal certainty”
than they have today.
This section of the Bill gives us the opportunity to ensure that our intelligence services are equipped with a much more robust framework than they have today, to ensure that they are not subject to the risks flagged by the Court of Appeal or by the Joint Committee on Human Rights.
(6 years, 8 months ago)
Public Bill CommitteesIt is a pleasure to serve under your chairmanship, Mr Hanson. I am pleased to introduce this group of amendments, which relate to data processing for safeguarding purposes. The amendments respond to an issue raised in an amendment tabled by Lord Stevenson on Report in the Lords in December. In response to that amendment, Lord Ashton made it clear that the Government are sympathetic to the points Lord Stevenson raised and undertook to consider the matter further. Amendments 85, 116 and 117 are the result of that consideration.
I am grateful to Lord Stevenson for raising this issue, and for his contribution to what is probably the most important new measure that we intend to introduce to the Data Protection Bill. The amendments will ensure that sensitive data can be processed without consent in certain circumstances for legitimate safeguarding activities that are in the substantial public interest. We have been working across government and with stakeholders in the voluntary and private sectors to ensure that the amendments are fit for purpose and cover the safeguarding activities expected of organisations responsible for children and vulnerable adults.
The Government recognise that statutory guidance and regulator expectations place moral, if not legal, obligations on certain organisations to ensure that measures are in place to safeguard children and vulnerable adults. Amendment 85 covers processing that is necessary for protecting children and vulnerable adults from neglect or physical or mental harm. This addresses the gap in relation to expectations on, for example, sports governing bodies.
The Government have produced cross-agency and cross-governmental guidance called “Working Together to Safeguard Children”, which rightly places the responsibility of safeguarding children on all relevant professionals who come into contact with children and families. For example, it creates an expectation that those volunteering at a local sports club will assess the needs of children and, importantly, will take action to protect them from abuse.
Amendment 85 permits the processing of sensitive personal data, which is necessary to safeguard children from physical, emotional, sexual and neglect-based abuse. Amendment 84 makes a consequential drafting change, while amendments 116 and 117 make an analogous change to the regimes in parts 3 and 4 of the Bill. This is aimed at putting beyond doubt a controller’s ability to safeguard children and people at risk.
I thought an example might help the Committee to understand why we place such an emphasis on the amendments. An example provided by a sports governing body is that a person may make an allegation or complaint about a volunteer that prompts an investigation. Such investigations can include witness statements, which reference sensitive personal data, including ethnicity, religious or philosophical beliefs, sexual orientation and health data.
In some instances, the incident may not reach a criminal standard. In those cases, the sports body may have no legal basis for keeping the data. Keeping a record allows sports bodies to monitor any escalation in conduct and to respond appropriately. Forcing an organisation to delete this data from its records could allow individuals that we would expect to be kept away from children to remain under the radar and potentially leave children at risk.
Amendment 86 deals with a related issue where processing health data is necessary to protect an individual’s economic wellbeing, where that individual has been identified as an individual at economic risk. UK banks have a number of regulatory obligations and expectations which are set out in the Financial Conduct Authority’s rules and guidance. In order to meet best practice standards in relation to safeguarding vulnerable customers, banks occasionally need to record health data without the consent of the data subject.
An example was given of a bank which was contacted by a family member who was alerting the bank to an elderly customer suffering from mental health problems who was drawing large sums of money each day from their bank account and giving it away to a young drug addict whom they had befriended. The bank blocked the account while the family sought power of attorney. Again, the amendment seeks to clarify the position and give legal certainty to banks and other organisations where that sort of scenario arises or where, for example, someone suffers from dementia and family members ask banks to take steps to protect that person’s financial wellbeing.
The unfortunate reality is that a great deal of uncertainty still exists under current law about what personal data can be processed for safeguarding purposes. My brief—crime, vulnerability and safeguarding—means that all too often, perhaps in the context of domestic abuse, agencies will gather, sadly, to conduct a domestic homicide review and discover that, had certain pieces of information been shared more freely, more action could have been taken by the various agencies and adults and children could have been safeguarded.
These amendments are aimed at tackling these issues. We want to stop the practice whereby some organisations have withheld information from the police and other law enforcement agencies for fear of breaching data protection law and other organisations have been unclear as to whether consent to processing personal data is required in circumstances where consent would not be reasonable or appropriate. The amendments intend to address the uncertainty by providing relevant organisations with a specific processing condition for processing sensitive personal data for safeguarding purposes. I beg to move.
I rise to put on record my thanks to the Minister for listening carefully to my noble Friend Lord Stevenson. There was strong cross-party consensus on these common-sense reforms.
We all know that in our own constituencies there are extraordinary people doing extraordinary things in local groups. They are the life-blood of our communities. Many of them will be worried about the new obligations that come with the general data protection regulation and many of them will take a least-risk approach to meeting the new regulations. Putting in place some common safeguards to ensure that it is possible to keep data that allow us to spot important patterns of behaviour that can lead to appropriate investigations is very sensible and wise. These amendments will therefore be made with cross-party support.
Amendment 84 agreed to.
Amendments made: 85, in schedule 1, page 126, line 38, at end insert—
“Safeguarding of children and of individuals at risk
14A (1) This condition is met if—
(a) the processing is necessary for the purposes of—
(i) protecting an individual from neglect or physical, mental or emotional harm, or
(ii) protecting the physical, mental or emotional well-being of an individual,
(b) the individual is—
(i) aged under 18, or
(ii) aged 18 or over and at risk,
(c) the processing is carried out without the consent of the data subject for one of the reasons listed in sub-paragraph (2), and
(d) the processing is necessary for reasons of substantial public interest.
(2) The reasons mentioned in sub-paragraph (1)(c) are—
(a) in the circumstances, consent to the processing cannot be given by the data subject;
(b) in the circumstances, the controller cannot reasonably be expected to obtain the consent of the data subject to the processing;
(c) the processing must be carried out without the consent of the data subject because obtaining the consent of the data subject would prejudice the provision of the protection mentioned in sub-paragraph (1)(a).
(3) For the purposes of this paragraph, an individual aged 18 or over is “at risk” if the controller has reasonable cause to suspect that the individual—
(a) has needs for care and support,
(b) is experiencing, or at risk of, neglect or physical, mental or emotional harm, and
(c) as a result of those needs is unable to protect himself or herself against the neglect or harm or the risk of it.
(4) In sub-paragraph (1)(a), the reference to the protection of an individual or of the well-being of an individual includes both protection relating to a particular individual and protection relating to a type of individual.”
Part 2 of Schedule 1 describes types of processing of special categories of personal data which meet the requirement in Article 9(2)(g) of the GDPR (processing necessary for reasons of substantial public interest) for a basis in UK law (see Clause 10(3)). This amendment adds to Part 2 of Schedule 1 certain processing of personal data which is necessary for the protection of children or of adults at risk. See also Amendments 116 and 117.
Amendment 86, in schedule 1, page 126, line 38, at end insert—
“Safeguarding of economic well-being of certain individuals
14B (1) This condition is met if the processing—
(a) is necessary for the purposes of protecting the economic well-being of an individual at economic risk who is aged 18 or over,
(b) is of data concerning health,
(c) is carried out without the consent of the data subject for one of the reasons listed in sub-paragraph (2), and
(d) is necessary for reasons of substantial public interest.
(2) The reasons mentioned in sub-paragraph (1)(c) are—
(a) in the circumstances, consent to the processing cannot be given by the data subject;
(b) in the circumstances, the controller cannot reasonably be expected to obtain the consent of the data subject to the processing;
(c) the processing must be carried out without the consent of the data subject because obtaining the consent of the data subject would prejudice the provision of the protection mentioned in sub-paragraph (1)(a).
(3) In this paragraph, “individual at economic risk” means an individual who is less able to protect his or her economic well-being by reason of physical or mental injury, illness or disability.”—(Victoria Atkins.)
Part 2 of Schedule 1 describes types of processing of special categories of personal data which meet the requirement in Article 9(2)(g) of the GDPR (processing necessary for reasons of substantial public interest) for a basis in UK law (see Clause 10(3)). This amendment adds to Part 2 of Schedule 1 certain processing of personal data which is necessary to protect the economic well-being of adults who are less able to protect their economic well-being by reason of a physical or mental injury, illness or disability.
The Under-Secretary of State will know better than anybody that there are very tight time limits on the windows within which people can ask for entry clearance officer reviews or reconsideration, either by an immigration official or, in extremis, by the Minister. How long will the pause last, and can she guarantee the Committee today that the pause will never jeopardise the kick-in of time limits on an appeal or a reconsideration decision?
The reason for the pause is—I will give case studies of this—to enable the immigration system to operate. If someone has gone missing, requests for data will be required to find that person. Once that person is found, and there is no longer a need to apply the exemption, it will be lifted.
That is not an answer to my question. I am asking for a guarantee to the Committee this afternoon that the pause will never jeopardise somebody’s ability to submit a valid request for a reconsideration or an appeal with the information that they need within the time windows set out by Home Office regulations—yes or no.
I am asked whether this will have an impact on someone’s application, either at appeal or reconsideration. Of course, information is obtained so that a person can be brought in. As I say, I will make it clear with case studies, so perhaps I can answer the right hon. Gentleman in more detail when I give such an example, but the purpose of this is generally to find a person. When the need, as set out under the exemption, no longer exists, the rights kick back in again. This relates only to the first two data protection principles under the GDPR. Again, I will go into more detail in a moment, but this is not the permanent exemption from rights as perhaps has been feared by some; it is simply to enable the process to work. Once a person has been brought into the immigration system, all the protections of the immigration system remain.
I will move on to the case studies in a moment, as I have given way several times. First, I will lay out the titles, then I will come on to article 23. Again, our analysis is that the provision fits within one of the exemptions in article 23. That is precisely the reason that we have drawn it in this way.
We very much welcome the enhanced rights and protections for data subjects afforded by the GDPR. The authors of the GDPR accepted that at times those rights need to be qualified in the general public interest, whether to protect national security, the prevention and detection of crime, the economic interests of the country or, in this case, the maintenance of an effective system of immigration control. Accordingly, a number of articles of the GDPR make express provision for such exemptions, including article 23(1)(e), which enables restrictions to be placed on certain rights of data subjects. Given the extension of data subjects’ rights under the GDPR, it is necessary to include in the Bill an explicit targeted but proportionate exemption in the immigration context.
The exemption would apply to the processing of personal data by the Home Office for the purposes of
“the maintenance of effective immigration control, or…the investigation or detection of activities that would undermine the maintenance of effective immigration control”.
It would also apply to other public authorities required or authorised to share information with the Department for either of those specific purposes.
Let me be clear on what paragraph 4 of schedule 2 does not do. It categorically does not set aside the whole of the GDPR for all processing of personal data for all immigration purposes. It makes it clear that the exemption applies only to certain GDPR articles. The articles that the exemption applies to are set out in paragraph 4(2) of schedule 2. They relate to various rights of data subjects provided for in chapter 3 of the GDPR, such as the rights to information and access to personal data, and to two of the data protection principles—namely the first one, which relates to fair and transparent processing, and the purpose limitation, which is the second one.
As I understand it, the derogations that are sought effectively remove the right to information in article 13; the right to information where data is obtained from a third party in article 14; the right of subject access in article 15; the right to erasure in article 17; the right to restriction of processing in article 18; the right to object in article 21(1); the principle of lawful, fair and transparent processing in article 5; the principle of purpose limitation in article 5(1)(b); and the data protection principles in article 5 of lawfulness, fairness, transparency, purpose limitation, data minimisation, accuracy, storage limitation, integrity, confidentiality and accountability to the extent that they correspond to the rights above. That is a pretty broad set of rights to be cast out.
Those are not the data protection principles. If one continues to read on to paragraph 4(2)(b) of schedule 2, it sets out the two data protection principles that I have just highlighted. The provisions set out in sub-paragraph (2)(a) relate to the data protection principles of fair and transparent processing and the purpose limitation. As I say, this is not a permanent removal. This is, as we describe it, a pause. There is not a free hand to invoke the permitted exception as a matter of routine.
All of the data protection principles, including those relating to data minimisation, accuracy, storage limitation and integrity and confidentiality, will continue to apply to everyone. So, too, will all the obligations on data controllers and processors, all the safeguards around cross-border transfers, and all the oversight and enforcement powers of the Information Commissioner. The latter is particularly relevant here, as it is open to any data subject affected by the provisions in paragraph 4 of schedule 2 to make a complaint to the Information Commissioner that the commissioner is then under a duty to investigate. Again, I hope that that addresses some of the concerns that the hon. Member for Argyll and Bute raised.
Contrary to the impression that has perhaps been given or understood, paragraph 4 does not give the Home Office a free hand to invoke the permitted exceptions as a matter of routine. The Bill is clear that the exceptions may be applied only to the extent that the application of the rights of data subjects, or the two relevant data protection principles, would be likely to prejudice
“the maintenance of effective immigration control, or…the investigation or detection of activities that would undermine the maintenance of effective immigration control”.
That is an important caveat.
The Minister will know that in paragraph 2(1)(a) we already have a set of exemptions that relate to the prevention or detection of a crime, including, presumably, all of the crimes that fall into the bucket of organising or perpetrating illegal immigration. Despite constant pressing during the debate in the other place and here, we have not yet had a clear answer as to why additional powers and exemptions are needed, over and above the powers expressly granted and agreed in paragraph 2(1)(a).
I am grateful to the right hon. Gentleman for raising that issue, because it allows me to get to the nub of how we approach the immigration system. We do not see the immigration system as some form of criminality or as only being open to the principles of criminal law. He will know that we deal with immigration in both the civil law and criminal law contexts. The exemption he has raised in terms of paragraph 2 of the schedule deals with the criminal law context, but we must also address those instances where the matter is perhaps for civil law.
We know that in the vast majority of immigration cases, people are dealt with through immigration tribunals or through civil law. They are not dealt with through criminal law. That is the point; we must please keep open the ability to deal with people through the civil law system, rather than rushing immediately to criminalise them. If, for example, they have overstayed, sometimes it is appropriate for the criminal law to become involved, but a great number of times it is for the civil law to be applied to deal with that person’s case either by way of civil penalty or by finding an arrangement whereby they can be given discretion to leave or the right to remain. We have the exemption in paragraph 4 so that we do not just focus on the criminal aspects that there may be in some immigration cases. We must ensure that we also focus on the much wider and much more widely used civil law context.
It is important to recognise that the exemptions will not and cannot be targeted at whole classes of vulnerable individuals, be they victims of domestic abuse or human trafficking, undocumented children or asylum seekers. The enhanced data rights afforded by the GDPR will benefit all those who are here lawfully in the United Kingdom, including EU citizens. The relevant rights will be restricted only on a case-by-case basis where there is evidence that the prejudice I have mentioned is likely to occur.
If someone has overstayed, they have committed a crime. Therefore, paragraph 2(1)(a) absolutely bites. We are seeking to prevent that crime. Someone who has overstayed their visa has committed a crime. It is kind of as simple as that.
In that scenario, we may well effect their removal administratively. It does not mean that it is going through the criminal courts.
By way of a second example, take a case where the Home Office is considering an application for an extension of leave to remain in the UK. It may be that we have evidence that the applicant has provided false information to support his or her claim. In such cases, we may need to contact third parties to substantiate the veracity of the information provided in support of the application. If we are then obliged to inform the claimant that we are taking such steps, they may abscond and evade detection.
If someone has submitted false information in support of an application to the Government, and signed it, as they must, that is called fraud. That is also a crime, and is covered by paragraph 2(1)(a).
I take the right hon. Gentleman’s point, particularly in relation to the overstayer, but as the purpose of processing personal data in many immigration areas is not generally the pursuit of criminal enforcement action, it is not clear that it would be appropriate in all cases to rely on crime-related exemptions, where the real prejudice lies in our ability to take administrative enforcement action. It may well be that in some cases a crime has been committed, but that will not always be the case.
Criminal sanctions are not always the correct and proportionate response to people who are in the UK without lawful authority. It is often better to use administrative means to remove such a person and prevent re-entry, rather than to deploy the full panoply of the criminal justice system, which is designed to rehabilitate members of our communities. As the purpose of processing personal data in such cases is not generally the pursuit of a prosecution, it is not clear that we could, in all cases, rely on that exemption relating to crime.
If I may, I will continue with my speech, because I have more information to give. Perhaps at the end I can deal with the hon. Gentleman’s point.
I just want to dissolve one confusion in the Minister’s remarks. The nature of the Home Office response, whether it is a prosecution through the criminal courts, a civil sanction or a civil whatever else, does not affect the nature of the offence that is committed. The Home Office has a range of sanctions and choices in responding to an offence, but that does not stop the offence being an offence. The offence is still a crime, and is therefore covered by paragraph 2(1)(a).
The right hon. Gentleman is assuming that each and every immigration case that will be covered by these provisions necessitates the commission of a crime.
I would not make that assumption. The vast majority of immigration cases are dealt with in a civil context.
No—the child is not missing, but the parent is; so we seek advice from the Department for Education about where the child is. It may be that cleverer lawyers than me in the Home Office will find an exemption for that, but the point of this exemption in paragraph 4 is to cover the lawfulness of the Home Office in seeking such information in order to find parents or responsible adults who may have responsibility, and either to regularise their stay or to remove them.
I encourage the right hon. Member for Birmingham, Hodge Hill to withdraw his amendment. We believe that it is not a wholesale disapplication of data subjects’ rights; it is a targeted provision wholly in accordance with the discretion afforded to member states by the GDPR, and it is vital to maintaining the integrity and effectiveness of our immigration system.
Anyone who was not alarmed by this provision will certainly leave this Committee Room thoroughly alarmed by the Minister’s explanations.
First, we were invited to believe that we could safeguard due process and the rights of newcomers to this country by suspending those rights and pursuing people through the civil courts. We were then asked to believe that the Home Office’s ambition to deal with these cases with a civil response rendered inoperable the powers set out in paragraph 2(1)(a), confusing the Home Office’s response with the nature of the offence committed up front. Then, we were invited to believe that this was not a permanent provision—even though that safeguard is not written into the Bill—but a temporary one. What is not clear is when those temporary provisions would be activated and, crucially, when they would be suspended.
I am happy to give way in a moment. Most of us here who have done our fair share of immigration cases—I have done several thousand over the last 14 years—know that on some occasions, the Home Office interpretation of time is somewhat different from a broadly understood interpretation of time. I have cases in which a judge has ordered the issue of a visa, and six months later we are still chasing the Home Office for the issue of the visa. I will not be alone in offering these examples.
Perhaps when the Minister intervenes, she could set out what “temporary” means, where it is defined and what its limits are. She still has not answered my question of whether she will guarantee that the implementation of this pause will not jeopardise someone’s ability to submit either a request for an entry clearance officer review or an appeal within the legally binding time windows set out in Home Office regulations.
The key to this is the purpose for which we are processing the data. Even if there are criminal sanctions, that does not mean that we are processing for that purpose, particularly where we are not likely to pursue a prosecution. The primary purpose is often immigration control—that does not fit under paragraph 2 as he has described it—rather than enforcement of the criminal justice system. That is the point: it is the purpose of processing the data that matters. The crime-related provisions in the Bill refer to the importance of identifying the purposes of the processing. Where the primary purpose is immigration related, it is not clear that we could rely on the crime-related exemptions. That is why paragraph 4 is in the schedule.
I am really sorry to have to say this, but that is utter nonsense. The idea that the Home Office will seek to regularise someone’s immigration status by denying them access to information that might support their case is, frankly, fanciful.
This is not a new debate; we last had it in 1983. The Home Office tried to sketch this exemption into legislation then, and it failed. We should not allow the exemption to go into the Bill, especially given that all the explanations we have heard this afternoon concern cases where paragraph 2(1)(a), or the safeguarding provisions drafted by the Government, would provide the necessary exemptions and safeguards in the contingencies that the Minister is concerned about.
My hon. Friend is bang on the money, but perhaps the Under-Secretary can enlighten us.
All rights are reinstated once the risk of prejudice is removed. The wording is in line 35 of paragraph 4:
“to the extent that the application of those provisions would be likely to prejudice any of the matters mentioned in paragraphs (a) and (b).”
To reassure the hon. Member for Bristol North West, that is the end point.
I am grateful to the Under-Secretary for clarifying a point that was not at issue. No one is concerned about which rights kick back in at the end of a process. We are worried about how long the process will last, who will govern it, what rights newcomers to this country or the courts will have to enforce some kind of constraint on the process, and how we will stop the Home Office embarking on unending processes in a Jarndyce v. Jarndyce-like way, which we know is the way these cases are sometimes prosecuted. The Home Office is full of some of the most amazing civil servants on earth, but perhaps, a little like the Under-Secretary, they are sometimes good people trapped in bad systems and, dare I say it, bad arguments.
Question put, That the amendment be made.