(6 years, 4 months ago)
Commons ChamberI am very happy to join my hon. Friend in commending the courage and bravery shown by those cave rescuers in saving lives: Robert Harper, Chris Jewell, Jason Mallison and Tim Acton. This whole House commends them.
Last week, a much loved grandmother, Riasat Bi, was murdered in her own home during a knife fight; she was 86. West Midlands police are doing everything they can to respond to the growing spiral of violence in east Birmingham, but they need help. The force is at its smallest size since 1974: it needs new investment and we need new investment in youth services. Will the Home Secretary listen to our experience in east Birmingham as he prepares his bid for the Budget later this year?
The right hon. Gentleman rightly raises an important issue, and it reminds the whole House how much more needs to be done to fight the rise in serious violence that we are seeing. Our serious violence strategy is dealing with much of that; it will take time as the issues are complex, but it is right that we work more closely with West Midlands police to see what more we can do.
(6 years, 8 months ago)
Public Bill CommitteesI beg to move, That the clause be read a Second time.
With this it will be convenient to discuss the following:
New clause 8—Application of the Equality Act (Employment)—
“(1) Part 5 (Employment) of the Equality Act (‘the Equality Act’) shall apply to the processing of personal data by an algorithm or automated system in making or supporting a decision under this section.
(2) A ‘decision’ in this section means a decision that engages a data subject (D)’s rights, freedoms or legitimate interests concerning—
(a) recruitment,
(b) the terms and conditions of employment,
(c) access to opportunities for promotion, transfer or training, and
(d) dismissal.
(3) Nothing in this section detracts from other rights, freedoms or legitimate interests in this Act, the Equality Act or in any other primary or secondary legislation relating to D’s personal data, employment, social security or social protection.”
This new clause would apply Part 5 of the Equality Act 2010 to the processing of personal data by an algorithm or automated system or supporting a decision under this new clause.
New clause 9—Right to algorithmic fairness at work—
“(1) A person (“P”) has the right to fair treatment in the processing of personal data by an algorithm or automated system in making a decision under this section.
(2) A “decision” in this section means a decision in which an algorithm or automated system is deployed to support or make a decision or any part of that decision that engages P’s rights, freedoms or legitimate interests concerning—
(a) recruitment,
(b) the terms and conditions of employment,
(c) access to opportunities for promotion, transfer or training, and
(d) dismissal.
(3) “Fair treatment” in this section means equal treatment between P and other data subjects relevant to the decision made under subsection (2) insofar as that is reasonably practicable with regard to the purpose for which the algorithm or automated system was designed or applied.
(4) In determining whether treatment of P is “fair” under this section the following factors shall be taken into account—
(e) the application of rights and duties under equality and other legislation in relation to any protected characteristics or trade union membership and activities,
(f) whether the algorithm or automated system has been designed and trained with due regard to equality of outcome,
(g) the extent to which the decision is automated,
(h) the factors and weighting of factors taken into account in determining the decision,
(i) whether consent has been sought for the obtaining, recording, using or disclosing of any personal data including data gathered through the use of social media, and
(j) any guidance issued by the Centre for Data Ethics and Innovation.
(5) “Protected characteristics” in this section shall be the protected characteristics defined in section 4 of the Equality Act 2010.”
This new clause would create a right to fair treatment in the processing of personal data by an algorithm or automated system in making a decision regarding recruitment, terms and conditions of employment, access to opportunities for promotion etc. and dismissal.
New clause 10—Employer’s duty to undertake an Algorithmic Impact Assessment—
‘(1) An employer, prospective employer or agent must undertake an assessment to review the impact of deploying the algorithm or automated system in making a decision to which subsection (1) of section [Application of Equality Act (Employment)] applies [an ‘Algorithmic Impact Assessment’].
(2) The assessment undertaken under subsection (1) must—
(a) identify the purpose for which the algorithm or automated system was designed or applied,
(b) test for potential discrimination or other bias by the algorithm or automated system,
(c) consider measures to advance fair treatment of data subjects relevant to the decision, and
(d) take into account any tools for Algorithmic Impact Assessment published by the Centre for Data Ethics and Innovation.”
This new clause would impose a duty upon employers to undertake an Algorithmic Impact Assessment.
New clause 11—Right to an explanation—
“(1) A person (“P”) may request and is entitled to be provided with a written statement from an employer, prospective employer or agent giving the following particulars of a decision to which subsection (1) of section [Right to algorithmic fairness at work] applies—
(a) any procedure for determining the decision,
(b) the purpose and remit of the algorithm or automated system deployed in making the decision,
(c) the criteria or other meaningful information about the logic involved in determining the decision, and
(d) the factors and weighting of factors taken into account in determining the decision.
(2) P is entitled to a written statement within 14 days of a request made under subsection (1).
(3) A complaint may be presented to an employment tribunal on the grounds that—
(a) a person or body has unreasonably failed to provide a written statement under subsection (1),
(b) the particulars given in purported compliance with subsection (1) are inadequate,
(c) an employer or agent has failed to comply with its duties under section [Employer’s duty to undertake an Algorithmic Impact Assessment],
(d) P has not been treated fairly under section [Right to algorithmic fairness at work].
(4) Where an employment tribunal finds a complaint under this section well-founded the tribunal may—
(e) make a declaration giving particulars of unfair treatment,
(f) make a declaration giving particulars of any failure to comply with duties under section [Employer’s duty to undertake an Algorithmic Impact Assessment] or section [Right to algorithmic fairness at work],
(g) make a declaration as to the measures that ought to have been undertaken or considered so as to comply with the requirements of subsection (1) or section [Employer’s duty to undertake an Algorithmic Impact Assessment] or section [Right to algorithmic fairness at work],
(h) make an award of compensation as may be just and equitable.
(5) An employment tribunal shall not consider a complaint presented under subsection (3) in a case where the decision to which the reference relates was made—
(i) before the end of the period of 3 months, or
(j) within such further period as the employment tribunal considers reasonable in a case where it is satisfied that it was not reasonably practicable for the application to be made before the end of that period of 3 months.
(6) Nothing in this section detracts from other rights, freedoms or legitimate interests in this Bill or any other primary or secondary legislation relating to P’s personal data, employment, social security or social protection.”
This new clause would create a right to an explanation in writing from an employer, prospective employer or agent giving the particulars of a decision to which the Right to algorithmic fairness at work applies.
New clauses 7 and 8 to 11 touch on the question of how we ensure a degree of justice when it comes to decisions that are taken about us automatically. The growth in decisions that are made through automated decision making has been exponential, and there are risks to that. We need to ensure that the law is modernised to provide new protections and safeguards for our constituents in this new world.
I should say at the outset that this group of new clauses is rooted in the excellent work of the Future of Work commission, which produced a long, thought-provoking report. The Committee will be frustrated to hear that I am not going to read through that this afternoon, but, none the less, I want to tease out a couple of points.
The basket of new clauses that we have proposed are well thought through and have been carefully crafted. I put on record my thanks to Helen Mountfield QC, an expert in equality law, and to Mike Osborne, professor of machine learning. Along with Ben Jaffey QC, a specialist in data law, they have been looking at some of the implications of automated decision making, which were discussed at length by the Future of Work commission.
Central to the new clauses is a concern that unaccountable and highly sophisticated automated or semi-automated systems are now making decisions that bear on fundamental elements of people’s work, including recruitment, pay and discipline. Just today, I was hearing about the work practices at the large Amazon warehouse up in Dundee, I think, where there is in effect digital casualisation. Employees are not put on zero-hours contracts, but they are put on four-hour contracts. They are guided around this gigantic warehouse by some kind of satnav technology on a mobile phone, but the device that guides them around the warehouse is also a device that tracks how long it takes them to put together a basket.
That information is then arranged in a nice league table of employees of who is the fastest and who is slowest, and decisions are then taken about who gets an extension to their contracted hours each week and who does not. That is a pretty automated kind of decision. My hon. Friend the Member for Eltham (Clive Efford) was describing to me the phenomenon of the butty man—the individual who decided who on a particular day got to work on the docks or on the construction site. In the pub at the end of the week, he divvied up the earnings and decided who got what, and who got work the following week. That kind of casualisation is now being reinvented in a digital era and is something that all of us ought to be incredibly concerned about.
What happens with these algorithms is called, in the jargon, socio-technical—what results is a mixture of conventional software, human judgment and statistical models. The issue is that very often the decisions that are made are not transparent, and are certainly not open to challenge. They are now quite commonly used by employers and prospective employers, and their agents, who are able to analyse very large datasets and can then deploy artificial intelligence and machine learning to make inferences about a person. Quite apart from the ongoing debates about how we define a worker and how we define employment—the subject of a very excellent report by my old friend Matthew Taylor, now at the RSA—there are real questions about how we introduce new safeguards for workers in this country.
I want to highlight the challenge with a couple of examples. Recent evidence has revealed how many recruiters use—surprise, surprise—Facebook to seek candidates in ways that routinely discriminate against older workers by targeting advertisements for jobs in a particular way. Slater and Gordon, which is a firm of excellent employment lawyers, showed that about one in five company executives admit to unlawful discrimination when advertising jobs online. The challenge is that when jobs are advertised in a targeted way, by definition they are not open to applicants from all walks of life, because lots of people just will not see the ads.
Women and those over the age of 50 are now most likely to be prevented from seeing an advert. Some 32% of company executives say that they have discriminated against those who are over 50, and a quarter have discriminated in that way against women. Nearly two thirds of executives with access to a profiling tool have said that they use it to actively seek out people based on criteria as diverse as age, gender and race. If we are to deliver a truly meritocratic labour market, where the rights of us all to shoot for jobs and to develop our skills and capabilities are protected, some of those practices have to stop. If we are to stop them, the law needs to change, and it needs to change now.
This battery of new clauses sets out to do five basic things. First, they set out some enhancements and refinements to the Equality Act 2010, in a way that ensures that protection from discrimination is applied to new forms of decision making, especially when those decisions engage core rights, such as rights on recruitment, terms of work, or dismissal. Secondly, there is a new right to algorithmic fairness at work, to ensure equal treatment. Thirdly, there is the right to an explanation when a decision is taken in a way that affects core elements of work life, such as a decision to hire, fire or suspend someone. Fourthly, there is a new duty for employers to undertake an algorithmic impact assessment, and fifthly, there are new, realistic ways for individuals to enforce those rights in an employment tribunal. It is quite a broad-ranging set of reforms to a number of different parts of legislation.
My right hon. Friend is making a powerful case. Does he agree that this is exactly the kind of thing we ought to have been discussing at the outset of the Bill? The elephant in the room is that the Bill seems to me, overall, to be looking backwards rather than forwards. It was developed to implement the general data protection regulation, which has been discussed over many years. We are seeing this week just how fast-moving the world is. These are the kind of ideas that should have been driving the Bill in the first place.
Exactly. My hon. Friend makes such a good point. The challenge with the way that Her Majesty’s Government have approached the Bill is that they have taken a particular problem—that we are heading for the exit door of Europe, so we had better ensure that we get a data-sharing agreement in place, or it will be curtains for Britain’s services exports—and said, “We’d better find a way of incorporating the GDPR into British law as quickly as possible.” They should have thought imaginatively and creatively about how we strengthen our digital economy, and how we protect freedoms, liberties and protections in this new world, going back to first principles and thinking through the consequences. What we have is not quite a cut-and-paste job—I will not describe it in that way—but neither is it the sophisticated exercise in public law making that my hon. Friend describes as more virtuous.
I want to give the Committee a couple of examples of why this is so serious, as sometimes a scenario or two can help. Let us take an individual whom we will call “Mr A”. He is a 56-year-old man applying for website development roles. Typically, if someone is applying for jobs in a particular sector, those jobs will be advertised online. In fact, many such roles are advertised only online, and they target users only in the age profile 26 to 35, through digital advertising or social media networks, whether that is Facebook, LinkedIn, or others. Because Mr A is not in the particular age bracket being targeted, he never sees the ad, as it will never pop up on his news feed, or on digital advertising aimed at him. He therefore does not apply for the role and does not know he is being excluded from applying for the role, all as a consequence of him being the wrong age. Since he is excluded from opportunities because of his age, he finds it much harder to find a role.
The Equality Act, which was passed with cross-party consensus, prohibits less favourable treatment because of age—direct discrimination—including in relation to recruitment practices, and protects individuals based on their age. The Act sets out a number of remedies for individuals who have been discriminated against in that way, but it is not clear how the Bill proposes to correct that sin. Injustices in the labour market are multiplying, and there is a cross-party consensus for a stronger defence of workers. In fact, the Member of Parliament for the town where I grew up, the right hon. Member for Harlow (Robert Halfon), has led the argument in favour of the Conservative party rechristening itself the Workers’ party, and the Labour party was founded on a defence of labour rights, so I do not think this is an especially contentious matter. There is cross-party consensus about the need to stand up for workers’ rights, particularly when wages are stagnating so dramatically.
We are therefore not divided on a point of principle, but the Opposition have an ambition to do something about this growing problem. The Bill could be corrected in a way that made a significant difference. There is not an argument about the rights that are already in place, because they are enshrined in the Equality Act, with which Members on both sides of the House agree. The challenge is that the law as it stands is deficient and cannot be applied readily or easily to automated decision making.
My right hon. Friend is making a powerful case about the importance of the Equality Act in respect of the Bill, but may I offer him another example? He mentioned the Amazon warehouse where people are tracked at work. We know that agencies compile lists of their more productive workers, whom they then use in other work, and of their less productive workers. That seems like a form of digital blacklisting, and we all know about the problems with blacklisting in the construction industry in the 1980s. I suggest that the new clauses are a great way of combating that new digital blacklisting.
My hon. Friend gives a brilliant example. The point is that employment agencies play an incredibly important role in providing workers for particular sectors of the economy, from hotels to logistics, distribution and construction. The challenge is that the areas of the economy that have created the most jobs in the 10 years since the financial crash are those where terms and conditions are poorest, casualisation is highest and wages are lowest—and they are the areas where productivity is poorest, too. The Government could take a different kind of labour market approach that enhanced productivity and wages, and shut down some of the bad practices and casualisation that are creating a problem.
As it happens, the Government have signed up to some pretty big ambitions in that area. Countries around the world recently signed up to the UN sustainable development goals. Goal 8 commits the Government to reducing inequality, and SDG 10 commits them to reducing regional inequality. However, when I asked the Prime Minister what she was doing about that, my question was referred to Her Majesty’s Treasury and the answer that came back from the Chancellor was, “We believe in raising productivity and growth.” The way to raise productivity and growth is to ensure that there are good practices in the labour market, because it is poor labour market productivity that is holding us back as a country.
If digital blacklisting or casualisation were to spread throughout the labour market in the sectors that happen to be creating jobs, there would be no increase in productivity and the Government would be embarked on a self-defeating economic policy. Although these new clauses may sound technical, they have a bearing on a much more important plank of the Government’s economic development strategy.
Our arguments are based on principles that have widespread support on both sides of the House and they are economically wise. The consequences of the new clauses will be more than outweighed by the benefits they will deliver. I commend them to the Minister and I hope she will take them on board.
Thank you, Mr Streeter, and what a wonderful birthday present it is to be serving on the Committee.
It is a joy, actually, to be able to agree with the Opposition on the principle that equality applies not only to decisions made by human beings or with human input, but to decisions made solely by computers and algorithms. On that, we are very much agreed. The reason that we do not support the new clauses is that we believe that the Equality Act already protects workers against direct or indirect discrimination by computer or algorithm-based decisions. As the right hon. Member for Birmingham, Hodge Hill rightly said, the Act was passed with cross-party consensus.
The Act is clear that in all cases, the employer is liable for the outcome of any of their actions, or those of their managers or supervisors, or those that are the result of a computer, algorithm or mechanical process. If, during a recruitment process, applications from people with names that suggest a particular ethnicity were rejected for that reason by an algorithm, the employer would be liable for race discrimination, whether or not they designed the algorithm with that intention in mind.
The right hon. Gentleman placed a great deal of emphasis on advertising and, again, we share his concerns that employers could seek to treat potential employees unfairly and unequally. The Equality and Human Rights Commission publishes guidance for employers to ensure that there is no discriminatory conduct and that fair and open access to employment opportunities is made clear in the way that employers advertise posts.
The same principle applies in the provision of services. An automated process that intentionally or unintentionally denies a service to someone because of a protected characteristic will lay the service provider open to a claim under the Act, subject to any exceptions.
I am grateful to the Minister for giving way, not least because it gives me the opportunity to wish her a happy birthday. Could she remind the Committee how many prosecutions there have been for discriminatory advertising because employers chose to target their adverts?
If I may, I will write to the right hon. Gentleman with that precise number, but I know that the Equality and Human Rights Commission is very clear in its guidance that employers must act within the law. The law is very clear that there are to be no direct or indirect forms of discrimination.
The hon. Member for Cambridge raised the GDPR, and talked about looking forwards not backwards. Article 5(1)(a) requires processing of any kind to be fair and transparent. Recital 71 draws a link between ensuring that processing is fair and minimising discriminatory effects. Article 35 of the GDPR requires controllers to undertake data protection impact assessments for all high-risk activities, and article 36 requires a subset of those impact assessments to be sent to the Information Commissioner for consultation prior to the processing taking place. The GDPR also gives data subjects the tools to understand the way in which their data has been processed. Processing must be transparent, details of that processing must be provided to every data subject, whether or not the data was collected directly from them, and data subjects are entitled to a copy of the data held about them.
When automated decision-making is engaged there are yet more safeguards. Controllers must tell the data subject, at the point of collecting the data, whether they intend to make such decisions and, if they do, provide meaningful information about the logic involved, as well as the significance and the envisaged consequences for the data subject of such processing. Once a significant decision has been made, that must be communicated to the data subject, and they must be given the opportunity to object to that decision so that it is re-taken by a human being.
We would say that the existing equality law and data protection law are remarkably technologically agnostic. Controllers cannot hide behind algorithms, but equally they should not be prevented from making use of them when they can do so in a sensible, fair and productive way.
I would be guided by the view of the Equality and Human Rights Commission, which oversees conduct in this area. I have no doubt that the Information Commissioner and the Equality and Human Rights Commission are in regular contact. If they are not, I very much hope that this will ensure that they are.
We are clear in law that there cannot be such discrimination as has been discussed. We believe that the framework of the law is there, and that the Information Commissioner’s Office and the Equality and Human Rights Commission, with their respective responsibilities, can help, advise and cajole, and, at times, enforce the law accordingly. I suspect that we will have some interesting times ahead of us with the release of the gender pay gap information. I will do a plug now, and say that any company employing more than 250 employees should abide by the law by 4 April. I look forward to reviewing the evidence from that exercise next month.
We are concerned that new clauses 7 and 8 are already dealt with in law, and that new clauses 9 to 11 would create an entirely new regulatory structure just for computer-assisted decision-making in the workplace, layered on top of the existing requirements of both employment and data protection law. We want the message to be clear to employers that there is no distinction between the types of decision-making. They are responsible for it, whether a human being was involved or not, and they must ensure that their decisions comply with the law.
Having explained our belief that the existing law meets the concerns raised by the right hon. Member for Birmingham, Hodge Hill, I hope he will withdraw the new clause.
I think it was in “Candide” that Voltaire introduced us to the word “Panglossian”, and we have heard a rather elegant and Panglossian description of a perfect world in which all is fine in the labour market. I am much more sceptical than the Minister. I do not think the current law is sufficiently sharp, and I am concerned that the consequence of that will be injustice for our constituents.
The Minister raised a line of argument that it is important for us to consider. The ultimate test of whether the law is good enough must be what is actually happening out there in the labour market. I do not think it is good enough; she thinks it is fine. On the nub of the argument, a few more facts might be needed on both sides, so we reserve the right to come back to the issue on Report. This has been a useful debate. I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
New Clause 13
Review of Electronic Commerce (EC Directive) Regulations
“(1) The Secretary of State shall lay before both Houses of Parliament a review of the application and operation of the Electronic Commerce (EC Directive) Regulations 2002 in relation to the processing of personal data.
(2) A review under subsection (1) shall be laid before Parliament by 31 January 2019.”—(Liam Byrne.)
This new clause would order the Secretary of State to review the application and operation of the Electronic Commerce (EC Directive) Regulations 2002 in relation to the processing of data and lay that review before Parliament before 31 January 2019.
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
This is not normally my practice, but let me raise another area that is subject to a measure of cross-party consensus. There is widespread recognition that the e-commerce directive, which is used to regulate information services providers, is hopelessly out of date. It was agreed in around 2000. In effect, it allows information services providers to be treated as platforms rather than publishers. Since then, we have seen the growth of big tech and the new data giants that now dominate the digital economy, and they are misbehaving. Worse, they have become platforms for hate speech, social division and interference in democracy. It was intriguing to hear Mark Zuckerberg himself admit in the interview he gave yesterday that Facebook was indeed being used to try to corrupt elections. That is an extraordinary recognition by the head of one of the most important firms in the world.
The Secretary of State for Digital, Culture, Media and Sport reminded us as recently as this morning that as we come out of the European Union we will have a new opportunity to update the e-commerce directive. The House basically must put in place a new framework to regulate information services providers in a new way. A debate is raging among our neighbours about what steps we need to take to shut down the hate speech that is dividing communities, and we need to get into that debate quickly. Germany recently passed laws that require companies such as Facebook to take down hate speech in a very short time window or face fines of up to €10 million and Ireland has created a new regulator to provide a degree of overwatch, so it is intriguing that we are falling behind some of our most important neighbours, who now lead this debate.
I began looking at this issue when I started researching new techniques in ISIS propaganda. In the excellent Scotland Yard counter-terrorism referral unit, I saw propaganda that was put together with the slickness of a pop video to incite people to commit the most heinous crimes, such as the one we commemorate today. Yet I think we all recognise that organisations such as Facebook and YouTube are simply not working quickly enough to take down that kind of material, which we simply do not want people to see. I congratulate The Times, which has run a forensic campaign to shine a light on some of that bad practice. It is good finally to see advertisers such as Unilever beginning to deliver a tougher message to social media platforms that enough is enough.
We know we have to modernise those regulations. The commercial world and politicians on both sides are saying, “Enough is enough.” We all fear the consequences of things going wrong with respect to the destabilisation of democracy in America—but not just in America. We have seen it across the Baltics, in France, in Germany, across southern Europe and in eastern Europe. Among our NATO allies, we can see a vulnerability to our enemies using social media platforms to sow division.
I admire the Minister’s concern and ambition for administrative tidiness. She reminds me of an old quote by Bevin, who said once, “If you are a purist, the place for you is not a Parliament; it is a monastery.”
In the case of the Minister, a nunnery, although Bevin was less enlightened than the hon. Lady. Here is a Bill; here is a new clause; the new clause is within scope. The object of the new clause is to deliver a Government objective, yet it is rejected. That is hard logic to follow. We have had the tremendous assurance, however, that there will be nothing less than a code of practice, so these huge data giants will be shaking in their boots in California, when they wake up. They will be genuinely concerned and no doubt already planning how they can reform their ways and stop the malpractice that we have grown all too used to. I am afraid that these amount to a collection of warm words, when what the country needs is action. With that in mind, I will push the new clause to a vote.
Question put, That the clause be read a Second time.
I beg to move, That the clause be read a Second time.
This is another entirely sensible new clause, which I hope the Government will take on board, either at this stage or on Report. We rehearsed earlier in Committee the debate about the reality and challenges of the fact that our education providers are now collecting, managing and often losing significant amounts of very personal data relating to children.
Any of us who has children at school will know the joys of ParentPay, which means that schools are collecting biometric data on our children. We know that schools are keeping exam results and all kinds of records and evaluations about our children online. Given the complexity of the GDPR and some of the costs and questions around implementing it, the complexity of the education system means that we urgently need a code of practice that schools can draw on to help them get the GDPR right, and to help our educators in their task of keeping our children’s data safer than it is today.
In my argument, I will draw on the excellent contribution made on Second Reading by my noble Friend, Lord Knight, who said:
“Schools routinely use commercial apps for things such as recording behaviour, profiling children, cashless payments, reporting”
and so on. My noble Friend has long been an advocate of that kind of thing, but the point is that he knows, and the other place recognised, that the way school information systems operate means they are often cloud based and integrated into all sorts of other data systems. There will often be contracts in place with all sorts of education service providers, which will entail the transfer of data between, for example, a school and a third party. It could well be that that third party is based overseas. As my noble Friend said:
“Schools desperately need advice on GDPR compliance to allow them to comply with this Bill when it becomes law.”—[Official Report, House of Lords, 10 October 2017; Vol. 785, c. 185.]
Lord Storey rode in behind my noble Friend, saying that
“young people probably need more protection than at any other time in our recent history.”—[Official Report, House of Lords, 10 October 2017; Vol. 785, c. 170.]
That is not something that has been debated only by the other place. UNICEF recently published a working paper entitled “Privacy, protection of personal information and reputation rights” and said it was now
“evident that children’s privacy differs both in scope and application from adults’ privacy”
but that they experience more threats than any other group. The “Council of Europe Strategy for the Rights of the Child (2016-2021)” echoed the same sentiment and observed:
“Parents and teachers struggle to keep up with technological developments”.
I have a number of friends who are teachers and headteachers. They listen to me in horror when I explain that I am the shadow Minister for the Data Protection Bill, because they know this is looming and they are absolutely terrified of it. Why is that? Because they are good people and good educators; they go into teaching because they want to change the world and change children’s lives, and they recognise the new obligations that are coming, but they also recognise the realities of how their schools operate today. Those people know about the proliferation of data that they and their staff are collecting. They know about the dangers and risks of that data leaking—not least because most teachers I know who have some kind of pastoral care responsibility seem to spend half their time having to advise their children about what not to do with social media apps and what not to post. They are often drawn in to disputes that rage out of control on social media platforms such as Instagram.
Teachers are very alert to the dangers of this new world. They are doing a brilliant and innovative job of supporting children through it, but they are crying out now for good guidance to help them to implement the GDPR successfully.
I echo my right hon. Friend’s points. My daughter is seven years old. I have an app on my phone that, at any time of the day, will tell me what she is doing at school. Her attendance, reward system, and school meal requirements are all recorded on it, and I can access it at any time. The school she goes to wants to keep a connection with parents, so that parents can interact comfortably. The new clause would go a long way towards allowing schools to keep that link, because the default position of schools, as I am sure my right hon. Friend would agree, is to protect children, even if that means not sharing information in the way that they would like to.
That sounds like a terrifying application; my hon. Friend’s daughter very much has my sympathies. He is absolutely right. Lord Knight made this point with such power in the other place. The technology is advancing so quickly, and schools know that if they can monitor things in new, more forensic ways, that helps them to do their job of improving children’s education. However, it has costs and consequences too. I hope that Her Majesty’s Government will look sympathetically on the task of teachers, as they confront this 200-and-heaven-knows-what-page Bill.
Does my right hon. Friend share my concerns that, in response to a number of written parliamentary questions that I tabled, it became clear that the Government gave access to the national pupil database, which is controlled by the Government, to commercial entities, including newspapers such as The Daily Telegraph?
Yes. My hon. Friend has done an extraordinary job of exposing that minor scandal. I am surprised that it has not had more attention in the House, but hopefully once the Bill has passed it is exactly the kind of behaviour that we can begin to police rather more effectively.
I am sure that Ministers will recognise that there is a need for this. No doubt their colleagues in the Department for Education are absolutely all over it. I was talking to a headteacher in the Minister’s own constituency recently—an excellent headteacher, in an excellent school, who is a personal friend. The horror with which headteachers regard the arrival of the GDPR is something to behold. Heaven knows, our school leaders and our teachers have enough to do. I call on Ministers to make their task, their lives, and their mission that bit easier by accepting the new clause.
Our schools handle large volumes of sensitive data about the children they educate. Anyone who has any involvement with the education system, either personally through their families, on their mobile phone apps, or in a professional capacity as constituency MPs, is very conscious of the huge responsibilities that school leaders have in handling that data properly and well, and in accordance with the law. As data controllers in their own right, schools and other organisations in the education system will need to ensure that they have adequate data-handling policies in place to comply with their legal obligations under the new law.
Work is going on already. The Department for Education has a programme of advice and education for school-leaders, which covers everything from blogs, a guidance video, speaking engagements, and work to encourage system suppliers to be proactive in helping schools to become GDPR-compliant. Research is also being undertaken with parents about model privacy notices that will help schools to make parents and pupils more aware of the data about children used in the sector. The Department for Education is also shaping a toolkit that will bring together various pieces of guidance and best practice to address the specific needs of those who process education data. In parallel, the Information Commissioner has consulted on guidance specifically addressing issues about the fair and lawful processing of children’s data. Everyone is very alive to the issue of protecting children and their data.
At this point, the Government want to support the work that is ongoing—already taking place—and the provisions on guidance that are already in the Bill. Our concern is that legislating for a code now could be seen as a reason for schools to wait and see, rather than continuing their preparations for the new law. But it may be that in due course the weight of argument swings in favour of a sector-specific code of practice. That can happen. It does not have to be in the Bill. It can happen because clause 128 provides that the Secretary of State may require the Information Commissioner to prepare additional codes of practice for the processing of personal data, and the commissioner can issue further guidance under her own steam, using her powers under article 57 of the GDPR, without needing any direction from the Secretary of State.
I hope that the ongoing work reassures the right hon. Gentleman and that he will withdraw the new clause at this stage.
I am reassured by that and I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
New Clause 17
Personal data ethics advisory board and ethics code of practice
‘(1) The Secretary of State must appoint an independent Personal Data Ethics Advisory Board (“the board”).
(2) The board’s functions, in relation to the processing of personal data to which the GDPR and this Act applies, are—
(a) to monitor further technical advances in the use and management of personal data and their implications for the rights of data subjects;
(b) to monitor the protection of the individual and collective rights and interests of data subjects in relation to their personal data;
(c) to ensure that trade-offs between the rights of data subjects and the use of management of personal data are made transparently, inclusively, and with accountability;
(d) to seek out good practices and learn from successes and failures in the use and management of personal data;
(e) to enhance the skills of data subjects and controllers in the use and management of personal data.
(3) The board must work with the Commissioner to prepare a data ethics code of practice for data controllers, which must—
(a) include a duty of care on the data controller and the processor to the data subject;
(b) provide best practice for data controllers and processors on measures, which in relation to the processing of personal data—
(i) reduce vulnerabilities and inequalities;
(ii) protect human rights;
(iii) increase the security of personal data; and
(iv) ensure that the access, use and sharing personal data is transparent, and the purposes of personal data processing are communicated clearly and accessibly to data subjects.
(4) The code must also include guidance in relation to the processing of personal data in the public interest and the substantial public interest.
(5) Where a data controller or processor does not follow the code under this section, the data controller or processor is subject to a fine to be determined by the Commissioner.
(6) The board must report annually to the Secretary of State.
(7) The report in subsection (6) may contain recommendations to the Secretary of State and the Commissioner relating to how they can improve the processing of personal data and the protection of data subjects’ rights by improving methods of—
(a) monitoring and evaluating the use and management of personal data;
(b) sharing best practice and setting standards for data controllers; and
(c) clarifying and enforcing data protection rules.
(8) The Secretary of State must lay the report made under subsection (6) before both Houses of Parliament.
(9) The Secretary of State must, no later than one year after the day on which this Act receives Royal Assent, lay before both Houses of Parliament draft regulations in relation to the functions of the Personal Data Ethics Advisory Board as listed in subsections (2), (3), (4), (6) and (7) of this section.
(10) Regulations under this section are subject to the affirmative resolution procedure.’—(Darren Jones.)
This new clause would establish a statutory basis for a Data Ethics Advisory Board.
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
New clause 17 is in my name and that of my right hon. Friend the Member for Birmingham, Hodge Hill. I do not take it personally that my other hon. Friends have not signed up to it; that was probably my fault for not asking them to do so in advance.
The new clause would bring a statutory footing to the data and artificial intelligence ethics unit, which I am very pleased that the Government have now funded and established, through the spring statement, in the Minister’s Department. It comes off the back of conversations with the Information Commissioner in Select Committee about the differing roles of enforcing legislation and of having a public debate about what is right and wrong and what the boundaries are in this ever-changing space. The commissioner was very clear that we need to have that debate with the public, but that it is not for her to do it. The ICO is an enforcer of legislation. The commissioner has a lot on her plate and is challenged by her own resource as it is. She felt that the new unit in the Department would be a good place to have the debate about technology ethics, and I support that assertion.
With no disrespect to any colleagues, I do not think that the House of Commons, and perhaps even the Select Committees to a certain extent, necessarily has the time, energy or resource to get into the real detail of some of the technology ethics questions, nor to take them out to the public, who are the people we need to be having the debate with.
The new clause would therefore establish in law that monitoring, understanding and public debate obligation that I, the ICO and others agree ought to exist in the new data ethics unit, but make it clear that enforcement was reserved for the Information Commissioner. I tabled the new clause because, although I welcome the Government’s commitment to the data and AI ethics unit, I feel that there is potential for drift. The new clause would therefore put an anchor in the technology ethics requirement of the unit so that it understands and communicates the ethical issues and does not necessarily get sidetracked into other issues, although it may seek to do that on top of this anchor. However, I think this anchor needs to be placed.
Also, I recognise that the Minister and the Secretary of State supported the recommendation made previously under the Cameron Government and I welcome that, but of course, with an advisory group within the Department, it may be a future Minister’s whim that they no longer wish to be advised on these issues, or it may be the whim of the Treasury—with, potentially, budget cuts—that it no longer wishes to fund the people doing the work. I think that that is not good enough and that putting this provision in the Bill would give some security to the unit for the future.
I will refer to some of the comments made about the centre for data ethics and innovation, which I have been calling the data and AI ethics unit. When it was first discussed, in the autumn Budget of November 2017, the Chancellor of the Exchequer said that the unit would be established
“to enable and ensure safe, ethical and ground-breaking innovation in AI and data-driven technologies. This world-first advisory body will work with government, regulators and industry to lay the foundations for AI adoption”.
Although that is a positive message, it says to me that its job is to lay the foundations for AI adoption. I agree with that as an aim, but it does not mean that at its core is understanding and communicating the ethical challenges that we need to try to understand and legislate for.
I move on to some of the documents from the recruitment advertising for personnel to run the unit from January of this year, which said that the centre will be at the centre of plans to make the UK the best place in the world for AI businesses. Again, that is a positive statement, but one about AI business adoption in this country, not ethical requirements. It also said that the centre would advise on ethical and innovative uses of data-driven tech. Again, that is a positive statement, but I just do not think it is quite at the heart of understanding and communicating and having a debate about the ethics.
My concern is that while all this stuff is very positive, and I agree with the Government that we need to maintain our position as a world leader in artificial intelligence and that it is something we need to be very proud of—especially as we go through the regrettable process of leaving the European Union and the single market, we need to hold on to the strengths we have in the British economy—this week has shown that there is a need for an informed public debate on ethics. As no doubt all members of the Committee have read in my New Statesman article of today, one of the issues we have as the voice of our constituents in Parliament is that in order for our constituents to understand or take a view on what is right or wrong in this quickly developing space, we all need to understand it in the first place—to understand what is happening with our data and in the technology space, to understand what is being done with it and, having understood it, to then to take a view about it. The Cambridge Analytica scandal has been so newsworthy because the majority of people understandably had no idea that all this stuff was happening with their data. How we legislate for and set ethical frameworks must first come from a position of understanding.
That is why the new clause sets out that there should be an independent advisory board. The use of such boards is commonplace across Departments and I hope that would not be a contentious question. Subsection (2) talks about some of the things that that board should do. The Minister will note that the language I have used is quite careful in looking at how the board should monitor developments, monitor the protection of rights and look out for good practice. It does not seek to step on the toes of the Information Commissioner or the powers of the Government, but merely to understand, educate and inform.
The new clause goes on to suggest that the new board would work with the commissioner to put together a code of practice for data controllers. A code of practice with a technology ethics basis is important because it says to every data controller, regardless of what they do or what type of work they do, that we require ethical boundaries to be set and understood in the culture of what we do with big data analytics in this country. In working with the commissioner, this board would add great value to the way that we work with people’s personal data, by setting out that code of practice.
I hope that the new clause adds value to the work that the Minister’s Department is already doing. My hope is that by adding it to the Bill—albeit that current Parliaments cannot of course bind their successors and it could be legislated away in the future—it gives a solid grounding to the concept that we take technology ethical issues seriously, that we seek to understand them properly, not as politicians or as busy civil servants, but as experts who can be out with our stakeholders understanding the public policy consequences, and that we seek to have a proper debate with the public, working with enforcers such as the ICO to set, in this wild west, the boundaries of what is and is not acceptable. I commend the new clause to the Committee and hope that the Government will support it.
I beg to ask leave to withdraw the new clause.
Clause, by leave, withdrawn.
New Clause 20
Automated number plate recognition (No. 2)
“(1) Vehicle registration marks captured by automated number plate recognition systems are personal data.
(2) The Secretary of State shall issue a code of practice in connection with the operation by the police of automated number plate recognition systems.
(3) Any code of practice under subsection (1) shall conform to section 67 of the Police and Criminal Evidence Act 1984.”—(Liam Byrne.)
This new clause requires the Secretary of State to issue a code of practice in connection with the operation by the police of automated number plate recognition systems, vehicle registration marks captured by which are to be considered personal data in line with the opinion of the Information Commissioner.
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
I will touch on this new clause only very briefly, because I hope the Minister will put my mind at rest with a simple answer. For some time, there has been concern that the way in which data collected by the police through automatic number plate recognition technology is not adequately ordered, organised or policed by a code of practice. A code of practice is probably required to put the police well and truly within the boundaries of the Police and Criminal Evidence Act 1984, the Data Protection Act 1998 and the Bill.
With this new clause, we are basically asking the Secretary of State to issue a code of practice in connection with the operation by the police of ANPR systems under subsection (1), and we ask that it conform to section 67 of the Police and Criminal Evidence Act 1984. I hope the Minister will just say that a code of practice is on the way so we can safely withdraw the new clause.
I hope Committee members have had the chance to see my response to the questions of the hon. Member for Sheffield, Heeley on Tuesday about ANPR, other aspects of surveillance and other types of law enforcement activity.
I assure the right hon. Member for Birmingham, Hodge Hill that ANPR data is personal data and is therefore caught by the provisions of the GDPR and the Bill. We recognise the need to ensure the use of ANPR is properly regulated. Indeed, ANPR systems are governed by not one but two existing codes of practice. The first is the code issued by the Information Commissioner, exercising her powers under section 51 of the Data Protection Act 1998. It is entitled “In the picture: A data protection code of practice for surveillance cameras and personal information”, and was published in June 2017. It is clear that it covers ANPR. It also refers to data protection impact assessments, which we debated last week. It clearly states that where the police and others use or intend to use an ANPR system, it is important that they
“undertake a privacy impact assessment to justify its use and show that its introduction is proportionate and necessary.”
The second code is brought under section 29 of the Protection of Freedoms Act 2012, which required the Secretary of State to issue a code of practice containing guidance about surveillance camera systems. The “Surveillance camera code of practice”, published in June 2013, already covers the use of ANPR systems by the police and others. It sets out 12 guiding principles for system operators. Privacy is very much a part of that. The Protection of Freedoms Act established the office of the Surveillance Camera Commissioner, who has a number of statutory functions in relation to the code, including keeping its operation under review.
In addition, a published memorandum of understanding between the Surveillance Camera Commissioner and the Information Commissioner sets out how they will work together. We also have the general public law principles of the Human Rights Act 1998 and the European convention on human rights. I hope that the two codes I have outlined, the Protection of Freedoms Act and the Human Rights Act reassure the right hon. Gentleman, and that he will withdraw his new clause.
I am indeed mollified. I beg to ask leave to withdraw the clause.
Clause, by leave, withdrawn.
New Clause 21
Targeted dissemination disclosure notice for third parties and others (No. 2)
“In Schedule 19B of the Political Parties, Elections and Referendums Act 2000 (Power to require disclosure), after paragraph 10 (documents in electronic form) insert—
10A (1) This paragraph applies to the following organisations and individuals—
(a) a recognised third party (within the meaning of Part 6);
(b) a permitted participant (within the meaning of Part 7);
(c) a regulated donee (within the meaning of Schedule 7);
(d) a regulated participant (within the meaning of Schedule 7A);
(e) a candidate at an election (other than a local government election in Scotland);
(f) the election agent for such a candidate;
(g) an organisation or a person notified under subsection 2 of this section;
(h) an organisation or individual formerly falling within any of paragraphs (a) to (g); or
(i) the treasurer, director, or another officer of an organisation to which this paragraph applies, or has been at any time in the period of five years ending with the day on which the notice is given.
(2) The Commission may under this paragraph issue at any time a targeted dissemination disclosure notice, requiring disclosure of any settings used to disseminate material which it believes were intended to have the effect, or were likely to have the effect, of influencing public opinion in any part of the United Kingdom, ahead of a specific election or referendum, where the platform for dissemination allows for targeting based on demographic or other information about individuals, including information gathered by information society services.
(3) This power shall not be available in respect of registered parties or their officers, save where they separately and independently fall into one or more of categories (a) to (i) of sub-paragraph (1).
(4) A person or organisation to whom such a targeted dissemination disclosure notice is given shall comply with it within such time as is specified in the notice.”
This new clause would amend the Political Parties, Elections and Referendums Act 2000 to allow the Electoral Commission to require disclosure of settings used to disseminate material where the platform for dissemination allows for targeting based on demographic or other information about individuals.—(Liam Byrne.)
Brought up, and read the First time.
With this it will be convenient to discuss new clause 22—Election material: personal data gathered by information society services—
“In section 143 of the Political Parties, Elections and Referendums Act 2000 (Details to appear on electoral material), leave out subsection (1)(b) and insert—
(b) in the case of any other material, including material disseminated through the use of personal data gathered by information society services, any requirements falling to be complied with in relation to the material by virtue of regulations under subsection (6) are complied with.”
This new clause would amend the Political Parties, Elections and Referendums Act 2000 to ensure that “any other material” clearly can be read to include election material disseminated through the use of personal data gathered by information society services.
I am happy to end on a note of cross-party consensus. We agree that we need to modernise our hopelessly outdated election laws. The news a couple of hours ago that the Information Commissioner’s application for a search warrant at Cambridge Analytica has been deferred—suspended until tomorrow—underlines the fact that the laws we have today for investigating malpractice that may impinge on the health of our democracy are hopelessly inadequate. The Information Commissioner declared to the world—for some reason on live television on Monday—that she was seeking a warrant to get into Cambridge Analytica’s office. Five days later there is still no search warrant issued by a court. Indeed, the court has adjourned the case until tomorrow.
I suspect that Cambridge Analytica has now had quite enough notice to do whatever it likes to the evidence that the Information Commissioner sought. This basket of clauses seeks to insert common-sense provisions to update the law in a way that will ensure that the data protection regime we put in place safeguards the health and wellbeing of our democracy. We need those because of what we now know about allegedly bad companies such as Cambridge Analytica, and because of what we absolutely know about bad countries such as Russia. We have been slow to wake up to the reality that, since 2012, Russia has been operating a new generation of active measures that seek to divide and rule its enemies.
There is no legal definition of hybrid war, so there is no concept of just war when it comes to hybrid war. There is no Geneva convention for hybrid war that defines what is good and what is bad and what is legal and illegal, but most legal scholars agree that a definition of hybrid war basically touches on a form of intervening against enemies in a way that is deniable and sometimes not traceable. It contains a basket of measures and includes the kind of tactics that we saw deployed in Crimea and Ukraine, which were of course perfected after the invasion of Georgia. We see it in the Baltics and now we see it not just in America but across western Europe as well.
Such a technique—a kind of warcraft of active measures—has a very long history in Russia. Major-General Kalugin, the KGB’s highest ranking defector, once described the approach as the “heart and soul” of Soviet intelligence. The challenge today is that that philosophy was comprehensively updated by General Gerasimov, the Russian Army’s chief of staff, and it came alongside a very different world view presented by President Putin after his re-election as President in 2012 and in his first state of the union address in 2013. It was in that address that President Putin attacked what he called a de-Christianised morally ambivalent west. He set out pretty categorically a foreign policy of contention rather than co-operation.
Since 2012, we have seen what is basically a history of tactical opportunism. A little bit unlike the Soviet era, what we now have are sometimes authorised groups, sometimes rogue groups, seeking openings where they can and putting in place disruptive measures. They are most dangerous when they target the messiness of digital democracy. Here we have a kind of perfection of what I have called in the past a dark social playbook—for example, hackers such as Cozy Bear or Fancy Bear attacked the Democratic National Committee during the American elections.
We also have a partnership with useful idiots such as WikiLeaks, an unholy alliance with what are politely called fake news sites such as Westmonster or indeed Russia Today or Breitbart, which spread hatred. We have a spillover into Twitter. Once a row is brewing on Twitter, we get troll farms such as the Internet Research Agency in St Petersburg kicking in. Half of the tweets about NATO in the Baltics are delivered by robo-trolls out of Russia. It is on an absolutely enormous scale. Once the row is cooking on Twitter, we get the import into Facebook groups. They are private groups and dark groups, and it is perfectly possible to switch on dark money behind those ads circulating the hate material to thousands and thousands if not millions.
We know that that was standard practice in the German and French elections. There is a risk—we do not know what the risk is because the Government will not launch an inquiry—that such activity was going on in the Brexit campaign. I anticipate that there will be more revelations about that this weekend. However, the challenge is that our election law is now hopelessly out of date.
I will be brief in answering some of the serious matters raised by the right hon. Gentleman. The Information Commissioner, as the data regulator, is investigating alleged abuses as part of a broader investigation into the use of personal data during political campaigns. I have said many times that the Bill will add significantly to the commissioner’s powers to conduct investigations, and I have confirmed that we keep an open mind and are considering actively whether further powers are needed in addition to those set out in the Bill.
The Electoral Commission is the regulator of political funding and spending. The commission seeks to bring transparency to our electoral system by enforcing rules on who can fund and how money can be spent, but new clause 21 is about sending the commission into a whole new field: that of personal data regulation. That field is rightly occupied by the Information Commissioner. We can debate whether she needs more powers in the light of the current situation at Cambridge Analytica, and as I have said we are reviewing the Bill.
While the Electoral Commission already has the power to require the disclosure of documents in relation to investigations under its current remit, new clause 21 would provide the commission with new powers to require the disclosure of the settings used to disseminate material. However, understanding how personal data is processed is outside the commission’s remit.
The right hon. Gentleman suggested that his amendment would help with transparency on who is seeking to influence elections, which is very much needed in the current climate. The Government take the security and integrity of democratic processes very seriously. It is absolutely unacceptable for any third country to interfere in our democratic elections or referendums.
On new clause 22, the rules on imprints in the Political Parties, Elections and Referendums Act 2000 are clear. The current rules apply to printed election material no matter how it is targeted. However, the Secretary of State has the power under section 143 to make regulations covering imprints on other types of material, including online material. New clause 22 would therefore not extend the type of online material covered by such regulations. We therefore believe the new clause is unnecessary. The law already includes printed election material disseminated through the use of personal data gathered by whatever means, and the Government will provide further clarity on extending those rules to online material in due course by consulting on making regulations under the power in section 143(6).
On that basis, I ask the right hon. Gentleman to withdraw his new clause.
That is a deeply disappointing answer. I was under the impression that the Secretary of State said in interviews today that he is open-minded about the UK version of the Honest Ads Act that we propose. That appears to be in some contrast to the answer that the Minister offered.
What this country has today is an Advertising Standards Authority that does not regulate political advertising; Ofcom, which does not regulate video when it is online; an Electoral Commission without the power to investigate digital campaigning; and an Information Commissioner who cannot get a search warrant. Worse, we have a Financial Conduct Authority that, because it does not have a data sharing gateway with the Electoral Commission, cannot share information about the financial background of companies that might have been laundering money going into political and referendum campaigns. The law is hopelessly inadequate. Through that great hole, our enemies are driving a coach and horses, which is having a huge impact on the health and wellbeing of our democracy.
That is not a day-to-day concern in Labour constituencies, but it is for the Conservative party. Voter Consultancy Ltd took out targeted dark social ads aimed at Conservative Members, accusing some of them of being Brexit mutineers when they had the temerity to vote for common sense in a vote on Brexit in this House. Voter Consultancy Ltd, for those who have not studied its financial records at Companies House, as I have, is a dormant company. It has no accounts filed. There is no cash flowing through the books. The question that provokes is: where does the money come from for the dark social ads attacking Conservative Members? We do not know. It is a matter of public concern that we should.
The law is out of date and needs to be updated. I will not press the matter to a vote this afternoon because I hope to return to it on Report, but I hope that between now and then the Minister and the Secretary of State reflect on the argument and talk to Mark Sedwill, the National Security Adviser, about why the national security strategy does not include an explicit objective to defend the integrity of our democracy. I hope that that change is made and that, as a consequence, further amendments will be tabled to ensure that our democracy is protected against the threats we know are out there.
I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
Question proposed, That the Chair do report the Bill, as amended, to the House.
On a point of order, Mr Streeter. I wanted to thank you, and Mr Hanson in his absence, as well as, in the House of Lords, my noble Friends Lord Ashton, Baroness Williams, Lord Keen, Baroness Chisholm and Lord Young, and the Opposition and Cross-Bench peers. I also thank the Under-Secretary of State for the Home Department, my hon. Friend the Member for Louth and Horncastle, and the Opposition Front Bench Members—the right hon. Member for Birmingham, Hodge Hill, with whom it has been a pleasure debating in the past two weeks, and the hon. Member for Sheffield, Heeley, who was not able to be in her place this afternoon.
I offer great thanks to both Whips. It was the first Bill Committee for my hon. Friend the Member for Selby and Ainsty in his capacity as Whip, and my first as Minister, and it has been a pleasure to work with him. I also thank the hon. Member for Ogmore. My hon. Friend the Under-Secretary and I are grateful to our Parliamentary Private Secretary, my hon. Friend the Member for Mid Worcestershire, who has worked terribly hard throughout the proceedings, as indeed have the Clerks, the Hansard writers, the Doorkeepers and the police. Without the officials of my Department and, indeed, the Home Office, we would all have been bereft, and I am most grateful to all the officials.
Question put and agreed to.
Bill, as amended, accordingly to be reported.
(6 years, 8 months ago)
Public Bill CommitteesWe begin consideration of the Bill today with schedule 9, to which no amendments have been tabled.
Schedule 9 agreed to.
Schedule 10
Conditions for sensitive processing under Part 4
Amendment made: 117, in schedule 10, page 187, line 5, at end insert—
‘Safeguarding of children and of individuals at risk
3A (1) This condition is met if—
(a) the processing is necessary for the purposes of—
(i) protecting an individual from neglect or physical, mental or emotional harm, or
(ii) protecting the physical, mental or emotional well-being of an individual,
(b) the individual is—
(i) aged under 18, or
(ii) aged 18 or over and at risk,
(c) the processing is carried out without the consent of the data subject for one of the reasons listed in sub-paragraph (2), and
(d) the processing is necessary for reasons of substantial public interest.
(2) The reasons mentioned in sub-paragraph (1)(c) are—
(a) in the circumstances, consent to the processing cannot be given by the data subject;
(b) in the circumstances, the controller cannot reasonably be expected to obtain the consent of the data subject to the processing;
(c) the processing must be carried out without the consent of the data subject because obtaining the consent of the data subject would prejudice the provision of the protection mentioned in sub-paragraph (1)(a).
(3) For the purposes of this paragraph, an individual aged 18 or over is “at risk” if the controller has reasonable cause to suspect that the individual—
(a) has needs for care and support,
(b) is experiencing, or at risk of, neglect or physical, mental or emotional harm, and
(c) as a result of those needs is unable to protect himself or herself against the neglect or harm or the risk of it.
(4) In sub-paragraph (1)(a), the reference to the protection of an individual or of the well-being of an individual includes both protection relating to a particular individual and protection relating to a type of individual.’—(Victoria Atkins.)
Schedule 10 makes provision about the circumstances in which the processing of special categories of personal data is permitted. This amendment adds to that Schedule certain processing of personal data which is necessary for the protection of children or of adults at risk. See also Amendments 85 and 116.
Schedule 10, as amended, agreed to.
Clauses 87 to 93 ordered to stand part of the Bill.
Clause 94
Right of access
Amendments made: 35, in clause 94, page 55, line 8, leave out ‘day’ and insert ‘time’
This amendment is consequential on Amendment 71.
36, in clause 94, page 55, line 9, leave out ‘day’ and insert ‘time’
This amendment is consequential on Amendment 71.
37, in clause 94, page 55, line 10, leave out ‘days’
This amendment is consequential on Amendment 71.
38, in clause 94, page 55, line 11, leave out ‘the day on which’ and insert ‘when’
This amendment is consequential on Amendment 71.
39, in clause 94, page 55, line 12, leave out ‘the day on which’ and insert ‘when’
This amendment is consequential on Amendment 71.
40, in clause 94, page 55, line 13, leave out ‘the day on which’ and insert ‘when’ —(Victoria Atkins.)
This amendment is consequential on Amendment 71.
Clause 94, as amended, ordered to stand part of the Bill.
Clause 95 ordered to stand part of the Bill.
Clause 96
Right not to be subject to automated decision-making
Question proposed, That the clause stand part of the Bill.
We are rattling through the Bill this morning and will soon reach clause 109, to which we have tabled some amendments. Clause 96, within chapter 3 of part 4, on intelligence services processing, touches on the right not to be subject to automated decision making. I do not want to rehearse the debate that we shall have later, but I think that this is the appropriate point for an explanation from the Minister. Perhaps she will say something about the kind of administration that the clause covers, and its relationship, if any—there may not be one, but it is important to test that question—to automated data-gathering by our intelligence services abroad, and the processing and use of that data.
The specific instance that I want to take up concerns the fact that about 700 British citizens have gone to fight in foreign conflicts—for ISIS in particular. The battery of intelligence-gathering facilities that we have allows us to use remote data-sensing to detect, track and monitor them, and to assemble pictures of their patterns of life and behaviour. It is then possible for our intelligence services to do stuff with those data and patterns, such as transfer them to the military or to foreign militaries in coalitions of which we are a member. For the benefit of the Committee, will the Minister spell out whether the clause, and potentially clause 97, will bite on that kind of capability? If not, where are they aimed?
An intelligence services example under clause 96 would be a case where the intelligence services wanted to identify a subject of interest who might have travelled to Syria in a certain time window and where the initial selector was age, because there was reliable reporting that the person being sought was a certain age. The application of the age selector would produce a pool of results, and a decision may be taken to select that pool for further processing operations, including the application of other selectors. That processing would be the result of a decision taken solely on the basis of automated processing.
I do not think the clause actually says anything about age selection. How do we set boundaries around the clause? Let us say that minors—people under the age of 18—want to travel to Syria or some other war zone. Is the Minister basically saying that the clause will bite on that kind of information and lead to a decision chain that results in action to intervene? If that is the case, will she say a little more about the boundaries around the use of the clause?
The right hon. Gentleman asked me for an example and I provided one. Age is not in the clause because the Government do not seek in any way to create burdens for the security services when they are trying to use data to protect this country. Given his considerable experience in the Home Office, he knows that it would be very peculiar, frankly, for age to be listed specifically in the clause. The clause is drafted as it is, and I remind him that it complies with Council of Europe convention 108, which is an international agreement.
The point is that the clause does create a burden. It does not detract from a burden; it creates an obligation on intelligence services to ensure that there is not automatic decision making. We seek not to add burdens, but to question why the Minister is creating them.
The clause complies with Council of Europe convention 108. I do not know whether I can say any more.
I think we have come to a natural conclusion.
Question put and agreed to.
Clause 96 accordingly ordered to stand part of the Bill.
Clause 97
Right to intervene in automated decision-making
Amendments made: 41, in clause 97, page 56, line 34, leave out “21 days” and insert “1 month”.
Clause 97(4) provides that where a controller notifies a data subject under Clause 97(3) that the controller has taken a decision falling under Clause 97(1) (automated decisions required or authorised by law), the data subject has 21 days to request the controller to reconsider or take a new decision not based solely on automated processing. This amendment extends that period to one month.
Amendment 42, in clause 97, page 56, line 39, leave out “21 days” and insert “1 month”.—(Victoria Atkins.)
Clause 97(5) provides that where a data subject makes a request to a controller under Clause 97(4) to reconsider or retake a decision based solely on automated processing, the controller has 21 days to respond. This amendment extends that period to one month.
Clause 97, as amended, ordered to stand part of the Bill.
Clause 98
Right to information about decision-making
Question proposed, That the clause stand part of the Bill.
This is a vexed and difficult area. The subject of the clause is the right to information about decision making, which is very difficult when it comes to the intelligence services, and I have had experiences, as have others I am sure, of constituents who come along to an advice bureau and claim to have been subject either to intelligence services investigation or, in some cases, to intelligence services trying to recruit them. Sometimes—this is not unknown—an individual’s immigration status might be suspect. I had one of these cases about five or six years ago, where the allegation was that the intelligence services were conspiring with the UK Border Agency and what at that time was the Identity and Passport Service to withhold immigration documents to encourage the individual to become a source. The challenge for Members of Parliament trying to represent such individuals is that they will get a one-line response when they write to the relevant officials to say, “I am seeking to represent my constituent on this point.”
A right to information about decision-making will be created under clause 98. I ask the Minister, therefore, when dealing with very sensitive information, how is this right going to be exercised and who is going to be the judge of whether that right has been fulfilled satisfactorily? There is no point approving legislation that is superfluous because it will have no effect in the real world. The clause creates what looks like a powerful new right for individuals to request information about decisions taken by the intelligence agencies, which might have a bearing on all sorts of things in their lives. Will the Minister explain how, in practice, this right is to become a reality?
If I may give an example, where a terrorist suspect is arrested and believes he is the subject of MI5 surveillance, revealing to them whether they were under surveillance and the process by which the suspect was identified as a potential terrorist would clearly aid other terrorists in avoiding detection. The exercise of the right is subject to the operation of the national security exemption, which was debated at length last week. It might be that, in an individual case, the intelligence services need to operate the “neither confirm nor deny” principle, and that is why the clause is drafted as it is.
The clause is drafted in the opposite way. Subsection (1)(b) says that
“the data subject is entitled to obtain from the controller, on request, knowledge of the reasoning underlying the processing.”
In other words, the data subject—in this case, the individual under surveillance—has the right to obtain from the controller, in the hon. Lady’s example of the intelligence agencies, knowledge of the reasoning underlying the way their data was processed.
Let us take, for example, a situation where CCTV footage was being captured at an airport or a border crossing and that footage was being run through facial recognition software, enabling special branch officers to intervene and intercept that individual before they crossed the border. That is an example of where information is captured and processed, and action then results in an individual, in this case, being prevented from coming into the country.
I have often had cases of constituents who have come back from Pakistan or who might have transitioned through the middle east, perhaps Dubai, and they have been stopped at Birmingham airport because special branch officers have said their name is on a watch list. Watch lists are imperfect—that is probably a fairly good description. They are not necessarily based on the most reliable and up-to-date information, but advances in technology allow a much broader and more wide-ranging kind of interception to take place at the border. If we are relying not on swiping someone’s passport and getting a red flag on a watch list but on processing data coming in through CCTV and running it through facial recognition software, that is a powerful new tool in the hands of the intelligence agencies. Subsection (1)(b) will give one of my constituents the right to file a request with the data controller—presumably, the security services—and say, “Look, I think your records are wrong here. You have stopped me on the basis of facial recognition software at Birmingham airport; I want to know the reasoning behind the processing of the data.”
If, as the Minister says, the response from the data controller is, “We can neither confirm nor deny what happened in this case,” then, frankly, the clause is pretty nugatory. Will the Minister give an example of how the right is going to be made a reality? What are the scenarios in which a constituent might be able to exercise this right? I am not interested in the conventions and international agreements this happy clause tends to agree with, but I would like to hear a case study of how a constituent could exercise this right successfully.
The right hon. Gentleman says he is not interested in conventions and so on, but I am afraid that is the legal framework within which Parliament and this country have to act. The clause confers—as do the other clauses in chapter 3—rights upon citizens, but those rights are subject, as they must be, to the national security exemption set out in chapter 6, clause 110.
I am slightly at a loss as to where the right hon. Gentleman wishes to go with this. I am not going to stand here and dream up scenarios that may apply. The rights and the national security exemption are set out in the Bill; that is the framework we are looking at, and that is the framework within which the security services must operate. Of course one has a duty to one’s constituents, but that is balanced with a duty to one’s country. This is precisely the section of the Bill that is about the balance between the rights of our citizens and the absolute necessity for our security services to protect us and act in our interests when they are required to do so.
I am not asking the Minister to dream up a scenario in Committee. All good Ministers understand every single dimension of a clause they are required to take through the House before they come anywhere near a Committee, because they are the Bill Minister.
We are not debating here whether the security services have sufficient power; we had that debate earlier. We are talking about a power and a right that are conferred on data subjects under subsection (1)(b). I am slightly concerned that the Minister, who is responsible for this Bill and this matter of policy, has not been able to give us a well-rehearsed scenario, which presumably she and her officials will have considered before the Bill came anywhere near to being drafted. How will this right actually be exercised by our constituents? It could be that the Committee decides, for example, that the rights we are conferring on the data subject are too sweeping. We might be concerned that there are insufficient safeguards in place for the intelligence agencies to do their jobs. This is a specific question about how data subjects, under the clause, are going to exercise their power in a way that allows the security services to do their job. That is not a complicated request; it is a basic question.
As I say, the framework is set out in the Bill, and the exemption exists in the Bill itself. I have already given an example about a terror suspect. With respect, I am not going to enter into this debate about the right hon. Gentleman’s constituent—what he or she might have requested, and so on. The framework is there; the right is there, balanced with the national security exemption. I am not sure there is much more I can add.
The Minister says she does not want to enter into a debate. I kindly remind her that she is in a debate. The debate is called—
Order. We have a point of order—which, in due course, the good offices of Hansard will resolve—as to what was said by the right hon. Gentleman and how the Minister interpreted it. At the moment, we are dealing with clause 98 and Mr Liam Byrne has the floor. As he wishes, he can give way or continue.
I am grateful, Mr Hanson, for that complete clarity. This is the debate that we are having today: how will clause 98(1)(b) become a reality? It creates quite powerful rights for a data subject to seek information from the intelligence agencies. I gave an example from my constituency experience of how the exercise of this right could run into problems.
All I ask of the Minister responsible for the Bill and this area of policy, who has thought through the Bill with her officials and is asking the Committee to agree the power she is seeking to confer on our constituents, and who will have to operate the policy in the real world after the Bill receives Royal Assent, is that she give us a scenario of how the rights she is conferring on a data subject will function in the real world.
However, Mr Hanson, I think we might have exhausted this debate. It is disappointing that the Minister has not been able to come up with a scenario. Perhaps she would like to intervene now to give me an example.
Part 4 sets out a number of rights of data subjects, clause 98 being just one of them. This part of the Bill reflects the provisions of draft modernised convention 108, which is an international agreement, and the Bill faithfully gives effect to those provisions. A data subject wishing to exercise the right under clause 98 may write to that effect to the Security Service, which will then either respond in accordance with clause 98 or exercise the national security exemption in clause 110. That is the framework.
That is probably about as much reassurance as the Committee is going to get this afternoon. It is not especially satisfactory or illuminating, but we will not stand in the way and we will leave the debate there, Mr Hanson.
This might seem like a long day, but it is still morning. On that note, we will proceed.
Question put and agreed to.
Clause 98 accordingly ordered to stand part of the Bill.
Clause 99
Right to object to processing
Amendments made: 43, in clause 99, page 57, line 28, leave out “day” and insert “time”.
This amendment is consequential on Amendment 71.
44, in clause 99, page 58, line 3, leave out “day” and insert “time”.
This amendment is consequential on Amendment 71.
45, in clause 99, page 58, line 5, leave out “the day on which” and insert “when”.
This amendment is consequential on Amendment 71.
46, in clause 99, page 58, line 6, leave out “the day on which” and insert “when”.—(Victoria Atkins.)
This amendment is consequential on Amendment 71.
Clause 99, as amended, ordered to stand part of the Bill.
Clauses 100 to 108 ordered to stand part of the Bill.
Clause 109
Transfers of personal data outside the United Kingdom
I beg to move amendment 159, in clause 109, page 61, line 13, after “is” insert “provided by law and is”.
This amendment would place meaningful safeguards on the sharing of data by the intelligence agencies.
With this it will be convenient to discuss the following:
Amendment 160, in clause 109, page 61, line 18, at end insert—
‘(3) The transfer falls within this subsection if the transfer—
(a) is based on an adequacy decision (see section 74),
(b) if not based on an adequacy decision, is based on there being appropriate safeguards (see section 75), or
(c) if not based on an adequacy decision or on there being appropriate safeguards, is based on special circumstances (see section 76 as amended by subsection (5)).
(4) A transfer falls within this subsection if—
(a) the intended recipient is a person based in a third country that has (in that country) functions comparable to those of the controller or an international organisation, and
(b) the transfer meets the following conditions—
(i) the transfer is strictly necessary in a specific case for the performance of a task of the transferring controller as provided by law or for the purposes set out in subsection (2),
(ii) the transferring controller has determined that there are no fundamental rights and freedoms of the data subject concerned that override the public interest necessitating the transfer,
(iii) the transferring controller informs the intended recipient of the specific purpose or purposes for which the personal data may, so far as necessary, be processed, and
(iv) the transferring controller documents any transfer and informs the Commissioner about the transfer on request.
(5) The reference to law enforcement purposes in subsection (4) of section 76 is to be read as a reference to the purposes set out in subsection (2).”
New clause 14—Subsequent transfers—
‘(1) Where personal data is transferred in accordance with section 109, the transferring controller must make it a condition of the transfer that the data is not to be further transferred to a third country or international organisation without the authorisation of the transferring controller.
(2) A transferring controller may give an authorisation under subsection (1) only where the further transfer is necessary for the purposes in subsection (2).
(3) In deciding whether to give the authorisation, the transferring controller must take into account (among any other relevant factors)—
(a) the seriousness of the circumstances leading to the request for authorisation,
(b) the purpose for which the personal data was originally transferred, and
(c) the standards for the protection of personal data that apply in the third country or international organisation to which the personal data would be transferred.’
This new clause would place meaningful safeguards on the sharing of data by the intelligence agencies.
I rise to speak to amendments 159 and 160, which relate to two significant developments in defence policy that have unfolded over the past couple of years. Our intelligence agencies have acquired pretty substantial new capabilities through all kinds of technological advances, which allow them remotely to collect and process data in a completely new way.
It is now possible, through satellite technology and drones, to collect video footage of battle zones and run the information collected through facial recognition software, which allows us to track much more forensically and accurately the movement, habits, working lives and leisure of bad people in bad places. We are fighting against organisations such as Daesh, in a coalition with allies, but over the past year one of our allies has rather changed the rules of engagement, which allows it to take drone strikes with a different kind of flexibility from that under the Obama regime.
The change in the American rules of engagement means that, on the one hand, the American Administration has dramatically increased the number of drone strikes—in Yemen, we have had an increase of about 288% in the past year—and, on the other, as we see in other theatres of conflict such as the war against al-Shabaab in Africa, repeated strikes are allowed for. Therefore, even when the circumstances around particular individuals have changed—new intelligence may have come to light about them—the Trump Administration have basically removed the safeguards that President Obama had in place that require an individual to be a “continuing and imminent threat” before a strike is authorised. That safeguard has been lifted, so the target pool that American forces can take aim at and engage is now much larger, and operational commanders have a great deal more flexibility over when they can strike.
We now see some of the consequences of that policy, with the most alarming statistics being on the number of civilians caught up in some of those strikes. That is true in Yemen and in the fight against al-Shabaab, and I suspect it is true in Syria, Afghanistan and, in some cases, Pakistan. We must ensure that the data sharing regime under which our intelligence agencies operate does not create a legal threat to them because of the way the rules of engagement of one of our allies have changed.
The Joint Committee on Human Rights has talked about that, and it has been the subject of debates elsewhere in Parliament. The JCHR concluded in its 2016 report that
“we owe it to all those involved in the chain of command for such uses of lethal force—intelligence personnel, armed services personnel, officials, Ministers and others—to provide them with absolute clarity about the circumstances in which they will have a defence against any possible future criminal prosecution, including those which might originate from outside the UK.”
We need to reflect on some of those legal risks to individuals who are serving their country. The amendment would ensure that—where there was a collection, processing and transfer of information by the UK intelligence services to one of our allies, principally America, and they ran that information against what is widely reported as a kill list and ordered drone strikes without some of the safeguards operated by previous Administrations—first, the decision taken by the intelligence agency here to share that information was legal and, secondly, it would be undertaken in a way that ensured that our serving personnel were not subject to legal threats or concerns about legal threats.
Does the right hon. Gentleman agree that the legal framework that we rightly expect to apply to our law enforcement offers and agencies does not necessarily apply directly to our intelligence and security services? That, however, would be the effect of the amendment.
I am not sure that that would be the effect of the amendment. While I agree with the thrust of the hon. Gentleman’s argument, I am cognisant of the fact that in 2013 the Court of the Appeal said that it was “certainly not clear” that UK personnel would be immune from criminal liability for their involvement in a programme that entailed the transfer of information to America and a drone strike ordered using that information, without the same kinds of safeguard that the Obama Administration had. The amendment would ensure a measure—nothing stronger than that—of judicial oversight where such decisions were taken and where information was transferred. We must ensure a level of judicial oversight so that inappropriate decisions are not taken. It is sad that we need such a measure, but it reflects two significant changes over the past year or two: first, the dramatic increase in our ability to capture and process information, and, secondly, the crucial change in the rules of engagement under the Trump Administration.
The right hon. Gentleman is being kind and generous with his time. He says that the amendments would not replicate the frameworks for law enforcement, yet amendment 160 would do exactly that by applying clauses 74, 75 and 76 to the test for data sharing for intelligence and security services. Those exact safeguards were designed for law enforcement, not for intelligence and security sharing.
The point for the Committee is that the thrust of the amendment is not unreasonable. Where there is a multiplication of the power of intelligence agencies to capture and process data, it is not unreasonable to ask for that greater power to bring with it greater scrutiny and safeguards. The case for this sensible and cautious amendment is sharpened because of the change in the rules of engagement operated by the United States. No member of the Committee wants a situation where information is transferred to an ally, and that ally takes a decision that dramatically affects the human rights of an individual—as in, it ends those rights by killing that person. That is not something that we necessarily want to facilitate.
As has been said, we are conscious of the difficulty and care with which our politicians have sometimes had to take such decisions. The former Prime Minister very sensibly came to the House to speak about his decision to authorise a drone strike to kill two British citizens whom he said were actively engaged in conspiring to commit mass murder in the United Kingdom. His judgment was that those individuals posed an imminent threat, but because they were not operating in a place where the rule of law was operational, there was no possibility to send in the cops, arrest them and bring them to trial.
The Prime Minister was therefore out of options, but the care that he took when taking that decision and the level of legal advice that he relied on were extremely high. I do not think any member of the Committee is confident that the care taken by David Cameron when he made that decision is replicated in President Trump’s White House.
We must genuinely be concerned and cautious about our intelligence agencies transferring information that is then misused and results in drone strikes that kill individuals, without the safeguards we would expect. The last thing anyone would want is a blowback, in either an American or a British court, on serving officers in our military or intelligence services because the requisite safeguards simply were not in place.
My appeal to the Committee is that this is a point of principle: enhanced power should bring with it enhanced oversight and surveillance, and the priority for that is the fact that the rules of engagement for the United States have changed. If there is a wiser way in which we can create the kinds of safeguard included in the amendment we will be all ears, but we in the House of Commons cannot allow the situation to go unchecked. It is too dangerous and too risky, and it poses too fundamental a challenge to the human rights that this place was set up to champion and protect.
Before I start, I want to clarify what the hon. Gentleman has just said about adequacy decisions. Canada does have an adequacy decision from the EU for transfers to commercial organisations that are subject to the Canadian Personal Information Protection and Electronic Documents Act. I am not sure that security services are covered in that adequacy decision, but it may be that we will get assistance elsewhere.
As the right hon. Member for Birmingham, Hodge Hill is aware, amendments 159, 160 and new clause 14 were proposed by a campaigning organisation called Reprieve in its recent briefing on the Bill. They relate to concerns about the sharing of personal data with the US and seek to apply the data sharing protections designed specifically for law enforcement data processing, provided for in part 3 of the Bill, to processing by the intelligence services, provided for in part 4. That is, they are seeking to transpose all the law enforcement measures into the security services. However, such safeguards are clearly not designed for, and do not provide, an appropriate or proportionate basis for the unique nature of intelligence services processing, which we are clear is outside the scope of EU law.
Before I get into the detail of these amendments, it is important to put on record that the international transfer of personal data is vital to the intelligence services’ ability to counter threats to national security. Provision of data to international partners bolsters their ability to counter threats to their security and that of the UK. In a globalised world, threats are not necessarily contained within one country, and the UK cannot work in isolation. As terrorists do not view national borders as a limit to their activities, the intelligence services must be in a position to operate across borders and share information quickly—for example, about the nature of the threat that an individual poses—to protect the UK.
In the vast majority of cases, intelligence sharing takes place with countries with which the intelligence services have long-standing and well-established relationships. In all cases, however, the intelligence services apply robust necessity and proportionality tests before sharing any information. The inherent risk of sharing information must be balanced against the risk to national security of not sharing such information.
Will the Minister tell us more about the oversight and scrutiny for the tests that she has just set out that the intelligence services operate? Perhaps she will come on to that.
I am coming on to that.
Any cross-border sharing of personal data must be consistent with our international obligations and be subject to appropriate safeguards. On the first point, the provisions in clause 109 are entirely consistent with the requirements of the draft modernised Council of Europe data protection convention—convention 108—on which the preventions of part 4 are based. It is pending international agreement.
The provisions in the convention are designed to provide the necessary protection for personal data in the context of national security. The Bill already provides that the intelligence services can make transfers outside the UK only when necessary and proportionate for the limited purposes of the services’ statutory functions, which include the protection of national security; for the purpose of preventing or detecting serious crime; or for the purpose of criminal proceedings.
In addition, on the point the right hon. Gentleman just raised, the intelligence services are already under statutory obligations in the Security Service Act 1989 and the Intelligence Services Act 1994 to ensure that no information is disclosed except so far as is necessary for those functions or purposes. All actions by the intelligence services, as with all other UK public authorities, must comply with international law.
It is absolutely vital. What is more, not only is there a framework in the Bill for overseeing the work of the intelligence services, but we have the added safeguards of the other legislation that I set out. The burden on the security services and the thresholds they have to meet are very clear, and they are set out not just in the Bill but in other statutes.
I hope that I have provided reassurance that international transfers of personal data by the intelligence services are appropriately regulated both by the Bill, which, as I said, is entirely consistent with draft modernised convention 108 of the Council of Europe—that is important, because it is the international agreement that will potentially underpin the Bill and agreements with our partners and sets out agreed international standards in this area—and by other legislation, including the 2016 Act. We and the intelligence services are absolutely clear that to attempt to impose, through these amendments, a regime that was specifically not designed to apply to processing by the intelligence services would be disproportionate and may critically damage national security.
I am sure that it is not the intention of the right hon. Member for Birmingham, Hodge Hill to place unnecessary and burdensome obstacles in the way of the intelligence services in performing their crucial function of safeguarding national security, but, sadly, that is what his amendments would do. I therefore invite him to withdraw them.
I am grateful to the Minister for that explanation and for setting out with such clarity the regime of oversight and scrutiny that is currently in place. However, I have a couple of challenges.
I was slightly surprised that the Minister said nothing about the additional risks created by the change in rules of engagement by the United States. She rested some of her argument on the Security Services Act 1989 and the Intelligence Services Act 1994, which, as she said, require that any transfers of information are lawful and proportionate. That creates a complicated set of ambiguities for serving frontline intelligence officers, who have to make fine judgments and, in drafting codes of practice, often look at debates such as this one and at the law. However, the law is what we are debating. Where the Bill changed the law to create a degree of flexibility, it would create a new risk, and that risk would be heightened by the change in the rules of engagement by one of our allies.
The Minister may therefore want to reflect on a couple of points. First, what debate has there been about codes of practice? Have they changed given the increased surveillance capacity that we have because of the development of our capabilities? How have they changed in the light of the new rules of engagement issued by President Trump?
The right hon. Gentleman is being generous in giving way. I am listening carefully to what he says. I am concerned that he seems to be inviting us to make law in this country based almost solely on the policies of the current US Administration. I do not understand why we would do that.
The reason we would do that is that there has been an exponential increase in drone strikes by President Trump’s Administration and, as a result, a significant increase in civilian deaths in Pakistan, Afghanistan, Syria and Iraq, Yemen and east Africa. It would be pretty odd for us not to ensure that a piece of legislation had appropriate safeguards, given what we now know about the ambition of one of our most important allies to create flexibility in rules of engagement.
I agree with the right hon. Gentleman on that point, but is not the more important point that our legislation cannot be contingent on that of any other country, however important an ally it is? Our legislation has to stand on its own two feet, and we should seek to ensure that it does. To change something, as he attempts to, purely on the basis of changes over the past couple of years would set a dangerous precedent rather than guard against a potential pitfall.
The hon. Gentleman makes a good point, and he is right to say that our legislation has to stand on its own two feet. It absolutely has to, and what is more, it has to be fit for the world in which we live today, which I am afraid has two significant changes afoot. One is a transformation in the power of our intelligence agencies to collect and process data, and in my view that significant advance is enough to require a change in the level of oversight, and potentially a judicial test for the way we share information. As it happens—I was careful to say this—the risk and necessity of that change is merely heightened by the fact that the rules of engagement with one of our most important allies have changed, and that has had real-world consequences. Those consequences create a heightened threat of legal challenge in foreign and indeed domestic courts to our serving personnel.
For some time, our defence philosophy has been—very wisely—that we cannot keep our country safe by defending from the goal line, and on occasion we have to intervene abroad. That is why in my view Prime Minister Cameron took the right decision to authorise lethal strikes against two British citizens. He was concerned first that there was an imminent threat, and secondly that there was no other means of stopping them. Those important tests and safeguards are not operated by our allies.
The change to the American rules of engagement, which allow a strike against someone who is no longer a “continuing and imminent threat”, means that one of our allies now operates under completely different rules of engagement to those set out before the House of Commons by Prime Minister David Cameron, which I think met with some degree of approval. If we are to continue to operate safely a policy of not defending from the goal line, if we are to protect our ability to work with allies and—where necessary and in accordance with international law—to take action abroad, and if we are to continue the vital business of safely sharing information with our allies in the Five Eyes network, a degree of extra reassurance should be built into legislation to ensure that it is fit for the future.
I am confused. Is the right hon. Gentleman suggesting that the actions by Americans, based on the data sharing, which we know is run with international safeguards, could have legal consequences for our personnel in the intelligence agencies serving here?
Yes, and it is not just me—the Court of Appeal is arguing that. The Court of Appeal’s summary in 2013 was that there was a risky legal ambiguity. Its conclusion that it is certainly not clear that UK personnel are immune from criminal liability for their involvement in these programmes is a concern for us all. The Joint Committee on Human Rights reflected on that in 2016, and it concluded pretty much the same thing:
“In our view, we owe it to all those involved in the chain of command for such uses of lethal force…to provide them with absolute clarity about the circumstances in which they will have a defence against any possible future criminal prosecution, including those which might originate from outside the UK.”
This is not a theoretical legal threat to our armed forces and intelligence agencies; this is something that the Court of Appeal and the Joint Committee on Human Rights have expressed worries about.
The new powers and capabilities of our intelligence agencies arguably create the need for greater levels of oversight. This is a pressing need because of the operational policy of one of our allies. We owe it to our armed forces and intelligence agencies to ensure a regime in which they can take clear, unambiguous judgments where possible, and where they are, beyond doubt, safe from future legal challenge. It is not clear to me that the safeguards that the Minister has set out meet those tests.
Perhaps the Minister will clarify one outstanding matter, about convention 108, on which she rested much of her argument. Convention 108 is important. It was written in 1981. The Minister told the Committee that it had been modernised, but also said that that was in draft. I should be grateful for clarification of whether the United Kingdom has signed and is therefore bound by a modernised convention that is currently draft.
I am happy to clarify that. Convention 108 is in the process of being modernised by international partners. I have made it clear, last week and this week, that the version in question is modernised, and is a draft version; but it is the one to which we are committed, not least because the Bill reflects its provisions. Convention 108 is an international agreement and sets the international standards, which is precisely why we are incorporating those standards into the Bill.
I know that the Leader of Her Majesty’s Opposition appears to be stepping away from the international community, over the most recent matters to do with Russia, but the Bill and convention—[Interruption.] Well, he is. However, convention 108 is about stepping alongside our international partners, agreeing international standards and putting the thresholds into legislation. The right hon. Gentleman keeps talking about the need for legislation fit for the world we live in today; that is precisely what convention 108 is about.
Order. The right hon. Member for Birmingham, Hodge Hill indicates that this is an intervention. I thought he had sat down and wanted the Minister to respond. However, if it is an intervention, it is far too long.
I am grateful. Some of us in this House have been making the argument about the risk from Russia for months, and the permissive environment that has allowed the threats to multiply is, I am afraid, the product of much of the inattention of the past seven years.
On the specific point about convention 108, I am glad that the Minister has been able to clarify the fact that it is not operational.
I will give way to the Minister in a moment. The convention was written in 1981. Many people in the Government have argued in the past that we should withdraw not only from the European Union but from the European convention on human rights and therefore also the Council of Europe.
I did not say it was Government policy. I said that there are people within the Administration, including the Secretary of State for Environment, Food and Rural Affairs, who have made the argument for a British Bill of Rights that would remove Britain from the European convention on human rights and, therefore, the Council of Europe. I very much hope that that ambiguity has been settled and that the policy of the current Government will remain that of the Conservative party from now until kingdom come; but the key point for the Committee is that convention 108 is in draft. The modernisation is in draft and is not yet signed. We have heard an express commitment from the Minister to the signing of the thing when it is finalised. We hope that she will remain in her position, to ensure that that will continue to be Government policy; but the modernised version that has been drafted is not yet a convention.
Does my right hon. Friend recognise that the modernisation process started in 2009, with rapporteurs including one of our former colleagues, Lord Prescott? When a process has taken quite so many years and the document is still in draft, it raises the question of how modern the modernisation is.
Some members of the Committee—I am one of them—have been members of the Parliamentary Assembly of the Council of Europe for some time. We know how the Council of Europe works. It is not rapid: it likes to take its time deliberating on things. The Minister may correct me, but I do not think that there is a deadline for the finalisation of the draft convention. So, to ensure that the Government remain absolutely focused on the subject, we will put the amendment to a vote.
Question put, That the amendment be made.
I am ill-qualified to answer the hon. Gentleman’s question. Hypothetically, it would probably make it more difficult, but that is not our purpose in objecting to clause 121, which we do not see as being consistent with the role of the Information Commissioner, for the reasons I set out. However, he raises an interesting question.
I agree with Lord Mitchell that the issues that surround data protection policy, particularly with regard to NHS patient data, deserve proper attention both by the Government and by the National Data Guardian for Health and Care, but we have not yet established that there is any evidence of a problem to which his provisions are the answer. We are not sitting on our laurels. As I have already said, NHS England and the Department of Health and Social Care are working to ensure that they understand the value of their data assets. Further work on the Government’s digital charter will also explore this issue. When my right hon. friend the Prime Minister launched the digital charter on 25 January, she made it clear that we will set out principles on the use of personal data.
Amendment 122 removes Lord Mitchell’s amendment from schedule 13. We do this because it is the wrong tool; however, we commit to doing everything we can to ensure that we further explore the issue and find the right tools if needed. [Interruption.] I have just received advice that the amendments will make no difference in relation to the hon. Gentleman’s question, because anonymised data is not personal data.
I commend amendment 122 and give notice that the Government will oppose the motion that clause 121 stand part of the Bill.
I am grateful that the Minister made time to meet my former noble Friend Lord Mitchell. These are important amendments and it is worth setting out the background to why Lord Mitchell moved them and why we give such priority to them.
In 2009-10, we began to have a debate in government about the right approach to those agencies which happen to sit on an enormous amount of important data. The Government operate about 200 to 250 agencies, and some are blessed with data assets that are more valuable than those of others—for example, the Land Registry or Companies House sit on vast quantities of incredibly valuable transactional data, whereas other agencies, such as the Meteorological Office, the Hydrographic Office and Ordnance Survey, sit on sometimes quite static data which is of value. Some of the most successful American companies are based on Government data—for example, The Weather Channel is one of the most valuable and is based on data issued from, I think, the US meteorological survey. A number of Government agencies are sitting on very valuable pots of data.
The debate that we began to rehearse nearly 10 years ago was whether the right strategy was to create public-private partnerships around those agencies, or whether more value would be created for the UK economy by simply releasing that data into the public domain. I had the great pleasure of being Chief Secretary to the Treasury and the Minister for public service reform. While the strong advice inside the Treasury was that it was better to create public-private partnerships because that would release an equity yield up front, which could be used for debt reduction, it was also quite clear to officials in the Cabinet Office and those interested in public service reform more generally that the release of free data would be much more valuable. That is the side of the argument on which we came down.
After the White Paper, “Smarter Government”, that I brought to the House, we began the release of very significant batches of data. We were guided by the arguments of Tim Berners-Lee and Professor Nigel Shadbolt, who were advising us at the time, that this was the right approach and it was very good to see the Government continue with that.
There are still huge data pots locked up in Government which could do with releasing, but the way in which we release them has to have an eye on the way we create value for taxpayers more generally. Beyond doubt, the area of public policy and public operations where we have data that is of the most value is health. The way in which, in the United States, Apple and other companies have now moved into personal health technology in a substantial way betrays the reality that this is going to be a hugely valuable and important market in years to come. If we look at the US venture industry we can see significant investment now going into health technology companies.
The Minister is very generous. From that vantage point in the City, I was able to watch the level of ingenuity, creativity and innovation that was unlocked simply by the Government telling the world, “Here are the assets that are in public hands.” All sorts of ideas were floated for using those assets in a way that was better for taxpayers and public service delivery.
To the best of my knowledge, we do not have a similar data catalogue today. What Lord Mitchell is asking is for Ministers to do some work and create one. They can outsource that task to the Information Commissioner. Perhaps the Information Commissioner is not the best guardian of that particular task, but I am frustrated and slightly disappointed that the Minister has not set out a better approach to achieving the sensible and wise proposals that Lord Mitchell has offered the Government.
The reason why it is so important in the context of the NHS is that the NHS is obviously a complicated place. It is an economy the size of Argentina’s. The last time I looked, if the NHS were a country, it would be the 13th biggest economy on earth. It is a pretty complicated place and there are many different decision makers. Indeed, there are so many decision makers now that it is impossible to get anything done within the NHS, as any constituency MP knows. So how do we ensure that, for example, in our neck of the woods, Queen Elizabeth Hospital Birmingham does not strike its own data sharing agreement with Google or DeepMind? How do we ensure that the NHS in Wales does not go in a particular direction? How do we ensure that the trust across the river does not go in a particular direction? We need to bring order to what is potentially an enormous missed opportunity over the years to come.
The starting point is for the Government, first, to ensure we have assembled a good catalogue of data assets. Secondly, they should take some decisions about whether the organisations responsible for those data assets are destined for some kind of public-private partnership, as they were debating in relation to Companies House and other agencies a couple of years ago, or whether—more wisely—we take the approach of creating a sovereign wealth fund to govern public data in this country, where we maximise the upside for taxpayers and the opportunities for good public service reform.
The example of Hinkley Point and the unfortunate example of the Google partnership with DeepMind, which ran into all kinds of problems, are not good precedents. In the absence of a better, more concrete, lower risk approach from the Government, we will have to defend Lord Mitchell’s wise clause in order to encourage the Government to come back with a better solution than the one set out for us this morning.
I enjoyed the right hon. Gentleman’s speech, as it went beyond some of the detail we are debating here today, but I was disappointed with the conclusion. I did not rest my argument on it being just too difficult to organise such a database as proposed by Lord Mitchell; there are various reasons, chief among them being that we are here to debate personal data. A lot of the databases the right hon. Gentleman referred to as being of great potential value do not contain personal data. Some do, some do not: the Land Registry does not, Companies House does, and so forth. Also, the Information Commissioner has advised that this is beyond her competence and her remit and that she is not resourced to do the job. Even the job of defining what constitutes data of public value is a matter for another organisation and not the Information Commissioner’s Office. That is my main argument, rather than it being too difficult.
Happily, what sits within the scope of a Bill is not a matter for Ministers to decide. First, we rely on the advice of parliamentary counsel, which, along with the Clerks, was clear that this amendment is well within the scope. Secondly, if the Information Commissioner is not the right individual to organise this task—heaven knows, she has her hands full this week—we would have been looking for a Government amendment proposing a better organisation, a better Ministry and a better Minister for the work.
I can only be the Minister I am. I will try to improve. I was not saying that Lord Mitchell’s amendment is not within the scope of the Bill; I was making the point that some of the databases and sources referred to by the right hon. Gentleman in his speech went into the realms of general rather than personal data. I therefore felt that was beyond the scope of the Information Commissioner’s remit.
I share the right hon. Gentleman’s appreciation of the value and the uniqueness of the NHS database. We do not see it just in terms of its monetary value; as the hon. Member for Edinburgh South made clear in his intervention, it has tremendous potential to improve the care and treatment of patients. That is the value we want to realise. I reassure the right hon. Gentleman and put it on record that it is not my place as a Minister in the Department for Digital, Culture, Media and Sport, or the place of the Bill, to safeguard the immensely valuable dataset that is the NHS’s property.
The debate rehearsed in the other place was whether we should acquiesce in a derogation that the Government have exercised to set the age of consent for personal data sharing at 13, as opposed to 16, which other countries have adopted. There was widespread concern that 13 was too young. Many members of the Committee will have experienced pressing the agree button when new terms and conditions are presented to us on our updates to software on phones, or privacy settings presented to us by Facebook; privacy settings, it is now alleged, are not worth the paper that they were not written on.
Debates in the other place centred on what safeguards could be wrapped around children if that derogation were exercised and the age of consent left at 13. With Baroness Kidron, we were keen to enshrine in legislation a step towards putting into operation the objectives of the 5Rights movement. Those objectives, which Baroness Kidron has driven forward over the past few years, are important, but the rights therein are also important. They include not only rights that are enshrined in other parts of the Bill—the right to remove, for example—but important rights such as the right to know. That means that someone has the right to know whether they are being manipulated in some way, shape or form by social media technologies.
One of the most interesting aspects of the debate in the public domain in the past few months has been the revelation that many of the world’s leading social media entrepreneurs do not allow their children to use social media apps, because they know exactly how risky, dangerous and manipulative they can be. We have also heard revelations from software engineers who used to work for social media companies about the way they deliberately set out to exploit brain chemistry to create features of their apps that fostered a degree of addiction. The right to know is therefore very powerful, as is the right to digital literacy, which is another important part of the 5Rights movement.
It would be useful to hear from the Minister of State, who—let me put this beyond doubt—is an excellent Minister, what steps she plans to take to ensure that the age-appropriate design code is set out pretty quickly. We do not want the clause to be passed but then find ourselves in a situation akin to the one we are in with section 40 of the Crime and Courts Act 2013 where, five years down the line, a misguided Secretary of State decides that the world has changed completely and that this bit of legislation should not be commenced.
We would like the Minister to provide a hard timetable— she may want to write to me if she cannot do so today—setting out when we will see an age-appropriate design code. We would also like to hear what steps she will take to consult widely on the code, what work she will do with her colleagues in the Department for Education to ensure that the code includes some kind of ventilation and education in schools so that children actually know what their rights are and know about the aspects of the code that are relevant to them, and, crucially, what steps she plans to take to include children in her consultation when she draws up the code.
This is an important step forward, and we were happy to support it in the other place. We think the Government should be a little more ambitious, which is why we suggest that the rights set out by the 5Rights movement should become part of a much broader and more ambitious digital Bill of Rights for the 21st century, but a start is a start. We are pleased that the Government accepted our amendment, and we would all be grateful if the Minister told us a little more about how she plans to operationalise it.
I thank the right hon. Gentleman for his generous remarks. To recap, the idea that everyone should be empowered to take control of their data is at the heart of the Bill. That is especially important for groups such as children, who are likely to be less aware of the risks and consequences associated with data processing. Baroness Kidron raised the profile of this issue in the other place and won a great deal of support from peers on both sides of that House, and the Government then decided to introduce a new clause on age-appropriate design to strengthen children’s online rights and protections.
Clause 124 will require the Information Commissioner to develop a new statutory code that contains guidance on standards of age-appropriate design for online services that are likely to be accessed by children. The Secretary of State will work in close consultation with the commissioner to ensure that that code is robust, practical and meets children’s needs in relation to the gathering, sharing and storing of their data. The new code will ensure that websites and apps are designed to make clear what personal data of children is collected, how it is used and how both children and parents can stay in control of it. It will also include requirements for websites and app makers on privacy for children under 18.
The right hon. Gentleman cited examples of the consultation he hopes to see in preparation for the code. In developing the code, we expect the Information Commissioner to consult a wide range of stakeholders, including children, parents, persons who represent the interests of children, child development experts and trade associations. The right hon. Gentleman mentioned the Department for Education, and I see no reason why it should not be included in that group of likely consultees.
The commissioner must also pay close attention to the fact that children have different needs at different ages, as well as to the United Kingdom’s obligations under the United Nations Convention on the Rights of the Child. The code interlocks with the existing data protection enforcement mechanism found in the Bill and the GDPR. The Information Commissioner considers many factors in every regulatory decision, and non-compliance with that code will weigh particularly heavily on any organisation that is non-compliant with the GDPR. Organisations that wish to minimise their risk will apply the code. The Government believe that clause 124 is an important and positive addition to the Bill.
Will the Minister say a word about the timetable? When can we expect the consultation and code of practice to be put into operation?
There should be no delay to the development of the code and the consultation that precedes it. If I get any additional detail on the timetable, I will write to the right hon. Gentleman.
Question put and agreed to.
Clause 124, as amended, ordered to stand part of the Bill.
Clause 125
Approval of data-sharing, direct marketing and age-appropriate design codes
Amendment made: 49, in clause 125, page 69, line 9, leave out “with the day on which” and insert “when” —(Margot James.)
This amendment is consequential on Amendment 71.
Clause 125, as amended, order to stand part of the Bill.
Clauses 126 to 130 ordered to stand part of the Bill.
Clause 131
Disclosure of information to the Commissioner
Question proposed, That the clause stand part of the Bill.
Clause 131 deals with disclosure of information to the Information Commissioner, and this is probably a good point at which to ask whether the Information Commissioner has the right level of power to access information that is pertinent to her investigations into the misuse of information. Thanks to The Guardian, The New York Times, and particularly the journalist Carole Cadwalladr, we have had the most extraordinary revelations about alleged misbehaviour at Cambridge Analytica over the past couple of years. Indeed, Channel 4 News gave us further insight into its alleged misdemeanours last night.
We have a situation in social media land that the Secretary of State has described as the “wild west”. Some have unfairly called the Matt Hancock app one of the features of that wild west, but I would not go that far, despite its slightly unusual privacy settings. None the less, there is now cross-party consensus that the regulatory environment that has grown up since the 2000 e-commerce directive is no longer fit for purpose. Yesterday, the Secretary of State helpfully confirmed that that directive will be modernised, and we will come on to discuss new clauses that suggest setting a deadline for that.
One deficiency of today’s regulatory environment is the inadequate power that the Information Commissioner currently has to access information that is important for her investigations. We have a wild west, we have hired a sheriff, but we have not given the sheriff the power to do her job of keeping the wild west in order. We now have the ridiculous situation that the Information Commissioner must declare that she is going to court to get a warrant to investigate the servers of Cambridge Analytica, and to see whether any offence has been committed.
Does my hon. Friend agree that this is also a question of access to the judiciary? Last night, the Information Commissioner had to wait until this morning to get a warrant because no judges or emergency judges were available. At the same time, we assume that Facebook was able to exercise its contractual right to enter the offices of Cambridge Analytica. Emergency judges are available for terrorism or deportation cases. Should there not be access to emergency judges in cases of data misuse for quick regulatory enforcement too?
If I wanted to hide something from a newspaper and I thought that the newspaper was going to print it inappropriately, I would apply for an emergency injunction to stop the newspaper running it. I do not understand why the Information Commissioner has had to broadcast her intentions to the world, because that has given Cambridge Analytica a crucial period of time in which to do anything it likes, frankly, to its data records. The quality of the Information Commissioner’s investigation must be seriously impaired by the time that it has taken to get what is tantamount to a digital search warrant.
Is the Minister satisfied in her own mind that clause 131 and its associated clauses are powerful enough? Will she say more about the Secretary of State’s declaration to the House last night that he would be introducing amendments to strengthen the Commissioner’s power in the way that she requested? When are we going to see those amendments? Are we going to see them before this Committee rises, or at Report stage? Will there be a consultation on them? Is the Information Commissioner going to share her arguments for these extra powers with us and with the Secretary of State? We want to see a strong sheriff patrolling this wild west, and right now we do not know what the Government’s plan of action looks like.
I just want to recap on what clause 131 is about. It is intended to make it clear that a person is not precluded by any other legislation from disclosing to the commissioner information that she needs in relation to her functions, under the Bill and other legislation. The only exception relates to disclosures prohibited by the Investigatory Powers Act 2016 on grounds of national security. It is therefore a permissive provision enabling people to disclose information to the commissioner.
However, the right hon. Member for Birmingham, Hodge Hill has taken the opportunity to question the powers that the Information Commissioner has at her disposal. As my right hon. Friend the Secretary of State said yesterday in the Chamber, we are not complacent. I want to correct something that the right hon. Member for Birmingham, Hodge Hill said. My right hon. Friend did not say that he would table amendments to the Bill on the matter in question. He did say that we were considering the position in relation to the powers of the Information Commissioner, and that we might table amendments, but we are in the process of considering things at the moment. I presume that that goes for the right hon. Gentleman as well; if not, he would surely have tabled his own amendments by now, but he has not.
The Minister will notice that I have tabled a number of new clauses that would, for example, bring election law into the 21st century. I think that the Secretary of State left the House with the impression yesterday that amendments to strengthen the power of the Information Commissioner would be pretty prompt. It is hard to see another legislative opportunity to put that ambition into effect, so perhaps the Minister will tell us whether we can expect amendments soon.
I can certainly reassure the right hon. Gentleman that we are looking at the matter seriously and, although I cannot commit to tabling amendments, I do not necessarily rule them out. I have to leave it at that for now.
On a more positive note, we should at least acknowledge that, although the Bill strengthens the powers of the Information Commissioner, her powers are already the gold standard internationally. Indeed, we must bear it in mind that the data privacy laws of this country are enabling American citizens to take Cambridge Analytica to court over data breaches.
I want to review some of the powers that the Bill gives the commissioner, but before I do so I will answer a point made by the right hon. Member for Birmingham, Hodge Hill. He said that the commissioner had had difficulties and had had to resort to warrants to pursue her investigation into a political party in the UK and both the leave campaigns in the referendum. She is doing all that under existing data protection law, which the Bill is strengthening. That is encouraging.
I did not want to intervene, but I have been struggling with the matter myself. There are allegations that a significant donor to Leave.EU was supported in that financial contribution by organisations abroad. As I spoke to the Financial Conduct Authority and tabled questions to the Treasury, it was revealed that there were no data sharing gateways between the Electoral Commission and the FCA.
I shall come back to the right hon. Gentleman on the relationship between the Information Commissioner and the FCA. I am sure that the information that he has already ascertained from the Treasury is correct, but there may be other ways in which the two organisations can co-operate, if required. The allegations are very serious and the Government are obviously very supportive of the Information Commissioner as she grapples with the current investigation, which has involved 18 information notices and looks as if it will be backed up by warrants as well. I remind the Committee that that is happening under existing data protection law, which the Bill will strengthen.
Question put and agreed to.
Clause 131 accordingly ordered to stand part of the Bill.
(6 years, 8 months ago)
Public Bill CommitteesI will give an example first, because I think it is so important. I fear that a bit of misunderstanding has crept in. Let us take the example of a subject access request. Mr Smith asks an intelligence service whether it is processing personal data concerning him and, if so, for information about that data under clause 94. The intelligence service considers whether it is processing personal data, which it will have obtained under its other statutory powers, such as the Regulation of Investigatory Powers Act 2000 or the Investigatory Powers Act 2016.
If the agency determines that it is processing personal data relating to Mr Smith, it then considers whether it is able to disclose the data, or whether a relevant exemption is engaged. For the agency, the key consideration will be whether disclosing the data would damage national security, for example by disclosing sensitive capabilities or alerting Mr Smith to the fact that he is a subject of investigation. If disclosure does not undermine national security and no other exemption is relevant, the intelligence service must disclose the information. However, if national security would be undermined by disclosure, the agency will need to use the national security exemption in relation to processing any personal data relating to Mr Smith.
If the intelligence service does not process any personal data relating to Mr Smith, it will again have to consider whether disclosing that fact would undermine national security, for example by revealing a lack of capability, which could be exploited by subjects of investigation. That is why, on occasion, when such requests are made, a “neither confirm nor deny” response may be necessary, because either confirming or denying may in itself have ramifications, not only in relation to Mr Smith but in relation to other aspects of national security.
Mr Smith may complain to the Information Commissioner about the response to his request for information. The intelligence service may then be required to demonstrate to the commissioner that the processing of personal data complies with the requirements of part four of the Bill, as set out in clause 102, and that it has responded to the request for information appropriately.
If, in legal proceedings, Mr Smith sought to argue that the national security exemption had been improperly relied upon, a national security certificate could be used as conclusive evidence that the national security exemption was required to safeguard national security. Any person who believed they were directly affected by the certificate could of course appeal against it to the upper tribunal, as set out in clause 111.
The Minister is setting out the mechanics of the system with admirable clarity. The point in dispute, though, is not the mechanics of the process but whether the data controller is able—unilaterally, unchecked and unfettered—to seek a national security exemption. Anyone who has worked with the intelligence agencies, either as a Minister or not, knows that they take parliamentary oversight and the defence of parliamentary supremacy extremely seriously.
What we are seeking with this amendment is to ensure that a data controller does not issue a national security certificate unchecked, and that instead there is an element of judicial oversight. The rule of law is important. It should be defended, protected and enhanced, especially when the data collection powers of the intelligence services are so much greater than they were 30 years ago when data protection legislation was first written.
The Government fully accept that national security certificates should be capable of being subject to judicial oversight. Indeed, the current scheme—both under the 1998 Act and this Bill—provides for just that. However, the amendments would radically change the national security certificate regime, because they would replace the existing scheme with one that required a Minister of the Crown to apply to a judicial commissioner for a certificate if an exemption was sought for the purposes of safeguarding national security, and for a decision to issue a certificate to be approved by a judicial commissioner.
This, again, is the debate that we had when we were considering the Investigatory Powers Act 2016. There were some who would have preferred a judicial commissioner to make the decision about warrantry before the Secretary of State. However, Parliament decided that it was not comfortable with that, because it would have meant a great change. For a member of the judiciary to certify on national security issues, rather than a member of the Executive—namely the Prime Minister or a Secretary of State—would have great constitutional implications.
There were great debates about the issue and the House decided, in its wisdom, that it would maintain the constitutional tradition, which is that a member of the Executive has the ultimate responsibility for national security, with, of course, judicial oversight by judicial commissioners and by the various tribunals that all these powers are subject to. The House decided that the decision itself must be a matter for a Minister of the Crown, because in the event—God forbid—that there is a national security incident, the House will rightly and properly demand answers from the Government of the day. With the greatest respect, a judicial commissioner cannot come to the Dispatch Box to explain how the Government and those assisting them in national security matters have responded to that situation. That is why we have this fine constitutional balance, and why we have adopted in the Bill the regime that has been in place for 30 years.
No, because those who have drafted the Bill have sought, at all times, to comply with the law enforcement directive and with the modernised, draft Council of Europe convention 108. The Bill very much meets those standards, not just on law enforcement but across parts 3 and 4.
I have spoken to the outgoing Council of Europe information commissioner about the issue, and he has put on the record his grave reservations about the regime that we have in place, because we simply do not have the right kind of judicial oversight of the information gathering powers that are now available to our intelligence services. Our intelligence services are very good, and they need to be allowed to do their job, but they will be allowed to do that job more effectively—and without additional risks to our adequacy—if there is some kind of judicial oversight in the right timeframe of the decisions that are taken.
That is where the distinction between obtaining information and processing it is so important. The gathering that the right hon. Gentleman refers to falls under the Investigatory Powers Act 2016. Retaining it and processing it in the ways that the Bill seeks to provide for is the data protection element. The 2016 Act has all the extra judicial oversights that have been passed by the House.
Quite helpfully, we are coming to the nub of the question. It is now incumbent on the Minister to lay out for the Committee why the oversight regime for obtaining information should be so remarkably different from the regime for processing it.
The obtaining of information is potentially intrusive and often extremely time-sensitive. For the processing of information, particularly in the case of a subject access request, once we have met the criteria for obtaining it, separate judicial oversight through the upper tribunal is set out in the Bill, as well as ministerial oversight. They are two separate regimes.
There is extra oversight in the 2016 Act because obtaining information can be so intrusive. The right hon. Gentleman will appreciate that I cannot go into the methodology—I am not sure I am security-cleared enough to know, to be honest—but obtaining information has the potential to be particularly intrusive, in a way that processing information gathered by security service officials may not be.
I reassure the Minister that I went through the methodologies during my time at the Home Office. The justification that she still needs to lay out for the Committee—she is perhaps struggling to do so—is why there should be one set of judicial oversight arrangements for obtaining information and another for processing it. Why are they not the same?
There might be many reasons why we process information. The end result of processing might be for national security reasons or law enforcement reasons—my officials are scribbling away furiously, so I do not want to take away their glory when they provide me with the answer.
I have an answer on the Watson case, raised by the hon. Member for Sheffield, Heeley, which dealt with the retention of communications by communications service providers. Again, that is an entirely different scenario from the one we are talking about, where the material is held by the security services.
Amendment 161 goes further than the 2016 Act, because it places the decision to issue a certificate with the judicial commissioner. As I have said, national security certificates come into play only to serve in legal proceedings as conclusive evidence that an exemption from specified data protection requirements is necessary to protect national security—for example, to prevent disclosure of personal data to an individual under investigation, when such disclosure would damage national security. The certificate does not authorise the required use of the national security exemption, which is properly a matter for the data controller to determine.
Amendments 163 and 164 relate to the form of a national security certificate. Amendment 163 would require a detailed rather than general description of the data identified on a national security certificate, but we believe this change to be unnecessary and unhelpful, given that much data can be adequately described in a general way. Amendment 164, which would prevent a certificate from having prospective effect, appears to be dependent on the prior judicial authorisation scheme proposed in amendments 161 and 162, and again contrasts with the prospective nature of certificates currently under the Data Protection Act 1998.
Prospective certificates of the type issued under the 1998 Act are the best way of ensuring that the use of the national security exemption by the intelligence services and others is both sufficiently foreseeable for the purposes of article 8 of the European convention on human rights, and accountable. The accountability is ensured by the power to challenge certificates when they are issued, and that is something that has real teeth. The accountability is strengthened by the provision in clause 130 for the publication of certificates. The documents we are discussing will therefore be in the public domain—indeed, many of them are already. But it will now be set out in statute that they should be in the public domain.
Amendments 166 to 168 relate to the appeals process. Amendment 166 would broaden the scope for appealing a national security certificate from a person “directly affected” by it to someone who
“believes they are directly or indirectly affected”
by it. I wonder whether the Opposition did any work on the scope of the provision when drafting it, because the words “indirectly affected” have the potential to cause an extraordinary number of claims. How on earth could that phrase be defined in a way that does not swamp the security services with applications from people who consider that they might be indirectly affected by a decision relating to a national security matter? I do not see how that can be considered practicable.
On the judicial review point, the test was debated at length in the Joint Committee, in the Public Bill Committee and on the Floor of the House. The House passed that Act with cross-party consensus, as my hon. Friend has said, so I do not understand why we are having the same debate.
Anyone who has spent time working with our intelligence agencies knows that they see their mission as the defence of parliamentary democracy. They believe in scrutiny and oversight, which is what we are trying to insert in the Bill. The reason the Investigatory Powers Bill was passed in that way was because we were successful in ensuring that there were stronger safeguards. The Minister has been unable to explain today why the safeguarding regime should be different for the processing of data as opposed to the obtaining of data. We have heard no convincing arguments on that front today. All that we are seeking to do is protect the ability of the intelligence agencies to do their job by ensuring that a guard against the misuse of their much broader powers is subject to effective judicial oversight, and not in public but in a court.
For the security services to have obtained data under the Investigatory Powers Act, they will have passed through the various safeguards that Parliament set out in that Act. Once that data is obtained, it follows that the permission that the judicial commissioner will have reviewed will still flow through to the processing of that information. Our concern here is certain requirements of the data protection regime. The decision to disseminate information under that regime must rest with the intelligence agencies, with oversight. The Bill provides for those decisions to be appealed. That is as it should be. It should not be for a judicial commissioner to take over the decision of the data controller, who is processing applications and information in real time, often in situations that require them to act quickly. Likewise, whether to grant a certificate, which will be in the public domain, must be a decision for a member of the Executive, not the judiciary.
I assume that no work has been done to measure the scope of amendment 166, but allowing the clause to cover people indirectly affected could have enormous consequences for the security services, which already face great pressures and responsibilities.
Amendments 167 and 168 would remove the application of judicial review principles by the upper tribunal when considering an appeal against a certificate. They would replace the “reasonable grounds for issuing” test with a requirement to consider whether issuing a certificate was necessary and proportionate. Again, that would be an unnecessary departure from the existing scheme, which applies the judicial review test and has worked very well for the past 30 years.
In applying judicial review principles, the upper tribunal can consider a range of issues, including necessity, proportionality and lawfulness. As we set out in our response to the report of the House of Lords Constitution Committee, that enables the upper tribunal to consider matters such as whether the decision to issue the certificate was reasonable, having regard to the impact on the rights of the data subject and the need to safeguard national security. The Bill makes it clear that the upper tribunal has the power to quash the certificate if it concludes that the decision to issue it was unreasonable.
I hope that I have answered the concerns of the right hon. Member for Birmingham, Hodge Hill about how certificates are granted and about the review process when a subject access request is made and the certificate is applied. We must recognise that the Bill does not weaken a data subject’s rights or the requirements that must be met if an exemption is to be relied on; it reflects the past 30 years of law. Perhaps I missed it, but I do not think that any hon. Member has argued that the Data Protection Act 1998 has significant failings.
As the Minister well knows, the debate internationally is a result of the radical transformation of intelligence agencies’ ability to collect and process data. There is an argument, which has been well recognised in the Council of Europe and elsewhere, that where powers are greater, oversight should be stronger.
Yes, and that is precisely why Parliament passed the Investigatory Powers Act 2016.
The safeguards that apply once the information has been obtained—
Order. I realise that the right hon. Gentleman feels strongly about the issue, but if he wishes to intervene, he must stand. If not, he must remain quiet and take it on the chin.
Members will note that there are a number of clauses on the selection list to which no amendments have been tabled. I propose to start grouping such clauses together in order to speed progress. However, Members still have the right to tell me that they wish to speak to, or vote on, an individual clause.
Clauses 28 and 29 ordered to stand part of the Bill.
Clause 30
Meaning of “competent authority”
Amendments made: 18, in clause 30, page 19, line 4, after “specified” insert “or described”.
This amendment changes a reference to persons specified in Schedule 7 into a reference to persons specified or described there.
Amendment 19, in clause 30, page 19, line 10, leave out from “add” to end of line and insert
“or remove a person or description of person”.—(Margot James.)
This amendment makes clear that regulations under Clause 30 may identify a person by describing a type of person, as well as by specifying a person.
Clause 30, as amended, ordered to stand part of the Bill.
Schedule 7 agreed to.
Clauses 31 to 34 ordered to stand part of the Bill.
Clause 35
The first data protection principle
Question proposed, That the clause stand part of the Bill.
Very briefly, subsection (1) includes the phrase
“must be lawful and fair”.
Could the Minister say a little more about the word “fair”? What definition is she resting on, and who is the judge of it?
“Lawful” means any processing necessary to carry out a particular task, where that task is authorised either by statute or under common law. It would cover, for example, the taking and retention of DNA and fingerprints under the Police and Criminal Evidence Act 1984, or the police’s common law powers to disclose information required for the operation of the domestic violence disclosure scheme.
The Government recognise the importance of safeguarding sensitive personal information about individuals. Subsections (3) to (5) therefore restrict the processing of sensitive data, the definition of which includes information about an individual’s race or ethnic origin, and biometric data such as their DNA profile and fingerprints.
Further safeguards for the protection of sensitive personal data are set out in clause 42. The processing of sensitive personal data is permitted under two circumstances. The first is where the data subject has given his or her consent. The second is where the processing is strictly necessary for a law enforcement purpose and one or more of the conditions in schedule 8 to the Bill has been met. Those conditions include, for example, that the processing is necessary to protect the individual concerned or another person, or is necessary for the administration of justice. In both cases, the controller is required to have an appropriate policy document in place. We will come on to the content of such policy documents when we debate clause 42.
I am grateful for the Minister’s extensive definition, given in response to a question I did not ask. I did not ask for the definition of “lawful” but for the definition of “fair”.
I am so sorry; I thought it was apparent from my answer. “Fair” is initially a matter for the data controller, but ultimately the Information Commissioner has oversight of these provisions and the commissioner will cover that in her guidance.
Question put and agreed to.
Clause 35 accordingly ordered to stand part of the Bill.
Schedule 8
Conditions for sensitive processing under Part 3
Amendment made: 116, in schedule 8, page 184, line 32, at end insert—
“Safeguarding of children and of individuals at risk
3A (1) This condition is met if—
(a) the processing is necessary for the purposes of—
(i) protecting an individual from neglect or physical, mental or emotional harm, or
(ii) protecting the physical, mental or emotional well-being of an individual,
(b) the individual is—
(i) aged under 18, or
(ii) aged 18 or over and at risk,
(c) the processing is carried out without the consent of the data subject for one of the reasons listed in sub-paragraph (2), and
(d) the processing is necessary for reasons of substantial public interest.
(2) The reasons mentioned in sub-paragraph (1)(c) are—
(a) in the circumstances, consent to the processing cannot be given by the data subject;
(b) in the circumstances, the controller cannot reasonably be expected to obtain the consent of the data subject to the processing;
(c) the processing must be carried out without the consent of the data subject because obtaining the consent of the data subject would prejudice the provision of the protection mentioned in sub-paragraph (1)(a).
(3) For the purposes of this paragraph, an individual aged 18 or over is “at risk” if the controller has reasonable cause to suspect that the individual—
(a) has needs for care and support,
(b) is experiencing, or at risk of, neglect or physical, mental or emotional harm, and
(c) as a result of those needs is unable to protect himself or herself against the neglect or harm or the risk of it.
(4) In sub-paragraph (1)(a), the reference to the protection of an individual or of the well-being of an individual includes both protection relating to a particular individual and protection relating to a type of individual.”—(Victoria Atkins.)
Schedule 8 makes provision about the circumstances in which the processing of special categories of personal data is permitted. This amendment adds to that Schedule certain processing of personal data which is necessary for the protection of children or of adults at risk. See also Amendments 85 and 117.
Schedule 8, as amended, agreed to.
Clauses 36 to 40 ordered to stand part of the Bill.
Clause 41
Safeguards: archiving
Amendment made: 20, in clause 41, page 23, line 34, leave out “an individual” and insert “a data subject”.—(Victoria Atkins.)
Clause 41 makes provision about the processing of personal data for archiving purposes, for scientific or historical research purposes or for statistical purposes. This amendment aligns Clause 41(2)(b) with similar provision in Clause 19(2).
Question proposed, That the clause, as amended, stand part of the Bill.
We had a good debate on what I think was a shared objective across the Committee: to ensure that those running our big national archives—whether they are large or small organisations—should not be jeopardised by frivolous claims or, indeed, a multiplicity of claims from individuals who might seek to change the records held there in one way or another. I mentioned to the Minister in an earlier debate that we were anxious, despite the reassurances she sought to give the Committee, that a number of organisations, including the BBC, were deeply concerned about the Bill’s impact on their work. They were not satisfied that the exemptions and safeguards in the Bill would quite do the job.
My only reason for speaking at this stage is to suggest to Ministers that if they were to have discussions with some of those organisations about possible Government amendments on Report to refine the language, and provide some of the reassurance people want, that would attract our support. We would want to have such conversations, but it would be better if the Government could find a way to come forward with refinements of their own on Report.
I am happy to explore that. The reason for the clause is to enable processing to be done to create an archive for scientific or historical research, or for statistical purposes. The reason law enforcement is mentioned is that it may be necessary where a law enforcement agency needs to review historic offences, such as allegations of child sexual exploitation. I would of course be happy to discuss that with the right hon. Gentleman to see whether there are further avenues down which we should proceed.
I am grateful to the Minister for that response. I am happy to write to her with the representations that we have received, and perhaps she could reflect on those and write back.
Question put and agreed to.
Clause 41, as amended, accordingly ordered to stand part of the Bill.
Clause 42
Safeguards: sensitive processing
Amendment made: 21, in clause 42, page 24, line 29, leave out “with the day” and insert “when”.—(Victoria Atkins.)
This amendment is consequential on Amendment 71.
Clause 42, as amended, ordered to stand part of the Bill.
Clauses 43 to 46 ordered to stand part of the Bill.
Clause 47
Right to erasure or restriction of processing
I beg to move amendment 22, in clause 47, page 28, line 20, leave out second “data”.
This amendment changes a reference to a “data controller” into a reference to a “controller” (as defined in Clauses 3 and 32).
I can be brief, because this drafting amendment simply ensures that clause 47, as with the rest of the Bill, refers to a “controller” rather than a “data controller”. For the purposes of part 3, a controller is defined in clause 32(1) so it is not necessary to refer elsewhere to a “data controller”.
Amendment 22 agreed to.
Clause 47, as amended, ordered to stand part of the Bill.
Clause 48 ordered to stand part of the Bill.
Clause 49
Right not to be subject to automated decision-making
Question proposed, That the clause stand part of the Bill.
We had a good debate on possible amendments to the powers of automatic decision making earlier and this is an important clause in that it creates a right not to be subject to automated decision making. Clause 49(1) states:
“A controller may not take a significant decision based solely on automated processing unless that decision is required or authorised by law.”
I hope Ministers recognise that
“required or authorised by law”
is an incredibly broad set of questions. I would like to provoke the Minister into saying a little more about what safeguards she believes will come into place to ensure that decisions are not taken that jeopardise somebody’s human rights and their right to appeal and justice based on those human rights. It could be that the Minister decides to answer those questions in the debate on clause 50, but it would be useful for her to say a little more about her understanding of the phrase “significant decision” and a little more about what kind of safeguards will be needed to ensure that decisions that are cast in such a broad way do not impact on people in a negative way.
Clause 49 establishes the right for individuals not to be subject to a decision based exclusively on automated processing, where that decision has an adverse impact on the individual. It is important to protect that right to enhance confidence in law enforcement processing and safeguard individuals against the risk that a potentially damaging decision is taken without human intervention. The right hon. Gentleman asked about the definition of a significant decision. It is set out in the Bill.
We are not aware of any examples of the police solely using automated decision-making methods, but there may be examples in other competent authorities. The law enforcement directive includes that requirement, so we want to transpose it faithfully into statute, and we believe we have captured the spirit of the requirement.
I wonder whether that is captured in the spirit of the Bill. Forgive me, Mr Hanson. This is my first Bill Committee as a Minister and I was not aware of that. Many apologies.
I am not familiar with that example. It would be a very interesting exercise under the PACE custody arrangements. I will look into it in due course. These protections transpose the law enforcement directive, and we are confident that they meet those requirements.
Question put and agreed to.
Clause 49 accordingly ordered to stand part of the Bill.
Clause 50
Automated decision-making authorised by law: safeguards
Amendments made: 23, in clause 50, page 30, line 11, leave out “21 days” and insert “1 month”.
Clause 50(2)(b) provides that where a controller notifies a data subject under Clause 50(2)(a) that the controller has taken a “qualifying significant decision” in relation to the data subject based solely on automated processing, the data subject has 21 days to request the controller to reconsider or take a new decision not based solely on automated processing. This amendment extends that period to one month.
Amendment 24, in clause 50, page 30, line 17, leave out “21 days” and insert “1 month”.—(Victoria Atkins.)
Clause 50(3) provides that where a data subject makes a request to a controller under Clause 50(2)(b) to reconsider or retake a decision based solely on automated processing, the controller has 21 days to respond. This amendment extends that period to one month.
Question proposed, That the clause, as amended, stand part of the Bill.
I remain concerned that the safeguards the Government have proposed to ensure people’s human rights are not jeopardised by the use of automated decision making are, frankly, not worth the paper they are written on. We know that prospective employers and their agents use algorithms and automated systems to analyse very large sets of data and, through the use of artificial intelligence and machine learning, make inferences about whether people are appropriate to be considered to be hired or retained by a particular company. We have had a pretty lively debate in this country about the definition of a worker, and we are all very grateful to Matthew Taylor for his work on that question. Some differences emerged, and the Business, Energy and Industrial Strategy Committee has put its views on the record.
The challenge is that our current labour laws, which were often drafted decades ago, such as the Sex Discrimination Act 1975 and the Race Relations Act 1965, are no longer adequate to protect people in this new world, in which employers are able to use such large and powerful tools for gathering and analysing data, and making decisions.
We know that there are problems. We already know that recruiters use Facebook to seek candidates in a way that routinely discriminates against older workers by targeting job advertisements. That is not a trivial issue; it is being litigated in the United States. In the United Kingdom, research by Slater and Gordon, a group of employment lawyers, found that one in five bosses admits to unlawful discrimination when advertising jobs online. Women and people over 50 are most likely to be stopped from seeing an advert. Around 32% of company executives admitted to discriminating among those over 50; 23% discriminated against women; and 62% of executives who had access to profiling tools admitted to using them to actively seek out people based on criteria such as age, gender and race. Female Uber drivers earn 7% less than men when pay is determined by algorithms. A number of practices in the labour market are disturbing and worrying, and they should trouble all of us.
The challenge is that clause 50 needs to include a much more comprehensive set of rights and safeguards. It should clarify that the Equality Act 2010 and protection from discrimination applies to all new forms of decision making that engage core labour rights around recruitment, terms of work or dismissal. There should be new rights about algorithmic fairness at work to ensure equal treatment where an algorithm or automated system takes a decision that impinges on someone’s rights. There should be a right to explanation where significant decisions are taken based on an algorithm or an automated decision. There is also a strong case to create a duty on employers, if they are a large organisation, to undertake impact assessments to check whether they are, often unwittingly, discriminating against people in a way that we think is wrong.
Over the last couple of weeks, we have seen real progress in the debate about gender inequalities in pay. Many of us will have looked in horror at some of the news that emerged from the BBC and at some of the evidence that emerged from ITV and The Guardian. We have to contend with the reality that automated decision-making processes are under way in the labour market that could make inequality worse rather than better. The safeguards that we have in clause 50 do not seem up to the job.
I hope the Minister will say a bit more about the problems that she sees with future algorithmic decision making. I am slightly troubled that she is unaware of some live examples in the Home Office space in one of our most successful police forces, and there are other examples that we know about. Perhaps the Minister might say more about how she intends to improve the Bill with regard to that issue between now and Report.
I will pick up on the comments by the right hon. Gentleman, if I may.
In the Durham example given by the hon. Member for Sheffield, Heeley, I do not understand how a custody sergeant could sign a custody record without there being any human interaction in that decision-making process. A custody sergeant has to sign a custody record and to review the health of the detainee and whether they have had their PACE rights. I did not go into any details about it, because I was surprised that such a situation could emerge. I do not see how a custody sergeant could be discharging their duties under the Police and Criminal Evidence Act 1984 if their decision as to custody was based solely on algorithms, because a custody record has to be entered.
This has been a moment of genuine misunderstanding. Given how the hon. Lady presented that, to me it sounded as if she was saying that the custody record and the custody arrangements of a suspect—detaining people against their will in a police cell—was being done completely by a computer. That was how it sounded. There was obviously an area of genuine misunderstanding, so I am grateful that she clarified it. She intervened on me when I said that we were not aware of any examples of the police solely using automated decision making—that is when she intervened, but that is not what she has described. A human being, a custody sergeant, still has to sign the record and review the risk assessment to which the hon. Lady referred. The police are using many such examples nowadays, but the fact is that a human being is still involved in the decision-making process, even in the issuing of penalties for speeding. Speeding penalties may be automated processes, but there is a meaningful element of human review and decision making, just as there is with the custody record example she gave.
There was a genuine misunderstanding there, but I am relieved, frankly, given that the right hon. Member for Birmingham, Hodge Hill was making points about my being unaware of what is going on in the Home Office. I am entirely aware of that, but I misunderstood what the hon. Lady meant and I thought she was presenting the custody record as something that is produced by a machine with no human interaction.
Line-by-line scrutiny, but I was acting in good faith on an intervention that the hon. Member for Sheffield, Heeley made when I was talking about any examples of the police solely using automated decision making.
May I ask for your guidance on this question? We are in a Bill Committee that is tasked with scrutinising the Bill line by line. Is it customary for Ministers to refuse to give way on a matter of detail?
Ultimately, whether the Minister gives way is a matter for the Minister—that is true for any Member who has the Floor—but it is normal practice to debate aspects of legislation thoroughly. Ultimately, however, it remains the choice of the Minister or any other Member with the Floor whether to give way.
I have lost track of which point the right hon. Gentleman wants me to give way on.
Let me remind the Minister. What we are concerned about on the question of law enforcement is whether safeguards that are in place will be removed under the Bill. That is part and parcel of a broader debate that we are having about whether the safeguards that are in the Bill will be adequate. So let me return to the point I made earlier to the Minister, which is that we would like her reflections on what additional safeguards can be drafted into clauses 50 and 51 before Report stage.
Clause 49 is clear that individuals should not be subject to a decision based solely on automated processing if that decision significantly or adversely has an impact on them, legally or otherwise, unless required by law. If that decision is required by law, clause 50 specifies the safeguards that controllers should apply to ensure that the impact on the individual is minimised. Critically, that includes informing the data subject that a decision has been taken and giving that individual 21 days in which to ask the controller to reconsider the decision, or to retake the decision with human intervention.
A point was made about the difference between automated processing and automated decision making. Automated processing is when an operation is carried out on personal data using predetermined fixed parameters that allow for no discretion by the system and do not involve further human intervention in the operation to produce a result or output. Such processing is used regularly in law enforcement to filter large datasets down to manageable amounts for a human operator to use. Automated decision making is a form of automated processing that allows the system to use discretion, potentially based on algorithms, and requires the final decision to be made without human interference. The Bill seeks to clarify that, and the safeguards are set out in clause 50.
Question put and agreed to.
Clause 50, as amended, accordingly ordered to stand part of the Bill.
Clause 51
Exercise of rights through the Commissioner
These technical amendments are required to ensure that the provisions in clause 51 do not inadvertently undermine criminal investigations by the police or other competent authorities. Under the Bill, where a person makes a subject access request, it may be necessary for the police or other competent authority to give a “neither confirm nor deny” response, for example in order to avoid tipping someone off that they are under investigation for a criminal offence. In such a case, the data subject may exercise their rights under clause 51 to ask the Information Commissioner to check that the processing of their personal data complies with the provisions in part 3. It would clearly undermine a “neither confirm nor deny” response to a subject access request if a data subject could use the provisions in part 3 to secure confirmation that the police were indeed processing their information.
It is appropriate that the clause focuses on the restriction of a data subject’s rights, not on the underlying processing. The amendments therefore change the nature of the request that a data subject may make to the commissioner in cases where rights to information are restricted under clause 44(4) or clause 45(4). The effect of the amendments is that a data subject will be able to ask the commissioner to check that the restriction was lawful. The commissioner will then be able to respond to the data subject in a way that does not undermine the original “neither confirm nor deny” response.
This is a significant amendment—I understand the ambition behind the clause—so it is worth dwelling on it for a moment. I would like to check my understanding of what the Minister said. In a sense, if an investigation is under way and the individual under investigation makes a subject access request to the police and gets a “neither confirm nor deny” response, the data subject will be able to ask the Information Commissioner to investigate. Will the Minister say a little more about what message will go from the police to the Information Commissioner and the content of the message that will go from the Information Commissioner to the data subject? I have worked on such cases in my constituency. Often, there is an extraordinary spiral of inquiries and the case ultimately ends up in a judicial review in court. Will the Minister confirm that I have understood the mechanics accurately and say a little more about the content of the messages from the police to the Information Commissioner and from the Information Commissioner to the person who files the request?
I can help the right hon. Gentleman in one respect: he has understood the mechanics. I am afraid that I cannot give him examples, because it will depend on the type of criminal offence or the type of investigation that may be under way. I cannot possibly give him examples of the information that may be sent by the police to the Information Commissioner, because that will depend entirely on the case that the police are investigating.
Perhaps I can pose the question in a sharper way. I do not think that is entirely the case. It must be possible for the Minister to be a little more specific, and perhaps a little more knowledgeable, about the content of the message that will go from the Information Commissioner to the data subject. Will that be a standard message? Will it be in any way detailed? Will it reflect in any way on the information that the police provide? Or will it simply be a blank message such as “I, the Information Commissioner, am satisfied that your information has been processed lawfully”? I do not think the Information Commissioner is likely to ask for too much detail about the nature of the offence, but she will obviously ask whether data has been processed lawfully. She will want to make checks in that way. Unless the Information Commissioner is able to provide some kind of satisfactory response to the person who has made the original request, we will end up with an awful administrative muddle that will take of lot of the courts’ time. Perhaps the Minister could put our minds at rest on that.
The Information Commissioner will get the information but, by definition, she does not give that information to the subject, because law enforcement will have decided that it meets the criteria for giving a “neither confirm nor deny” response from their perspective. The commissioner then looks at the lawfulness of that; if she considers it to be lawful, she will give the same response—that the processing meets part 3 obligations.
Amendment 25 agreed to.
Amendment made: 26, in clause 51, page 31, line 11, leave out from first “the” to end of line 12 and insert “restriction imposed by the controller was lawful;” —(Victoria Atkins.)
This amendment is consequential on Amendment 25.
Clause 51, as amended, ordered to stand part of the Bill.
Clause 52 ordered to stand part of the Bill.
Clause 53
Manifestly unfounded or excessive requests by the data subject
Amendments made: 27, in clause 53, page 31, line 39, leave out “or 47” and insert “,47 or 50”.
Clause 53(1) provides that where a request from a data subject under Clause 45, 46 or 47 is manifestly unfounded or excessive, the controller may charge a reasonable fee for dealing with the request or refuse to act on the request. This amendment applies Clause 53(1) to requests under Clause 50 (automated decision making). See also Amendment 28.
Amendment 28, in clause 53, page 32, line 4, leave out “or 47” and insert “,47 or 50”.—(Victoria Atkins.)
Clause 53(3) provides that where there is an issue as to whether a request under Clause 45, 46 or 47 is manifestly unfounded or excessive, it is for the controller to show that it is. This amendment applies Clause 53(3) to requests under Clause 50 (automated decision making). See also Amendment 27.
Question proposed, That the clause, as amended, stand part of the Bill.
We have just agreed a set of amendments that, on the face of it, look nice and reasonable. We can all recognise the sin that the Government are taking aim at, and that the workload of the Information Commissioner’s Office and of others has to be kept under control, so we all want to deter tons of frivolous and meaningless requests. None the less, a lot of us have noticed that, for example, the introduction of fees for industrial tribunals makes it a lot harder for our constituents to secure justice.
I wonder, having now moved the amendment successfully, whether the Minister might tell us a little more about what will constitute a reasonable fee and what will happen to those fees. Does she see any relationship between the fees being delivered to her Majesty’s Government and the budget that is made available for the Information Commissioner? Many of us are frankly worried, given the new obligations of the Information Commissioner, about the budget she has to operate with and the resources at her disposal. Could she say a little more, to put our minds at rest, and reassure us that these fees will not be extortionate? Where sensible fees are levied, is there some kind of relationship with the budget that the Information Commissioner might enjoy?
Clause 35 establishes the principle that subject access requests should be provided free of charge in most cases. That will be the default position in most cases. In terms of the fees, that will not be a matter to place in statute; certainly, I can write to the right hon. Gentleman with my thoughts on how that may develop. The intention is that in the majority of cases, there will be no charge.
Question put and agreed to.
Clause 53, as amended, accordingly ordered to stand part of the Bill.
Clause 54
Meaning of “applicable time period”
Amendments made: 29, in clause 54, page 32, line 14, leave out “day” and insert “time”.
This amendment is consequential on Amendment 71.
Amendment 30, in clause 54, page 32, line 15, leave out “day” and insert “time”.—(Victoria Atkins.)
This amendment is consequential on Amendment 71.
Clause 54, as amended, ordered to stand part of the Bill.
Clauses 55 to 63 ordered to stand part of the Bill.
Clause 64
Data protection impact assessment
I rise briefly to support my hon. Friend’s excellent speech. The ambition of Opposition Members on the Committee is to ensure that the Government have in place a strong and stable framework for data protection over the coming years. Each of us, at different times in our constituencies, have had the frustration of working with either local police or their partners and bumping into bits of regulation or various procedures that we think inhibit them from doing their job. We know that at the moment there is a rapid transformation of policing methods. We know that the police have been forced into that position, because of the pressure on their resources. We know that there are police forces around the world beginning to trial what is sometimes called predictive policing or predictive public services, whereby, through analysis of significant data patterns, they can proactively deploy police in a particular way and at a particular time. All these things have a good chance of making our country safer, bringing down the rate of crime and increasing the level of justice in our country.
The risk is that if the police lack a good, clear legal framework that is simple and easy to use, very often sensible police, and in particular nervous and cautious police and crime commissioners, will err on the side of caution and actually prohibit a particular kind of operational innovation, because they think the law is too muddy, complex and prone to a risk of challenge. My hon. Friend has given a number of really good examples. The automatic number plate recognition database is another good example of mass data collection and storage in a way that is not especially legal, and where we have waited an awfully long time for even something as simple as a code of practice that might actually put the process and the practice on a more sustainable footing. Unless the Government take on board my hon. Friend’s proposed amendments, we will be shackling the police, stopping them from embarking on many of the operational innovations that they need to start getting into if they are to do their job in keeping us safe.
I will speak briefly in support of amendments 142 to 149, as well as new clauses 3 and 4. As it stands, clause 64 requires law enforcement data controllers to undertake a data protection impact assessment if
“a type of processing is likely to result in a high risk to the rights and freedoms of individuals”.
That assessment would look at the impact of the envisaged processing operations on the protection of personal data and at the degree of risk, measures to address those risks and possible safeguards. If the impact assessment showed a high risk, the controller would have to consult the commissioner under clause 65.
It is important to be clear that the assessment relates to a type of processing. Nobody is asking anyone to undertake an impact assessment every time the processing occurs. With that in mind, the lower threshold for undertaking an assessment suggested in the amendments seems appropriate. We should be guarding not just against probable or high risks, but against any real risk. The worry is that if we do not put these tests in place, new forms of processing are not going to be appropriately scrutinised. We have had the example of facial recognition technology, which is an appropriate one.
New clauses 3 and 4 do a similar job for the intelligence services in part 4, so they also have our support.
I am extremely grateful to the hon. Lady for clarifying her role. My answer is exactly as I said before. High risk includes processing where there is a particular likelihood of prejudice to the rights and freedoms of data subjects. That must be a matter for the data controller to assess. We cannot assess it here in Committee for the very good reason put forward by members of the Committee: we cannot foresee every eventuality. Time will move on, as will technology. That is why the Bill is worded as it is, to try to future-proof it but also, importantly, because the wording complies with our obligations under the law enforcement directive and under the modernised draft Council of Europe convention 108.
Does the Minister not have some sympathy with the poor individuals who end up being data controllers for our police forces around the country, given the extraordinary task that they have to do? She is asking those individuals to come up with their own frameworks of internal guidance for what is high, medium and low risk. The bureaucracy-manufacturing potential of the process she is proposing will be difficult for police forces. We are trying to help the police to do their job, and she is not making it much easier.
I remain concerned that the Bill leaves gaps that will enable law enforcement agencies and the police to go ahead and use technology that has not been tested and has no legal basis. As my right hon. Friend the Member for Birmingham, Hodge Hill said, that leaves the police open to having to develop their own guidance at force level, with all the inconsistencies that would entail across England and Wales.
The Minister agreed to write to me on a couple of issues. I do not believe that the Metropolitan police consulted the Information Commissioner before trialling the use of photo recognition software, and I do not believe that other police forces consulted the Information Commissioner before rolling out mobile fingerprint scanning. If that is the case and the legislation continues with the existing arrangements, that is not sufficient. I hope that before Report the Minister and I can correspond so as potentially to strengthen the measures. With that in mind, and with that agreement from the Minister, I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Clause 64 ordered to stand part of the Bill.
Clauses 65 and 66 ordered to stand part of the Bill.
Clause 67
Notification of a personal data breach to the Commissioner
Question proposed, That the clause stand part of the Bill.
The Committee is looking for some guidance and for tons of reassurance from the Minister about how the clause will bite on data processors who do not happen to base their operations here in the United Kingdom. This morning we debated the several hundred well-known data breaches around the world and highlighted some of the more recent examples, such as Yahoo!—that was probably the biggest—and AOL. More recently, organisations such as Uber have operated their systems with such inadequacy that huge data leaks have occurred, directly infringing the data protection rights of citizens in this country. The Minister will correct me if I am wrong, but I am unaware of any compensation arrangements that Uber has made with its drivers in this country whose data was leaked.
Even one of the companies closest to the Government—Equifax, which signed a joint venture agreement with the Government not too long ago—has had a huge data breach. It took at least two goes to get a full account from Equifax of exactly what had happened, despite the fact that Her Majesty’s Government were its corporate partner and had employed it through the Department for Work and Pensions. All sorts of information sharing happened that never really came to light. I am not sure whether any compensation for Equifax data breaches has been paid to British citizens either.
My point is that most citizens of this country have a large amount of data banked with companies that operate from America under the protection of the first amendment. There is a growing risk that in the years to come, more of the data and information service providers based in the UK will go somewhere safer, such as Ireland, because they are worried about the future of our adequacy agreement with the European Commission. We really need to understand in detail how the Information Commissioner, who is based here, will take action on behalf of British citizens against companies in the event of data breaches. For example, how will she ensure notification within 72 hours? How will she ensure the enforcement of clause 67(4), which sets out the information that customers and citizens must be told about the problem?
This morning we debated the Government’s ludicrous proposals for class action regimes, which are hopelessly inadequate and will not work in practice. We will not have many strong players in the UK who are able to take action in the courts, so we will be wholly reliant on the Information Commissioner to take action. I would therefore be grateful if the Minister reassured the Committee how the commissioner will ensure that clause 67 is enforced if the processor of the data is not on our shores.
The right hon. Gentleman refers to companies not on these shores, about which we had a good deal of discussion this morning. Clause 67 belongs to part 3 of the Bill, which is entitled “Law enforcement processing”, so I am not sure that the companies that he gives as examples would necessarily be considered under it. I suppose a part 3 controller could have a processor overseas, but that would be governed by clause 59. Enforcement action would, of course, be taken by the controller under part 3, but I am not sure that the right hon. Gentleman’s examples are relevant to clause 67.
I am grateful to the Minister for that helpful clarification. Let me phrase the question differently, with different examples. The Home Office and many police forces are outsourcing many of their activities, some of which are bound to involve data collected by global organisations such as G4S. Is she reassuring us that any and all data collected and processed for law enforcement activities will be held within the boundaries of the United Kingdom and therefore subject to easy implementation of clause 67?
The controller will be a law enforcement agency, to which part 3 will apply. I note that clause 200 provides details of the Bill’s territorial application should a processor be located overseas, but under part 3 it will be law enforcement agencies that are involved.
Where G4S, for example, is employed to help with deportations, the Minister is therefore reassuring us that the data controller would never be G4S. However, if there were an activity that was clearly a law enforcement activity, such as voluntary removal, would the data controller always be in Britain and therefore subject to clause 67, even where private sector partners are used? The Minister may outsource the contract, but we want to ensure that she does not outsource the role of data controller so that a law enforcement activity here can have a data controller abroad.
I appreciate the sentiment behind the amendment. If the Home Office outsources processing to an overseas company, any enforcement action would be taken against the Home Office as the controller. The right hon. Gentleman has raised the example of G4S in the immigration context, so I will reflect on that overnight and write to him to ensure that the answer I have provided also covers that situation.
Question put and agreed to.
Clause 67 accordingly ordered to stand part of the Bill.
Clause 68 to 71 ordered to stand part of the Bill.
Clause 72
Overview and interpretation
Question proposed, That the clause stand part of the Bill.
I want to flag up an issue that we will stumble across in a couple of stand part debates: the safeguards that will be necessary for data sharing between this country and elsewhere. We will come on to the safeguards that will be necessary for the transfer of data between our intelligence agencies and foreign intelligence agencies. Within the context of this clause, which touches on the broad principle of data sharing from here and abroad, I want to rehearse one or two arguments on which Ministers should be well briefed and alert.
Our intelligence agencies do an extraordinary job in keeping this country safe, which sometimes involves the acquisition and use of data that results in the loss of life. All Committee members will be familiar with the drone strike that killed Reyaad Khan and Ruhul Amin, and many of us will have heard the Prime Minister’s assurances in the Liaison Committee about the robust legal process that was gone through to ensure that the strike was both proportionate and legal.
The challenge—the public policy issue that arises under chapter 5 of the Bill—is that there is a number of new risks. First, there is the legal risk flagged up by the Court of Appeal in 2013, when justices said that it was not clear that UK personnel will be immune from criminal liability for their involvement in a programme that involves the transfer of intelligence from an intelligence service here to an American partner and where that American partner uses that information to conduct drone strikes that involve the loss of life. Confidence levels differ, but we in the Committee are pretty confident about the legal safeguards around those kinds of operations in this country. We can be less sure about the safeguards that some of our partners around the world have in place. The Court of Appeal has expressed its view, which was reinforced in 2016 by the Joint Committee on Human Rights. The Committee echoed the finding that
“front-line personnel…should be entitled to more legal certainty”
than they have today.
This section of the Bill gives us the opportunity to ensure that our intelligence services are equipped with a much more robust framework than they have today, to ensure that they are not subject to the risks flagged by the Court of Appeal or by the Joint Committee on Human Rights.
(6 years, 8 months ago)
Public Bill CommitteesIt is a pleasure to serve under your chairmanship, Mr Hanson. I am pleased to introduce this group of amendments, which relate to data processing for safeguarding purposes. The amendments respond to an issue raised in an amendment tabled by Lord Stevenson on Report in the Lords in December. In response to that amendment, Lord Ashton made it clear that the Government are sympathetic to the points Lord Stevenson raised and undertook to consider the matter further. Amendments 85, 116 and 117 are the result of that consideration.
I am grateful to Lord Stevenson for raising this issue, and for his contribution to what is probably the most important new measure that we intend to introduce to the Data Protection Bill. The amendments will ensure that sensitive data can be processed without consent in certain circumstances for legitimate safeguarding activities that are in the substantial public interest. We have been working across government and with stakeholders in the voluntary and private sectors to ensure that the amendments are fit for purpose and cover the safeguarding activities expected of organisations responsible for children and vulnerable adults.
The Government recognise that statutory guidance and regulator expectations place moral, if not legal, obligations on certain organisations to ensure that measures are in place to safeguard children and vulnerable adults. Amendment 85 covers processing that is necessary for protecting children and vulnerable adults from neglect or physical or mental harm. This addresses the gap in relation to expectations on, for example, sports governing bodies.
The Government have produced cross-agency and cross-governmental guidance called “Working Together to Safeguard Children”, which rightly places the responsibility of safeguarding children on all relevant professionals who come into contact with children and families. For example, it creates an expectation that those volunteering at a local sports club will assess the needs of children and, importantly, will take action to protect them from abuse.
Amendment 85 permits the processing of sensitive personal data, which is necessary to safeguard children from physical, emotional, sexual and neglect-based abuse. Amendment 84 makes a consequential drafting change, while amendments 116 and 117 make an analogous change to the regimes in parts 3 and 4 of the Bill. This is aimed at putting beyond doubt a controller’s ability to safeguard children and people at risk.
I thought an example might help the Committee to understand why we place such an emphasis on the amendments. An example provided by a sports governing body is that a person may make an allegation or complaint about a volunteer that prompts an investigation. Such investigations can include witness statements, which reference sensitive personal data, including ethnicity, religious or philosophical beliefs, sexual orientation and health data.
In some instances, the incident may not reach a criminal standard. In those cases, the sports body may have no legal basis for keeping the data. Keeping a record allows sports bodies to monitor any escalation in conduct and to respond appropriately. Forcing an organisation to delete this data from its records could allow individuals that we would expect to be kept away from children to remain under the radar and potentially leave children at risk.
Amendment 86 deals with a related issue where processing health data is necessary to protect an individual’s economic wellbeing, where that individual has been identified as an individual at economic risk. UK banks have a number of regulatory obligations and expectations which are set out in the Financial Conduct Authority’s rules and guidance. In order to meet best practice standards in relation to safeguarding vulnerable customers, banks occasionally need to record health data without the consent of the data subject.
An example was given of a bank which was contacted by a family member who was alerting the bank to an elderly customer suffering from mental health problems who was drawing large sums of money each day from their bank account and giving it away to a young drug addict whom they had befriended. The bank blocked the account while the family sought power of attorney. Again, the amendment seeks to clarify the position and give legal certainty to banks and other organisations where that sort of scenario arises or where, for example, someone suffers from dementia and family members ask banks to take steps to protect that person’s financial wellbeing.
The unfortunate reality is that there still exists a great deal of uncertainty under current law about what personal data can be processed for safeguarding purposes. My brief of crime, vulnerability and safeguarding means that all too often—perhaps in the context of domestic abuse—agencies will gather, sadly, to conduct a domestic homicide review and discover that had certain pieces of information been shared more freely, perhaps more action could have been taken by the various agencies and adults and children could have been safeguarded.
These amendments are aimed at tackling these issues. We want to stop the practice whereby some organisations have withheld information from the police and other law enforcement agencies for fear of breaching data protection law and other organisations have been unclear as to whether consent to processing personal data is required in circumstances where consent would not be reasonable or appropriate. The amendments intend to address the uncertainty by providing relevant organisations with a specific processing condition for processing sensitive personal data for safeguarding purposes. I beg to move.
I rise to put on record my thanks to the Minister for listening carefully to my noble Friend Lord Stevenson. There was strong cross-party consensus on these common-sense reforms.
We all know that in our own constituencies there are extraordinary people doing extraordinary things in local groups. They are the life-blood of our communities. Many of them will be worried about the new obligations that come with the general data protection regulation and many of them will take a least-risk approach to meeting the new regulations. Putting in place some common safeguards to ensure that it is possible to keep data that allow us to spot important patterns of behaviour that can lead to appropriate investigations is very sensible and wise. These amendments will therefore be made with cross-party support.
Amendment 84 agreed to.
Amendments made: 85, in schedule 1, page 126, line 38, at end insert—
“Safeguarding of children and of individuals at risk
14A (1) This condition is met if—
(a) the processing is necessary for the purposes of—
(i) protecting an individual from neglect or physical, mental or emotional harm, or
(ii) protecting the physical, mental or emotional well-being of an individual,
(b) the individual is—
(i) aged under 18, or
(ii) aged 18 or over and at risk,
(c) the processing is carried out without the consent of the data subject for one of the reasons listed in sub-paragraph (2), and
(d) the processing is necessary for reasons of substantial public interest.
(2) The reasons mentioned in sub-paragraph (1)(c) are—
(a) in the circumstances, consent to the processing cannot be given by the data subject;
(b) in the circumstances, the controller cannot reasonably be expected to obtain the consent of the data subject to the processing;
(c) the processing must be carried out without the consent of the data subject because obtaining the consent of the data subject would prejudice the provision of the protection mentioned in sub-paragraph (1)(a).
(3) For the purposes of this paragraph, an individual aged 18 or over is “at risk” if the controller has reasonable cause to suspect that the individual—
(a) has needs for care and support,
(b) is experiencing, or at risk of, neglect or physical, mental or emotional harm, and
(c) as a result of those needs is unable to protect himself or herself against the neglect or harm or the risk of it.
(4) In sub-paragraph (1)(a), the reference to the protection of an individual or of the well-being of an individual includes both protection relating to a particular individual and protection relating to a type of individual.”
Part 2 of Schedule 1 describes types of processing of special categories of personal data which meet the requirement in Article 9(2)(g) of the GDPR (processing necessary for reasons of substantial public interest) for a basis in UK law (see Clause 10(3)). This amendment adds to Part 2 of Schedule 1 certain processing of personal data which is necessary for the protection of children or of adults at risk. See also Amendments 116 and 117.
Amendment 86, in schedule 1, page 126, line 38, at end insert—
“Safeguarding of economic well-being of certain individuals
14B (1) This condition is met if the processing—
(a) is necessary for the purposes of protecting the economic well-being of an individual at economic risk who is aged 18 or over,
(b) is of data concerning health,
(c) is carried out without the consent of the data subject for one of the reasons listed in sub-paragraph (2), and
(d) is necessary for reasons of substantial public interest.
(2) The reasons mentioned in sub-paragraph (1)(c) are—
(a) in the circumstances, consent to the processing cannot be given by the data subject;
(b) in the circumstances, the controller cannot reasonably be expected to obtain the consent of the data subject to the processing;
(c) the processing must be carried out without the consent of the data subject because obtaining the consent of the data subject would prejudice the provision of the protection mentioned in sub-paragraph (1)(a).
(3) In this paragraph, “individual at economic risk” means an individual who is less able to protect his or her economic well-being by reason of physical or mental injury, illness or disability.”—(Victoria Atkins.)
Part 2 of Schedule 1 describes types of processing of special categories of personal data which meet the requirement in Article 9(2)(g) of the GDPR (processing necessary for reasons of substantial public interest) for a basis in UK law (see Clause 10(3)). This amendment adds to Part 2 of Schedule 1 certain processing of personal data which is necessary to protect the economic well-being of adults who are less able to protect their economic well-being by reason of a physical or mental injury, illness or disability.
I beg to move amendment 150, page 126, line 38, at end insert—
“Register of missing persons
14A This condition is met if the processing—
(a) is necessary for the establishment or maintenance of any register of missing persons, and
(b) is carried out in a manner which is consistent with any guidance which may be issued by the Secretary of State or by the Commissioner on the processing of data for the purposes of this paragraph.”
It is a pleasure to serve under your chairmanship, Mr Hanson. Amendment 150 seeks to provide a similar exemption to the one that the Minister has just laid out. As my right hon. Friend the Member for Birmingham, Hodge Hill said, we completely support the principles behind this exemption to schedule 1. As the Minister made clear, too often serious case reviews or reviews after an incident of this nature, particularly in child protection cases, show clearly that if the data had been shared more effectively—often in health cases—the child could have been protected and their life might have been saved.
We tabled this amendment because of the increase in the number of missing persons and missing children over the past few years. As the shadow Police Minister, I approach this issue from a policing perspective. It is important that all data handlers fully understand their obligations and the powers that are bestowed on them. Too often, under the existing legislation, they hide behind data protection to avoid sharing data, and we fear that that tendency will become even stronger under the Bill.
Sharing data relating to missing persons is important for a number of reasons. The demand on police services from such cases has rocketed over the past few years. Police officers spend only 17% of their time responding to crime, so 83% of police time is spent responding to non-crime demand. That includes mental health call-outs, but largely it relates to missing persons. Some police forces tell me that missing persons place the greatest demand on their time.
In the west midlands, since 2015 the number of missing person incidents has doubled to nearly 13,000 cases a year. In Northumbria—one of the smallest police forces in the country—as of this minute there are 43 men and 20 women missing. For such a small police force, that is a significant number of people to be out looking for. Last year alone, such investigations cost the police service more than £600 million. One fifth of those missing persons are children in care, more than 50% are children, and a significant proportion are elderly people missing from care. Crucially, about one third are reported missing on more than one occasion. It is those individuals we seek to address with the register.
There are various reasons for the increase, one of which is certainly better police reporting. Our ageing population means that more people are in care and are going missing from care. The police have responded to that issue in various ways, including by tagging elderly individuals who go missing from care repeatedly —we have tabled amendments to explore the issues arising from that. Cuts to other public services mean that the increasing demand, which previously would have fallen elsewhere—in particular, on local authorities—is now landing on the police. We are seeing a higher tolerance of risk across the care sector, and possibly the health sector too, and a tendency to pass the buck for these issues and other vulnerabilities on to the police, who have a very low risk threshold and nowhere to pass them on.
I believe we need a review of all agencies that are involved with safeguarding to ensure that they are taking seriously their responsibilities in this regard. When the issue relates to resources, they must make the case for those resources, rather than merely pass the problem on to the police. I have heard stories about private children’s care homes where staff may see that the child is outside their window or down the street, but because they are five minutes over curfew they ring the police and say that the child is missing. That passes on the responsibility, but has very serious implications for the police. It diverts resources from tackling crime and from responding to genuine cases of missing children and high-risk missing persons.
Estimates of the time associated with this activity suggest that approximately 18 hours of police time is needed for a medium-risk missing persons investigation. In 2015-16, that equated to more than 6 million investigation hours, or more than 150,000 officers occupied full time with that activity. Not being dealt with by the appropriate agency and not being responded to correctly has real implications for the individual. Going missing can be a precursor to various aspects of significant harm, such as abuse, exposure to criminal activity and mental ill-health. There are enough issues relating to police forces sharing data among themselves, let alone with other agencies. As a result, various criminal activities exploiting those weaknesses have developed. In the past, the Minister and I have discussed county lines at length, which is a criminal activity whereby organised criminal gangs exploit children. They take them, internally traffick them across the country, set them up in another vulnerable adult’s home and leave them to deal drugs on their behalf. That is a very profitable criminal activity, but the perpetrators have been able to evade real enforcement because of the weaknesses in data sharing and cross-agency working between police forces and agencies. The amendment will ensure that the police and all appropriate safeguarding agencies have access to the relevant data to ensure that at-risk missing people are found as quickly and safely as possible, and have their needs dealt with in the most appropriate way.
Following engagement with local government stakeholders, we have recognised that the maximum time period permitted for responses to the subject access request set out in parts 3 and 4 of the Data Protection Bill subtly differs from that permitted under the GDPR and part 2 of the Bill. That is because the GDPR and, by extension, part 2 rely on European rules for calculating time periods, whereas parts 3 and 4 implicitly rely on a more usual domestic approach. European law, which applies to requests under part 2, says that when one is considering a time period in days, the day on which the request is received is discounted from the calculation of that time period. In contrast, the usual position under UK law, which applies to requests under parts 3 and 4 of the Bill, is that that same seven-day period to respond would begin on the day on which the request was received. In a data protection context, that has the effect of providing those controllers responding to requests under parts 3 and 4 with a time period that is one day shorter in which to respond.
To provide consistency across the Bill, we have decided to include a Bill-wide provision that applies the European approach to all time periods throughout the Bill, thus ensuring consistency with the directly applicable GDPR. Having a uniform approach to time periods is particularly helpful for bodies with law enforcement functions, which will process personal data under different regimes under the Bill. Without these amendments, different time periods would apply, depending on which regime they were processing under. Ensuring consistency for calculating time periods will also assist the information commissioner with her investigatory activities and enforcement powers, for example by avoiding the confusion and potential disputes that could arise relating to her notices or requests for information.
Amendment 71 provides for a number of exemptions to the European approach where deviating from our standard approach to time periods would be inappropriate. For example, where the time period refers to the process of parliamentary approval of secondary legislation, it would clearly not be appropriate to deviate from usual parliamentary time periods. The unfortunate number of amendments in this group comes from the need to modify existing language on time periods, currently worded for compliance with the usual UK approach, so that it applies the approach of the EU rules instead. I hope that this has provided the Committee with sufficient detail on the reasons for tabling this group of amendments.
Amendment 92 agreed to.
Question proposed, That the schedule, as amended, be the First schedule to the Bill.
We had a useful debate this morning about the whys and wherefores of whether the article 8 right to privacy should be incorporated into the Bill. Although we were disappointed by the Minister’s reply, what I thought was useful in the remarks she made was a general appreciation of the importance of strong data rights if the UK is to become a country with a strong environment of trust within which a world of digital trade can flourish.
I will briefly alert the Minister to a debate we want to have on Report. The reality is that we feel schedule 1 is narrowly drawn. It is an opportunity that has been missed, and it is an opportunity for the Minister to come back on Report with a much more ambitious set of data rights for what will be a digital century. When we look around the world at the most advanced digital societies, we can see that a strong regime of data rights is common to them all.
I was recently in Estonia, which I hope the Minister will have a chance to visit if she has not done so already. Estonia likes to boast of its record as the world’s most advanced digital society; it is a place where 99% of prescriptions are issued online, 95% of taxes are paid online and indeed a third of votes are cast online. It is a country where the free and open right to internet access is seen as an important social good, and a good example of a country that has really embraced the digital revolution and translated that ambition into a set of strong rights.
The Government are not averse to signing declaratory statements of rights that they then interpret into law. They are a signatory to the UN universal declaration of human rights and the UN convention on the rights of the child; the Human Rights Act 1998 is still in force—I have not yet heard of plans to repeal it—and of course the Equality Act 2010 was passed with cross-party support. However, those old statements of rights, which date back to 1215, were basically to correct and guard against dangerous imbalances of power. Things have moved on since 1215 and the worries that the barons had about King John. We are no longer as concerned as people were in 1215 about taking all the fish weirs out of the Thames, for example.
I understand the hon. Gentleman’s concerns. The GDPR requires data controls to have a legal basis laid down in law, which can take the form, for example, of a statutory power or duty, or a common-law power. Any organisation that does not have such legal basis would have to rely on one of the other processing conditions in article 6. With regard to the amendment that was agreed to this morning, we think that further restricting clause 8 might risk excluding bodies with a lawful basis for processing. However, the hon. Gentleman is free to raise the issue again on Report.
Question put and agreed to.
Schedule 1, as amended, accordingly agreed to.
Clauses 11 to 13 ordered to stand part of the Bill.
Clause 14
Automated decision-making authorised by law: safeguards
I beg to move amendment 153, in clause 14, page 7, line 30, at end insert—
“(1A) A decision that engages an individual’s rights under the Human Rights Act 1998 does not fall within Article 22(2)(b) of the GDPR (exception from prohibition on taking significant decisions based solely on automated processing for decisions that are authorised by law and subject to safeguards for the data subject’s rights, freedoms and legitimate interests).”
This amendment would clarify that the exemption from prohibition on taking significant decisions based solely on automated processing must apply to purely automated decisions that engage an individual’s human rights.
With this it will be convenient to discuss the following:
Amendment 130, in clause 14, page 7, line 34, at end insert—
“(2A) A decision that engages an individual’s rights under the Human Rights Act 1998 does not fall within Article 22(2)(b) of the GDPR (exception from prohibition on taking significant decisions based solely on automated processing for decisions that are authorised by law and subject to safeguards for the data subject’s rights, freedoms and legitimate interests).
(2B) A decision is “based solely on automated processing” for the purposes of this section if, in relation to a data subject, there is no meaningful input by a natural person in the decision-making process.”
This amendment would ensure that where human rights are engaged by automated decisions these are human decisions and provides clarification that purely administrative human approval of an automated decision does make an automated decision a ‘human’ one.
Amendment 133, in clause 50, page 30, line 5, at end insert “, and
(c) it does not engage the rights of the data subject under the Human Rights Act 1998.”
This amendment would ensure that automated decisions should not be authorised by law if they engage an individual’s human rights.
Amendment 135, in clause 96, page 56, line 8, after “law” insert
“unless the decision engages an individual’s rights under the Human Rights Act 1998”.
The amendments touch on what I am afraid will become an increasing part of our lives in the years to come: the questions of what decisions can be taken by algorithms; where such decisions are taken, what rights we have to some kind of safeguards, such as a good old-fashioned human being looking over the decision that is taken and the outcomes that arise; and whether we are content to acquiesce in the rule of the robots.
In a number of areas of our lives—particularly our economic and social lives—such algorithms will become more and more important. Algorithms are already used to screen job applications, for example, and to create shortlists of candidates for interview. Insurance companies use them to adjudge what premiums someone should enjoy, or whether they should be offered insurance at all. The challenge of algorithms was put best by my hon. Friend the Member for Cambridge on Second Reading: the great risk of such developments is that old injustice is hard-coded into new injustice.
That is particularly troubling when we think about the provisions and exemptions the Government have brought forward that allow the automatic processing of data in public services. Many public servants around the world are beginning to look at predictive public services and how algorithms can scan great swathes of, for example, health data and crime data, and make decisions about where police should attend, who should or should not get bail, who should be added to police databases such as the gangs matrix, and how healthcare should be targeted in parts of the country or to what kinds of families. There are great risks in algorithms taking decisions in ways ungoverned by us. As parliamentarians, we have a particular duty to ensure that the appropriate safeguards are in place.
Clauses 14 and 15 allow automated processes where they are authorised by law. That creates the obligation of giving notice and what is, in effect, an ex post facto right of appeal. The Opposition’s argument is somewhat different: it is better not to take decisions on the basis of automatic processing of data where those decisions affect our human rights.
They say that to err is human, but to really mess things up you need a computer. We all know from our casework, whether about the benefits or the social care system or any other kind of system that constituents might name, that sometimes the most terrible, egregious errors are made. We also know that sometimes it is very difficult for citizens to seek remedies for those problems. Very often, the reason they have come to see us in our surgeries is because, as they so often say to us, we are the last port of call and the last hope that is kicking around; if we cannot fix it, frankly, our constituent is about to give up. That is an unfortunate situation that we do not want to see multiply.
The amendments relate to automated decision making under the GDPR and the Bill. It is a broad category, which includes everything from trivial things such as music playlists, as mentioned by the hon. Member for Argyll and Bute, and quotes for home insurance, to the potentially more serious issues outlined by the right hon. Member for Birmingham, Hodge Hill of recruitment, healthcare and policing cases where existing prejudices could be reinforced. We are establishing a centre, the office for artificial intelligence and data ethics, and are mindful of these important issues. We certainly do not dismiss them whatsoever.
Article 22 of the GDPR provides a right not to be subject to a decision based solely on automatic processing of data that results in legal or similarly significant effects on the data subject. As is set out in article 22(2)(b), that right does not apply if the decision is authorised by law, so long as the data subject’s rights, freedoms and legitimate interests are safeguarded.
The right hon. Member for Birmingham, Hodge Hill, mentioned those safeguards, but I attribute far greater meaning to them than he implied in his speech. The safeguards embed transparency, accountability and a right to request that the decision be retaken, and for the data subject to be notified should a decision be made solely through artificial intelligence.
The Minister must realise that she is risking an explosion in the number of decisions that have to be taken to Government agencies or private sector companies for review. The justice system is already under tremendous pressure. The tribunal system is already at breaking point. The idea that we overload it is pretty optimistic. On facial recognition at public events, for example, it would be possible under the provisions that she is proposing for the police to use facial recognition technology automatically to process those decisions and, through a computer, to have spot interventions ordered to police on the ground. The only way to stop that would be to have an ex post facto review, but that would be an enormous task.
The right hon. Gentleman should be aware that just because something is possible, it does not mean that it is automatically translated into use. His example of facial recognition and what the police could do with that technology would be subject to controls within the police and to scrutiny from outside.
As the hon. Lady says, the police are trialling those things. I rest my case—they have not put them into widespread practice as yet.
Returning to the GDPR, we have translated the GDPR protections into law through the Bill. As I said, the data subject has the right to request that the decision be retaken with the involvement of a sentient individual. That will dovetail with other requirements. By contrast, the amendments are designed to prevent any automated decision-making from being undertaken under article 22(2)(b) if it engages the rights of the data subject under the Human Rights Act 1998.
Will the Minister explain to the Committee how a decision to stop and search based on an automated decision can be retaken? Once the person has been stopped and searched, how can that activity be undone?
I am not going to get into too much detail. The hon. Member for Sheffield, Heeley mentioned an area and I said that it was just a trial. She said that facial recognition was being piloted. I do not dispute that certain things cannot be undone. Similar amendments were tabled in the other place. As my noble Friend Lord Ashton said there, they would have meant that practically all automated decisions under the relevant sections were prohibited, since it would be possible to argue that any decision based on automatic decision making at the very least engaged the data subject’s right to have their private life respected under article 8 of the European convention on human rights, even if it was entirely lawful under the Act.
I fear that the Minister is taking some pretty serious gambles on the application of this technology in the future. We think it is the business of this place to ensure that our citizens have strong safeguards, so we will put the amendment to a vote.
Question put, That the amendment be made.
Amendments 10, 11 and 12 relate to clause 14, which requires a data controller to notify a data subject of a decision based solely on automatic processing as soon as is reasonably practicable. The data subject may then request that the data controller reconsider such a decision and take a new decision not based solely on automated processing.
The purpose of the amendments is to bring clause 14 into alignment with the directly applicable time limits in article 12 of the GDPR, thereby ensuring that both data subjects and data controllers have easily understandable rights and obligations. Those include giving the data subject longer to request that a decision be reconsidered, requiring that the controller action the request without undue delay and permitting an extension of up to two months where necessary.
Furthermore, to ensure that there is consistency across the different regimes in the Bill—not just between the Bill and the GDPR—amendments 23, 24, 41 and 42 extend the time limit provisions for making and responding to requests in the other regimes in the Bill. That is for the simple reason that it would not be right to have a data protection framework that applies one set of time limits to one request and a different set of time limits to another.
In a similar vein, amendments 27 and 28 amend part 3 of the Bill, concerning law enforcement processing, to ensure that controllers can charge for manifestly unfounded or excessive requests for retaking a decision, as is permitted under article 12 of the law enforcement directive. To prevent abuse, amendment 28 provides that it is for the controller to be able to show that the request was manifestly unfounded or excessive.
It would be useful if the Minister could say a little more about the safeguards around the controllers charging reasonable fees for dealing with requests.
It is quite easy to envisage situations where algorithms take decisions. We have some ex post facto review; a citizen seeks to overturn the decision; the citizen thinks they are acting reasonably but the commercial interest of the company that has taken and automated the decision means that it wants to create disincentives for that rigmarole to unfold. That creates the risk of unequal access to justice in these decisions.
If the Minister is not prepared to countenance the sensible safeguards that we have proposed, she must say how she will guard against another threat to access to justice.
The right hon. Gentleman asks a reasonable question. I did not mention that data subjects have the right of complaint to the Information Commissioner if the provisions are being abused. I also did not mention another important safeguard, which is that it is for the data controller to show that the request is manifestly unfounded or excessive. So the burden of proof is on the data controller and the data subject has the right of involving the Information Commissioner, if he or she contests the judgment taken in this context, concerning unfounded or excessive requests in the opinion of the data controller. I hope that satisfies the right hon. Gentleman.
Amendment 10 agreed to.
Amendments made: 11, in clause 14, page 8, leave out line 10 and insert “within the period described in Article 12(3) of the GDPR—”
This amendment removes provision from Clause 14(5) dealing with the time by which a controller has to respond to a data subject’s request under Clause 14(4)(b) and replaces it with a requirement for the controller to respond within the time periods set out in Article 12(3) of the GDPR, which is directly applicable.
Amendment 12, in clause 14, page 8, line 16, at end insert—
‘(5A) In connection with this section, a controller has the powers and obligations under Article 12 of the GDPR (transparency, procedure for extending time for acting on request, fees, manifestly unfounded or excessive requests etc) that apply in connection with Article 22 of the GDPR.” —(Margot James.)
This amendment inserts a signpost to Article 12 of the GDPR which is directly applicable and which confers powers and places obligations on controllers to whom Clause 14 applies.
Clause 14, as amended, ordered to stand part of the Bill.
Clause 15
Exemptions etc.
I beg to move amendment 156, in schedule 2, page 136, line 30, leave out paragraph 4.
This amendment would remove immigration from the exemptions from the GDPR.
We are trying to provide some careful and considered constraints on the exemptions that the Government are asking for, in particular the exemptions that Ministers seek for the purposes of immigration control.
The Bill has been drafted essentially to enable the Home Office to do two things: win cases and create a hostile environment for those who are here illegally, where it has no capacity to trace and deport individuals. In conducting its work, the Home Office draws on a wide range of private providers, from G4S to Cifas. They have a mixed record, including on data protection. The carve-out that the Government seek for immigration purposes has caused widespread concern. It has drawn concern from the other place, the Information Commissioner and the Joint Committee on Human Rights.
The Minister will try to assure us by saying there are safeguards wrapped around the exemption and that there are limits on the way it can be used, but those limits are drawn so vaguely and broadly that they are not safeguards at all. They have been drafted to apply where matters are likely to prejudice immigration control. Who gets to judge the likelihood of prejudicing immigration control is not terrifically clear. In my Home Office days, we used to call that carte blanche.
Through the powers and exemptions in the Bill, the Home Office seeks to collect data for one purpose and then use it without informed consent. Where the rubber hits the road is that, crucially, the effect will be to ensure that subject access requests are basically put beyond the scope of someone seeking information that they might be able to use either in representations that we all might make to Ministers or, more importantly, in an immigration tribunal.
I want to sound a warning note to the Minister, as I hinted on Second Reading. I was brought into the Home Office as a Minister in 2006 and, after a glorious fortnight as Minister for Police and Counter-terrorism, I was moved by my boss John Reid to become Immigration Minister, where I was asked to conduct the biggest shake-up of our immigration system for 40 years.
I created the UK Border Agency; I took UK visas out of the Foreign Office; I took Customs out of the Treasury. We created a Border Agency that could run a biometric visa programme abroad, checking fingerprints against police national computers before anyone got on a train, plane or boat to our country. We introduced much stronger controls at the border, increasing those nice new blue signs, creating smart uniforms for immigration officials, and we increased immigration policing by around £100 million a year
I said earlier that to err is human but it takes a computer really to foul things up. That is a lesson that I learned with some force during my time at the Home Office. The dedicated, fantastic officials in the Home Office and the extraordinary officers who work in what was the UK Border Agency—it has since been revised a couple of times—do an amazing job. They are dramatically underfunded by the Treasury. They have been underfunded by the Treasury under this Government and, in my view, we did not get enough out of the Treasury in my day.
However, they are human and make mistakes. That is why we have such a complicated immigration tribunal system, where people can take their complaints to a first tier tribunal but very often need to seek a judicial review down the line. The challenge is that, if the Home Office wants to create a process and an administration for making the right decision, which can be defended in a tribunal and in a judicial review case, that process must be robust. When we streamlined the immigration tribunal system, we realised that we had to change, improve and strengthen the way that we took decisions in the Home Office because too many were made in a way that was not JR-proof. We were losing JRs and therefore denying justice to those who brought a legitimate claim against the Crown.
There were occasions when I lost cases because of information that was disclosed to the applicant through a subject access review. SARs are one of the most powerful instruments by which anybody in this country, whether a citizen or someone applying to become a citizen, or applying for a legal right to remain, can acquire information that is crucial to the delivery of justice. Many of us are incredibly sympathetic to the job that the Home Office does. Many of us will want a tougher regime in policing immigration, in particular illegal immigration, but I suspect every member of the Committee is also interested in the good conduct of justice and administrative justice. As someone who served in the Home Office for two years, I had to take some very difficult decisions, including to release subject access request information that I absolutely did not want to go into the public domain. Sometimes it was right to release that information because it helped ensure that justice was done in the courts of this land.
The Minister has some very strong safeguards in the Bill. There are strong safeguards that create exemptions for her where the interest is in crime prevention, such as, for example, illegal immigration. However, the power that the provision seeks, at which we take aim in our amendments, is a step too far and risks the most terrible injustices. It risks the courts being fouled up and our being challenged in all sorts of places, including the European Court of Human Rights in the years to come. It is an unwise provision. If I were a Home Office official, I would have tried it on—I would have tried to get it through my Minister and through the Houses of Parliament, but it is unwise and a step too far. I hope the Minister will accept the amendment and delete the provisions.
I will speak in favour of amendment 156. On Second Reading, I said that I would raise this matter again in Committee and I make no apologies for doing so. We regard this new exemption as extremely concerning. It permits the Government to collect and hold data for the purposes of what they describe as “effective immigration”.
It also concerns me that nowhere in the Bill does there seem to be a legal definition of effective immigration control. I am worried that “effective immigration control” is highly subjective and highly politicised. It exposes individuals, weakens their rights and makes them vulnerable to whatever change in the political tide happens to come along next. This broad-ranging exemption is fundamentally unfair. It is open to abuse and runs contrary to safeguarding basic human rights. I believe that the UK’s proposed immigration exemption goes much further than the scope of restrictions afforded to member states under GDPR, with all the consequences of that, which we discussed in such great detail this morning around adequacy decisions.
The Under-Secretary of State will know better than anybody that there are very tight time limits over the windows within which people can ask for entry clearance officer reviews or reconsideration, either by an immigration official or, in extremis, by the Minister. How long will the pause last, and can she guarantee the Committee today that the pause will never jeopardise the kick-in of time limits on an appeal or a reconsideration decision?
The reason for the pause is—I will give case studies of this—to enable the immigration system to operate. If someone has gone missing, requests for data will be required to find that person. Once that person is found, and there is no longer a need to apply the exemption, it will be lifted.
That is not an answer to my question. I am asking for a guarantee to the Committee this afternoon that the pause will never jeopardise somebody’s ability to submit a valid request for a reconsideration or an appeal with the information that they need within the time windows set out by Home Office regulations—yes or no.
I am asked whether this will have an impact on someone’s application, either at appeal or reconsideration. Of course, information is obtained so that a person can be brought in. As I say, I will make it clear with case studies, so perhaps I can answer the right hon. Gentleman in more detail when I give such an example, but the purpose of this is generally to find a person. When the need, as set out under the exemption, no longer exists, the rights kick back in again. This relates only to the first two data protection principles under the GDPR. Again, I will go into more detail in a moment, but this is not the permanent exemption from rights as perhaps has been feared by some; it is simply to enable the process to work. Once a person has been brought into the immigration system, all the protections of the immigration system remain.
I will move on to the case studies in a moment, as I have given way several times. First, I will lay out the titles, then I will come on to article 23. Again, our analysis is that the provision fits within one of the exemptions in article 23. That is precisely the reason that we have drawn it in this way.
We very much welcome the enhanced rights and protections for data subjects afforded by the GDPR. The authors of the GDPR accepted that at times those rights need to be qualified in the general public interest, whether to protect national security, the prevention and detection of crime, the economic interests of the country or, in this case, the maintenance of an effective system of immigration control. Accordingly, a number of articles of the GDPR make express provision for such exemptions, including article 23(1)(e), which enables restrictions to be placed on certain rights of data subjects. Given the extension of data subjects’ rights under the GDPR, it is necessary to include in the Bill an explicit targeted but proportionate exemption in the immigration context.
The exemption would apply to the processing of personal data by the Home Office for the purposes of
“the maintenance of effective immigration control, or…the investigation or detection of activities that would undermine the maintenance of effective immigration control”.
It would also apply to other public authorities required or authorised to share information with the Department for either of those specific purposes.
Let me be clear on what paragraph 4 of schedule 2 does not do. It categorically does not set aside the whole of the GDPR for all processing of personal data for all immigration purposes. It makes it clear that the exemption applies only to certain GDPR articles. The articles that the exemption applies to are set out in paragraph 4(2) of schedule 2. They relate to various rights of data subjects provided for in chapter 3 of the GDPR, such as the rights to information and access to personal data, and to two of the data protection principles—namely the first one, which relates to fair and transparent processes, and the purpose limitation, which is the second one.
As I understand it, the derogations that are sought effectively remove the right to information in article 13; the right to information where data is obtained from a third party in article 14; the right of subjects’ access in article 15; the right to erasure in article 17; the right to restriction of processing in article 18; the right to object in article 21(1); the principle of lawful, fair and transparent processing in article 5; the principle of purpose limitation in article 5(1)(b); and the data protection principles in article 5 of lawfulness, fairness, transparency, purpose limitation, data minimisation, accuracy, storage limitation, integrity, confidentiality and accountability to the extent that they correspond to the rights above. That is a pretty broad set of rights to be cast out.
Those are not the data protection principles. If one continues to read on to paragraph 4(2)(b) of schedule 2, it sets out the two data protection principles that I have just highlighted. The provisions set out in sub-paragraph (2)(a) relate to the data protection principles of fair and transparent processing and the purpose limitation. As I say, this is not a permanent removal. This is, as we describe it, a pause. There is not a free hand to invoke the permitted exception as a matter of routine.
All of the data protection principles, including those relating to data minimisation, accuracy, storage limitation and integrity and confidentiality, will continue to apply to everyone. So, too, will all the obligations on data controllers and processors, all the safeguards around cross-border transfers, and all the oversight and enforcement powers of the Information Commissioner. The latter is particularly relevant here, as it is open to any data subject affected by the provisions in paragraph 4 of schedule 2 to make a complaint to the Information Commissioner that the commissioner is then under a duty to investigate. Again, I hope that that addresses some of the concerns that the hon. Member for Argyll and Bute raised.
Contrary to the impression that has perhaps been given or understood, paragraph 4 does not give the Home Office a free hand to invoke the permitted exceptions as a matter of routine. The Bill is clear that the exceptions may be applied only to the extent that the application of the rights of data subjects, or the two relevant data protection principles, would be likely to prejudice
“the maintenance of effective immigration control, or…the investigation or detection of activities that would undermine the maintenance of effective immigration control”.
That is an important caveat.
The Minister will know that in paragraph 2(1)(a) we already have a set of exemptions that relate to the prevention or detection of a crime, including, presumably, all of the crimes that fall into the bucket of organising or perpetrating illegal immigration. Despite constant pressing during the debate in the other place and here, we have not yet had a clear answer as to why additional powers and exemptions are needed, over and above the powers expressly granted and agreed in paragraph 2(1)(a).
I am grateful to the right hon. Gentleman for raising that issue, because it allows me to get to the nub of how we approach the immigration system. We do not see the immigration system as some form of criminality or as only being open to the principles of criminal law. He will know that we deal with immigration in both the civil law and criminal law contexts. The exemption he has raised in terms of paragraph 2 of the schedule deals with the criminal law context, but we must also address those instances where the matter is perhaps for civil law.
We know that in the vast majority of immigration cases, people are dealt with through immigration tribunals or through civil law. They are not dealt with through criminal law. That is the point; we must please keep open the ability to deal with people through the civil law system, rather than rushing immediately to criminalise them. If, for example, they have overstayed, sometimes it is appropriate for the criminal law to become involved, but a great number of times it is for the civil law to be applied to deal with that person’s case either by way of civil penalty or by finding an arrangement whereby they can be given discretion to leave or the right to remain. We have the exemption in paragraph 4 so that we do not just focus on the criminal aspects that there may be in some immigration cases. We must ensure that we also focus on the much wider and much more widely used civil law context.
It is important to recognise that the exemptions will not and cannot be targeted at whole classes of vulnerable individuals, be they victims of domestic abuse or human trafficking, undocumented children or asylum seekers. The enhanced data rights afforded by the GDPR will benefit all those who are here lawfully in the United Kingdom, including EU citizens. The relevant rights will be restricted only on a case-by-case basis where there is evidence that the prejudice I have mentioned is likely to occur.
If someone has overstayed, they have committed a crime. Therefore, paragraph 2(1)(a) absolutely bites. We are seeking to prevent that crime. Someone who has overstayed their visa has committed a crime. It is kind of as simple as that.
In that scenario, we may well effect their removal administratively. It does not mean that it is going through the criminal courts.
By way of a second example, take a case where the Home Office is considering an application for an extension of leave to remain in the UK. It may be that we have evidence that the applicant has provided false information to support his or her claim. In such cases, we may need to contact third parties to substantiate the veracity of the information provided in support of the application. If we are then obliged to inform the claimant that we are taking such steps, they may abscond and evade detection.
If someone has submitted false information in support of an application to the Government, and signed it, as they must, that is called fraud. That is also a crime, and is covered by paragraph 2(1)(a).
I take the right hon. Gentleman’s point, particularly in relation to the overstayer, but as the purpose of processing personal data in many immigration areas is not generally the pursuit of criminal enforcement action, it is not clear that it would be appropriate in all cases to rely on crime-related exemptions, where the real prejudice lies in our ability to take administrative enforcement action. It may well be that in some cases a crime has been committed, but that will not always be the case.
Criminal sanctions are not always the correct and proportionate response to people who are in the UK without lawful authority. It is often better to use administrative means to remove such a person and prevent re-entry, rather than to deploy the fully panoply of the criminal justice system, which is designed to rehabilitate members of our communities. As the purpose of processing personal data in such cases is not generally the pursuit of a prosecution, it is not clear that we could, in all cases, rely on that exemption relating to crime.
If I may, I will continue with my speech, because I have more information to give. Perhaps at the end I can deal with the hon. Gentleman’s point.
I just want to dissolve one confusion in the Minister’s remarks. The nature of the Home Office response, whether it is a prosecution through a civil court, a civil sanction or a civil whatever else, does not affect the nature of the offence that is committed. The Home Office has a range of sanctions and choices in responding to an offence, but that does not stop the offence being an offence. The offence is still a crime, and is therefore covered by paragraph 2(1)(a).
The right hon. Gentleman is assuming that each and every immigration case that will be covered by these provisions necessitates the commission of a crime.
I would not make that assumption. The vast majority of immigration cases are dealt with in a civil context.
No—the child is not missing, but the parent is; so we seek advice from the Department for Education about where the child is. It may be that cleverer lawyers than me in the Home Office will find an exemption for that, but the point of this exemption of paragraph 4 is to cover the lawfulness of the Home Office in seeking such information in order to find parents or responsible adults who may have responsibility, and either to regularise their stay or to remove them.
I encourage the right hon. Member for Birmingham, Hodge Hill to withdraw his amendment, as we believe that it is not the wholesale disapplication of data subjects’ rights, and it is a targeted provision wholly in accordance with the discretion afforded to member states by the GDPR and is vital to maintaining the integrity and effectiveness of our immigration system.
Anyone who was not alarmed by this provision certainly will leave this Committee Room thoroughly alarmed by the Minister’s explanations.
First, we were invited to believe that we could safeguard due process and the rights of newcomers to this country by suspending those rights and pursuing people through civil court. We were then asked to believe that the Home Office’s ambition to deal with these cases with civil response rendered inoperable the powers set out in paragraph 2(1)(a), confusing the response from the Home Office and the nature of the offence committed up front. Then, we were invited to believe that this was not a permanent provision—even though that safeguard is not written into the Bill—but a temporary provision. What is not clear is when those temporary provisions would be activated and, crucially, when they would be suspended.
I am happy to give way in a moment. Most of us here who have done our fair share of immigration cases—I have done several thousand over the last 14 years—know that on some occasions, the Home Office interpretation of time is somewhat different from a broadly understood interpretation of time. I have cases in which a judge has ordered the issue of a visa, and six months later we are still chasing the Home Office for the issue of the visa. I will not be alone in offering these examples.
Perhaps when the Minister intervenes, she could set out what “temporary” means, where it is defined and where are the limits, and she still has not answered my question whether she will guarantee that the implementation of this pause will not jeopardise someone’s ability to submit either a request for an entry clearance officer review or an appeal within the legally binding time windows set out in Home Office regulations.
The key to this is the purpose for which we are processing the data. Even if there are criminal sanctions, that does not mean that we are processing for that purpose, particularly where we are not likely to pursue a prosecution. The primary purpose is often immigration control—that does not fit under paragraph 2 as he has described it—rather than enforcing the criminal justice system. That is the point. It is for the purpose of processing the data. The crime-related provisions in the Bill refer to the importance of identifying the purposes of the processing. Where the primary purpose is immigration related, it is not clear that we could rely on the crime-related exemptions. That is why paragraph 4 is in the schedule.
I am really sorry to have to say this, but that is utter nonsense. The idea that the Home Office will seek to regularise someone’s immigration status by denying them access to information that might support their case is, frankly, fanciful.
This is not a new debate; we last had it in 1983. The Home Office tried to sketch this exemption into legislation then, it failed, and we should not allow the exemption to go into the Bill, especially given that all the explanations we have heard this afternoon are about cases where paragraph 2(1)(a), or the safeguarding provisions drafted by the Government, would provide the necessary exemptions and safeguards in the contingencies that the Minister is concerned about.
I feel for the Under-Secretary, because she is on a bit of a sticky wicket given the Government’s drafting, but does my right hon. Friend agree that it is concerning that I asked twice to be pointed to specifics—I asked first how the pause is drafted in the Bill, and secondly where the word “immigration” appears under article 23 of the GDPR—but on neither occasion was I was pointed to them? We ought also to draw the Committee’s attention to the report on the Bill by the Joint Committee on Human Rights, which states:
“The GDPR does not expressly provide for immigration control as a legitimate ground for exemption.”
My hon. Friend is bang on the money, but perhaps the Under-Secretary can enlighten us.
All rights are reinstated once the risk to prejudice is removed. The wording is in line 35 of paragraph 4:
“to the extent that the application of those provisions would be likely to prejudice any of the matters mentioned in paragraphs (a) and (b).”
To reassure the hon. Member for Bristol North West, that is the end point.
I am grateful to the Under-Secretary for clarifying a point that was not at issue. No one is concerned about what rights kick back in at the end of a process. We are worried about how long the process will last, who will govern it, what rights newcomers to this country or courts will have to enforce some kind of constraint on the process and how we will stop the Home Office embarking on unending processes in a Jarndyce v. Jarndyce-like way, which we know is the way these cases are sometimes prosecuted. The Home Office is full of some of the most amazing civil servants on earth, but perhaps, a little like the Under-Secretary, they are sometimes good people trapped in bad systems and, dare I say it, bad arguments.
Question put, That the amendment be made.
I beg to move amendment 170, in schedule 2, page 151, line 8, at end insert—
“(f) in Chapter IX of the GDPR (provisions relating to specific processing situations), Article 89(1) (safeguards and derogations relating to processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes).”
This amendment adds the restrictions imposed on archiving by the GDPR and the Bill to the list of matters in the Bill that benefit from the Journalism, Art and Literature exemption.
The purpose of this amendment is to protect some of our important national archives. We in this country are some of the greatest collectors on earth; the tradition established by Sir Hans Sloane all those centuries ago inspired many generations that followed him. Our ability and our tradition of collecting mean that this country is now home to some of the greatest collections on the planet.
It is fantastic to see many of these institutions now rapidly digitalising those archives. I was privileged to be able to visit the Natural History Museum recently, which I think is home to something like 83 million different specimens. It is now beginning to digitalise those archives in a way that opens them up not only to our schoolchildren, but to citizens of this country and those around the world who are keen on science.
The point of this amendment is that we cannot simply preserve those archives in aspic. They must be dynamic resources; they must be added to, and our success or failure in that task has a crucial bearing on the health of our democracy and our ability to, dare I say it, reflect on past mistakes and do better. I think it was the legendary Karl Popper who once said, “To err is human, to correct divine.”
We make mistakes. It is important that we reflect on the mistakes we have made in the past, in order to do better next time around. Many of the more contemporary archives, particularly news archives, have had a crucial bearing on inquiries into historical child abuse, the injustices perpetrated at Hillsborough and at Orgreave, and HIV-contaminated blood. All those inquiries relied on records that were not necessarily historical; many were contemporary.
A range of crucial organisations entrusted with the delicate task of keeping our archives up to date are seriously worried about the provisions in the GDPR. In fact, they believe the inadequacy of the derogations and exemptions in the GDPR, as it is proposed that we draft it into law, means that they will be quickly put out of business. In particular, that will bite on thousands of smaller archives.
The point they have consistently made to us is that, although we have such great collections and archives in this country and a public interest culture around protecting some of those archives, we do not have any of the kind of legal protections that they enjoy in countries such as France. We do not have the defendable protections around archives that those abroad benefit from.
The challenge in this Bill is a lack of precision. I do not want to pretend that this is a black-and-white case. Sometimes news archives in particular will be required to draw something of a grey line, and I am afraid the Minister has to earn her pay and be the one to decide where to draw that grey line. Sometimes there will be information stored in those archives that absolutely should be subject to the GDPR provisions. But if we are in effect granting a carte blanche for people to make requests of archives that require those archives to dip deep into the historical record, correct things and go through challenging processes to ensure they are right, I am afraid it will put a number of our archives out of business, and that will damage the health of our democracy.
We have drafted this amendment with a number of aims. We want to try to create a statutory definition for organisations that archive in the public interest. We have had a first attempt at drawing that in a narrow way, so it does not infringe on material that is stored that absolutely should be subject to general GDPR provisions. We have done our best to ensure that the archiving exemptions are proportionate to the public interest nature of the material being archived. We wanted to offer an amendment worded hopefully in such a way that, frankly, it excludes Google, Facebook and others from enjoying the exemptions sought here.
This is the first place in the Bill where the debate rears its head. I am grateful to the range of museums, archives and the BBC that have helped us to craft this amendment. It should not be particularly controversial. There should be agreement across the Committee on the need to protect our great collections, yet keep some companies, such as Google and Facebook, subject to the provisions in the Bill.
We offer the amendment as a starter for 10. Obviously, we would be delighted if the Government accepted it; we would be even more pleased if they could perfect it.
I have just had a request to remove jackets, because of the warm temperature in the room. I give my permission to do so. I call the Minister.
Thank you, Mr Hanson. I agree with the tribute paid by the right hon. Member for Birmingham, Hodge Hill to the custodians of some of the most wonderful archives in the world. I will comment on his proposals with regard to such archives shortly, but I hope that recent debates have left no doubt in hon. Members’ minds that the Government are absolutely committed to preserving the freedom of the press, and maintaining the balance between privacy and freedom of expression in our existing law, which has served us well for so many years.
As set out in the Bill, media organisations can already process data for journalistic purposes, which includes media archiving. As such, we believe that amendment 170 is unnecessary and could be unhelpful. I agree with the right hon. Gentleman that it is crucial that the media can process data and maintain media archives. In the House of Lords, my noble Friend Lord Black of Brentwood explained very well the value of media archives. He said:
“Those records are not just the ‘first draft of history’; they often now comprise the only record of significant events, which will be essential to historians and others in future, and they must be protected.”—[Official Report, House of Lords, 10 October 2017; Vol. 785, c. 175.]
However, recital 153 indicates that processing for special purposes includes news archiving and press libraries. Paragraph 24 of schedule 2 sets out the range of derogations that apply to processing for journalistic purposes. That includes, for example, exemption from complying with requests for the right to be forgotten. That means that where the exemption applies, data subjects would not have grounds to request that data about them be deleted. It is irrelevant whether the data causes substantial damage or distress.
However, if media organisations are archiving data for other purposes—for example, in connection with subscriber data—it is only right that they are subjected to the safeguards set out in article 89(1), and the Bill provides for that accordingly. For that reason, I hope that the right hon. Gentleman agrees to reconsider his approach and withdraw his amendment.
I am happy to withdraw the amendment, although I would say to the Minister that the helpful words we have heard this afternoon will not go far enough to satisfy the objections that we heard from organisations. We reserve the right to come back to this matter on Report. We will obviously consult the organisations that helped us to draft the amendment, and I urge her to do the same. I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Schedule 2, as amended, agreed to.
Schedule 3
Exemptions etc from the GDPR: health, social work, education and child abuse data
Amendments made: 111, in schedule 3, page 160, line 21, leave out
“with the day on which”
and insert “when”.
This amendment is consequential on Amendment 71.
Amendment 112, in schedule 3, page 162, line 3, leave out paragraph 16 and insert—
“16 (1) This paragraph applies to a record of information which—
(a) is processed by or on behalf of the Board of Governors, proprietor or trustees of, or a teacher at, a school in Northern Ireland specified in sub-paragraph (3),
(b) relates to an individual who is or has been a pupil at the school, and
(c) originated from, or was supplied by or on behalf of, any of the persons specified in sub-paragraph (4).
(2) But this paragraph does not apply to information which is processed by a teacher solely for the teacher’s own use.
(3) The schools referred to in sub-paragraph (1)(a) are—
(a) a grant-aided school;
(b) an independent school.
(4) The persons referred to in sub-paragraph (1)(c) are—
(a) a teacher at the school;
(b) an employee of the Education Authority, other than a teacher at the school;
(c) an employee of the Council for Catholic Maintained Schools, other than a teacher at the school;
(d) the pupil to whom the record relates;
(e) a parent, as defined by Article 2(2) of the Education and Libraries (Northern Ireland) Order 1986 (S.I. 1986/594 (N.I. 3)).
(5) In this paragraph, “grant-aided school”, “independent school”, “proprietor” and “trustees” have the same meaning as in the Education and Libraries (Northern Ireland) Order 1986 (S.I. 1986/594 (N.I. 3)).”
This amendment expands the types of records that are “educational records” for the purposes of Part 4 of Schedule 3.
Amendment 113, in schedule 3, page 164, line 7, leave out
“with the day on which”
and insert “when”.—(Margot James.)
This amendment is consequential on Amendment 71.
Schedule 3, as amended, agreed to.
Schedule 4 agreed to.
Clause 16
Power to make further exemptions etc by regulations
Question proposed, That the clause stand part of the Bill.
This morning we had a discussion about some of the Henry VIII clauses contained in the Bill. In essence, I said that when we are talking about personal information—particularly, in such circumstances, sensitive personal information—there should be a strong presumption against Henry VIII clauses, with the onus being on the Government to justify why delegated legislation is the appropriate way to make changes to our data protection rules.
Throughout the passage of the Bill we will continue to challenge the Government to justify delegated powers proposed under the Bill. This clause is the next example of that arising, so in our view it falls on the Minister to explain why she seeks delegated authority to exercise certain functions under the GDPR. I look forward to hearing what she has to say.
We agree that the clause offers Ministers a rather sweeping power to introduce new regulations. Over the course of what has been quite a short day in Committee we have heard many reasons to be alarmed about equipping Ministers with such sweeping powers. We proposed an amendment to remove the clause, which I think was not selected because we have this stand part debate. What we need to hear from the Minister are some pretty good arguments as to why Ministers should be given unfettered power to introduce such regulations without the effective scrutiny and oversight of right hon. and hon. Members in this House.
I am glad that the right hon. Gentleman feels we have had a short day in Committee. In answer to his questions and those of the hon. Gentleman, the order making powers in clauses 16 and 113 allow the Secretary of State to keep the list of exemptions in schedules 2 to 4 and 11 up to date. As I mentioned when we discussed order making powers in relation to clause 10 and schedule 1, we carefully reviewed the use of such powers in the Bill following recommendations from the Delegated Powers and Regulatory Reform Committee. We think an appropriate balance has now been struck. It might be helpful if I explain the reasons for our thinking.
Clause 16 includes order making powers to ensure that the Secretary of State can update from time to time the particular circumstances in which data subjects’ rights can be disapplied. That might be necessary if, for example, the functions of a regulator are expanded and exemptions are required to ensure that those new functions cannot be prejudiced by a data subject exercising his or her right to object to the processing.
We believe it is very important that the power to update the schedules is retained. Several of the provisions in schedules 2 to 4 did not appear in the Data Protection Act 1998 and have been added to the Bill to address specific requirements that have arisen over the last 20 years.
For example, the regulatory landscape has changed dramatically since the 1998 Act. Organisations such as the Bank of England, the Financial Conduct Authority and the National Audit Office have taken on a far broader range of regulatory functions, and that is reflected in the various amendments we have tabled to paragraphs 7 to 9 of schedule 2, to provide for a broader range of exemptions. No doubt, there will be further changes to the regulatory landscape in the years to come. Of course, other exemptions in schedule 2 have been carried over from the 1998 Act, or indeed from secondary legislation made under that Act, with little change. That does not mean, however, that they will never need to be amended in the future. Provisions made under the 1998 Act could be amended via secondary legislation, so it would seem remiss not to afford ourselves that same degree of flexibility now. If we have to wait for primary legislation to make any changes, it could result in a delay of months or possibly years to narrow or widen an extension, even where a clear deficiency had been identified. We cannot predict the future, and it is important that we retain the power to update the schedules quickly when the need arises.
Importantly, any regulations made under either clause would be subject to the affirmative resolution procedure. There would be considerable parliamentary oversight before any changes could be made using these powers. Clause 179 requires the Secretary of State to consult with the Information Commissioner and other interested parties that he considers appropriate before any changes are made.
I hope that that reassures Members that we have considered the issue carefully. I commend clause 16 to the Committee.
Question put, That the clause stand part of the Bill.
The Committee proceeded to a Division.
(6 years, 8 months ago)
Westminster HallWestminster Hall is an alternative Chamber for MPs to hold debates, named after the adjoining Westminster Hall.
Each debate is chaired by an MP from the Panel of Chairs, rather than the Speaker or Deputy Speaker. A Government Minister will give the final speech, and no votes may be called on the debate topic.
This information is provided by Parallel Parliament and does not comprise part of the offical record
It is a privilege to serve under your chairmanship, Mr Hollobone. Not so long ago, the Government invited us to believe that it was possible to cut crime and cut the police at the same time. Over the last couple of years the idiocy of that idea has been exposed for all to see. The truth is that crime—violent crime in particular—is now rising, and on the streets of my constituency there is real concern about the growth of dealing in drugs out on the streets, often in broad daylight. When people report that problem, the police simply do not have the resources to respond in the way that the community wants and expects.
In the west midlands, as I know from my constituency, we are blessed with some of the greatest police officers in the business. It was five years ago that I had to go and give thanks to PC Adam Koch, who had literally thrown himself onto a knifeman in one of our mosques in Ward End. He put his life on the line to protect the lives of the worshippers in that mosque. Today, we have great police officers such as Sergeant Hanif, who leads an extraordinary team across east Birmingham, cracking down on drugs and drug dealing, seizing the proceeds of crime and taking firearms off the streets at every opportunity. The relationship of trust that he has built with the community has transformed the amount of intelligence coming in to the police and the effectiveness of the police in response.
What great police officers such as Sergeant Hanif and PC Adam Koch need is a Government who are on their side, rather than a Government who are determined to cut their service to ribbons. As my hon. Friend the Member for Birmingham, Erdington (Jack Dromey) so eloquently put it, West Midlands police is now the smallest it has been since it was created in 1974. It has suffered real-terms cuts of something like £145 million. The idea that somehow different decisions on the precept could have corrected a cut on that scale is frankly fanciful. Given the rise in crime that we have in the west midlands, and the fact that we are one of the most dangerous hotspots for counter-terrorism policing in the country, it beggars belief when we put that risk of harm alongside the cuts we have had, which are so different from the financial settlements that other police forces have enjoyed.
Will the right hon. Gentleman give way?
I will happily give way; perhaps my close neighbour can tell me how it is that Hampshire can enjoy a different settlement from the West Midlands police force when we have a threat assessment that is so very different.
I just want to be clear: the right hon. Gentleman refers, quite rightly, to the fact that the west midlands is a hotspot for some of the specialist terror policing, but will he also acknowledge that the Government have, quite separately, given significant increases of funds for that very purpose?
There has been a provision for counter-terrorism policing, but, as the right hon. Gentleman knows better than I do, neighbourhood policing is the frontline of the fight against terrorism in this country. The stronger the frontline, the safer we are. In the west midlands, our frontline is being cut to shreds.
My right hon. Friend will notice that in an intervention earlier I mentioned Willenhall in particular, where there have been public meetings. It is strange when we talk about fighting terrorism that there is a police station in that area in which high-profile prisoners are kept. I wonder where in the west midlands they will put them if there are any further arrests.
Exactly. Those threats are now multiplying across the region.
I respect the task that the Police Minister has to try to perform. He has taken the time to listen to representations from west midlands MPs of all political stripes. I am afraid that he was not backed up by either the Prime Minister or the Chancellor; they did not give the Home Office in general, and him in particular, the financial settlement that we needed in order to safeguard our communities. For us in Hodge Hill, that means that we now have the proposed closure of the Shard End police base—something that both Councillor Ian Ward and I disagree with.
We need a police base in Shard End, because—as was explained to me during my own glorious fortnight as the Minister for police and counter-terrorism, before I went on to serve a further two years as a Home Office Minister—neighbourhood policing creates a different kind of relationship between the police service and the community. It unlocks a level of trust, intelligence and insight that makes it much easier to crack down on crime. When we shut down police bases, we weaken the frontline in that fight. I do not want to see crime, drug dealing and violent crime rise any further. That is why I call on the Minister today to fix the problem in the West Midlands Police finances, give us the money we deserve and let our brave men and women of the West Midlands Police service get on with the job they are so dedicated to doing.
(7 years, 4 months ago)
Commons ChamberMay I press the Minister on the answer she gave to the former Chair of the Home Affairs Committee, my right hon. Friend the Member for Leicester East (Keith Vaz)? Social media giants remain the command and control platform of choice for extremists. I wrote to the Home Secretary on 29 March to ask whether she was considering similar laws to those in Germany and in Ireland, where a new watchdog is being created to police social media giants, or indeed proposals similar to those in the US Senate, such as the Feinstein Bill, which would require social media giants to report terrorism content. Governments around the world are taking action; when will this Government follow suit?
I can assure the right hon. Gentleman that the Government are taking action by leading the international efforts to make sure that internet platforms take their responsibilities seriously. The Home Secretary has made it absolutely clear that nothing is off the table. We are considering all options to make sure that the vile ideology and hatred that is pumped around the internet is stopped as soon as possible.
(7 years, 10 months ago)
Commons ChamberI appreciate my right hon. and learned Friend’s point. One piece of work we will do during the negotiations is to ensure that we get something bespoke for the United Kingdom. One temptation is to look at what other countries have done. As I mentioned earlier, there are countries who work with Europol—the United States is a good example—that are not members of the EU and have found ways to make it work. We can look at those examples, but we actually need to develop a bespoke solution for the United Kingdom.
I just want to make a bit more progress.
The Prime Minister set out in her speech yesterday the Government’s negotiation objectives for Brexit, explaining that this Government plan to make Britain “stronger” and “fairer”, restoring “national self-determination” while becoming
“more global and internationalist in action and in spirit.”
We have a long record of playing a leading role, within Europe and globally, to support and drive co-operation to help to protect citizens and defend democratic values, and we have been leading proponents of the development of a number of the law enforcement and criminal justice measures that are now in place across the European Union. The Prime Minister reiterated yesterday that although June’s referendum was a vote to leave the EU, it was not a vote to leave Europe. We want to continue to be reliable partners, willing allies and close friends with the European countries.
On a practical level, there has been no immediate change to how we work with the EU following the referendum, as the recent decision just before Christmas to seek to opt into the new legislation framework for Europol, the EU policing agency, demonstrates. The UK will remain a member of the EU with all the rights and obligations that membership entails until we leave. The way in which we work with the EU, of course, will have to change once we leave and we must now plan for what our new relationship will look like. The views that hon. Members express here today will be helpful in that regard, including, no doubt, that of the right hon. Member for Birmingham, Hodge Hill (Liam Byrne).
I just want to follow up on the incredibly important question posed by the right hon. and learned Member for Beaconsfield (Mr Grieve). We are the proud authors of human rights in Europe. It is a tradition that dates back to Magna Carta. Will the Minister confirm that when the Government bring forward their proposals on a British Bill of Rights, nothing in the draft for discussion will propose that we leave the European convention on human rights or the European Court of Human Rights?
The right hon. Gentleman tempts me to give a running commentary and to prejudge the outcome of the negotiations and work in the couple of years ahead, but I will resist. However, I will say that while we remain a member of the EU we recognise the jurisdiction of the European Court of Justice over the measures that we have opted into. It is too early to speculate on exactly what our relationship with the European Court of Justice will be after we leave the EU. That work will be done as we go forward.
I have already spoken to several counterparts in Europe, as have the Home Secretary and many of my colleagues across Government. In my conversations with colleagues across Europe, I have been encouraged by their view that it is essential to find a way for our shared work on security to continue, but we do have questions about how that should happen in practice and we need to work through answering them. This will be complex and subject to negotiation. We are committed to finding a way forward that works for the UK and the European Union. The Home Office is working with Departments—such as that of the Minister of State, Department for Exiting the European Union, my right hon. Friend the Member for Clwyd West (Mr Jones), who will be closing the debate—across Whitehall to analyse the full range of options for future co-operation.
We are liaising closely with our colleagues in the devolved Administrations as it is crucial to ensure that we find a way forward that works for all of the UK. We are drawing on the invaluable frontline experience of operational partners such as the National Crime Agency and the Crown Prosecution Service, and I am grateful for the ongoing contributions of all those organisations. The work is being drawn together with the support of our colleagues in the Department for Exiting the European Union and will form part of our wider exit negotiation strategy.
(8 years ago)
Commons ChamberAs the Prime Minister has said, we wish to protect the status of EU citizens working here. At the same time, of course, we expect the status of British citizens living and working elsewhere to be respected as well.
Ten days ago, Allan Richards was convicted in Birmingham of the most horrific catalogue of offences against children, some as young as eight. I congratulate West Midlands police on the forensic investigation that brought him to justice, but he was a serving police officer for more than 30 years. Will the Home Secretary assure the House that the inquiry into what happened will be independent, that whistleblowers will be given protection and that, if other agencies, including the Crown Prosecution Service, made mistakes, they will form part of the investigation?
The Independent Police Complaints Commission will take on this hugely important case which, by definition, will be an independent investigation. I reassure the right hon. Gentleman that the Policing and Crime Bill will go further by giving even more protection to whistleblowers and more powers to the IPCC to take on and lead such cases without the need for the involvement of, or a recommendation from, the police in the first place. I am happy to write to the right hon. Gentleman with more detail.