Lords Chamber

My Lords, this is a relatively narrow point and affects only a very small part of the Bill, but it is still quite important. The amendments in the group mainly cover how the Bill addresses anonymisation and how, or whether, it interacts with de-identification. There are two amendments and a clause stand part Motion which relate to other, slightly different issues, which we will come to in turn.
Amendment 170CA would insert into the Bill the term “anonymisation”, as there is no definition of de-identification in the Bill. I will come back to explain what that means in practice. Amendment 170CB provides an important exemption for data scientists and information security specialists dealing with a particular area, because there is a fear that the introduction of criminal sanctions might mean that they would be caught when they are trying to consider the issue for scientific and other reasons. Amendment 170CC adds a definition of identified data—after all, if it is to be criminalised, there needs to be a definition. This definition will cover cases which involve names of individuals, but will also cover those where fingerprints, for instance, are used to identify people.
The clause creates a new offence of knowingly or recklessly re-identifying information that has been de-identified without the consent of the controller. Amendment 170F asks for guidance relating to this offence. It is at the request of the Royal Society, because it wants clarity on the legal basis for processing.
Amendment 170G concerns transparency. If we are going to go into this area, it is very important that we know more about what is happening. The amendment suggests that the Information Commissioner,
“must set standards by which a data controller is required to anonymise personal data”.
There may be lots of new technologies soon to be invented or already available, and it is important that the way in which this important work goes forward can be flexed as and when new technologies come forward. We think that the Information Commissioner is in the strongest position to do that.
The other set of amendments to which our names are attached, Amendments 170E and 170H, relate to particular problems that can arise in large databases within health. There is a worry that where re-identification occurs by accident or just through the process of using the data, an offence will be committed. MedConfidential suggests that some form of academic peer review might be useful in trying to assess whether this was a deliberate act or just an unfortunate consequence of the work being done by those looking at the dataset concerned. The further amendment, Amendment 170H, clarifies whether an offence actually occurs when the re-identification work applies to disseminated NHS data—which of course, by its very nature, is often rather scattered and difficult to bring together. There is a particular reason for that, which we could go into.
At the heart of what I just said is a worry that certain academics have communicated to us: that the Bill is attempting to address what is in fact a fundamental mathematical problem—that there is no real way of making re-identification illegal—with a legal solution, and that this approach will have limited impact on the main privacy risks for UK citizens. If you do not define de-identification, the problem is compounded. The reference I have already made suggests that there might be advantage to the Bill if it used the terms used in the GDPR, which are anonymisation and pseudonymisation.
The irony which underlies the passion with which we have received submissions on this is that the people likely to be most affected by this part of the Bill are UK information security researchers, one of our academic strengths. It seems ironic that we should be putting into the Bill a specific criminal penalty which would stop them doing their work. Their appeal to us, which I hope will not fall on stony ground, is that we should look at this again. This is not to say in any sense that it is not an important issue, given the subsequent pain and worry that happens when datasets certified as anonymised are suddenly revealed as capable of being cracked, so people can pick up not just details of information about dates of birth or addresses but much more important stuff to do with medical health. So it is very important—and others may want to speak to the risk that it poses also to children, in particular. I hope that that is something that we might pick up.
There needs to be a proper definition in the Bill, whatever else we do about it, and that would be right in a sense. But we would like transparency about what is happening in this area, so that there is more certainty than at present about what exactly is meant by anonymous data and whether it can be achieved. That could be solved if the Information Commissioner is given responsibility for doing it. I beg to move.
We are in the thickets here at the interface between technology, techno-speak and legality. Picking our way through Clause 162 is going to be rather important.
There are two schools of thought. The first is that we can amend this clause in fairly radical ways—and I support many of the amendments proposed by the noble Lord, Lord Stevenson. Of course, I am speaking to Amendment 170E as well, which tries to simplify the language and make it much more straightforward in terms of retroactive approval for actions taken in this respect, and I very much hope that parliamentary draftsmen will approve of our efforts to simplify the language. However, another more drastic school of thought is represented by many researchers—and the noble Lord, Lord Stevenson, has put very well the case that they have put to us: that the cause of security research will be considerably hampered. But it is not just the research community that is concerned, although it is extremely concerned by the lack of definition, the sanctions and the restrictions that the provisions appear to place on their activities. Business is also concerned, as numerous industry practices might be considered illegal and a criminal offence, including browser fingerprinting, data linkage in medicine, and what they call device reconciliation or offline purchase tracking. So there is a lot of uncertainty for business as well as for the academic research community.
This is where we get into the techno-language. We are advised that modern, privacy-enhancing technologies such as differential privacy, homomorphic encryption—I am sure that the Minister is highly familiar with that—and question and answer systems are being used and further developed. There is nothing worse than putting a chill on the kind of research that we want to see by not acknowledging that there is the technology to make sure that we can do what we need to do and can keep our consumers safe in the circumstances. The fact is that quite often anonymisation, as we are advised, can never be complete. It is only by using this new technology that we can do that. I very much hope that the Minister is taking the very best legal and technology advice in the drafting and purposes of this clause. I am sure that he is fully aware that there is a great deal of concern about it.
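The first of the privacy-enhancing technologies mentioned can be illustrated with a minimal sketch: under differential privacy, a statistic is released only after calibrated random noise is added, so that no single individual's presence in the dataset can be confidently inferred from the answer. The records, field names and epsilon value below are invented purely for illustration.

```python
import math
import random


def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution via inverse CDF."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))


def private_count(records, predicate, epsilon: float = 0.5) -> float:
    """Answer a counting query with epsilon-differential privacy.

    A count has sensitivity 1 (adding or removing one person changes it
    by at most 1), so Laplace noise with scale 1/epsilon suffices.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)


# Hypothetical de-identified health records.
records = [
    {"age": 34, "condition": "asthma"},
    {"age": 67, "condition": "diabetes"},
    {"age": 45, "condition": "asthma"},
]

noisy = private_count(records, lambda r: r["condition"] == "asthma")
# The answer is close to the true count (2) but deliberately perturbed,
# so publishing it reveals little about any one patient.
```

A smaller epsilon means more noise and stronger privacy; the design choice is precisely the trade-off between utility for researchers and protection for data subjects.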
I rise to support the noble Lords, Lord Stevenson and Lord Clement-Jones, and some of the amendments in this group on this, the final day in Committee. I congratulate my noble friends Lord Ashton and Lady Chisholm of Owlpen as well as the indefatigable Bill team for taking this gargantuan Bill through so rapidly.
The problem caused by criminalising re-identification was brought to my attention by one of our most distinguished universities and research bodies, Imperial College London. I thought that this was a research issue, which troubled me but which I thought might be easy to deal with. However, talking to the professor in the computational privacy group today, I found, as the noble Lord, Lord Clement-Jones, said, that it goes wider and could cause problems for companies as well. That leads me to think that I should probably draw attention to my relevant interests in the House of Lords register of interests.
The computational privacy group explained that the curious addition of Clause 162—which is different in character and language from other parts of the Bill, as the noble Lord, Lord Stevenson, said—draws on Australian experience, but risks halting the work of the privacy group, which is an academic body, and possibly creating costs and problems for other organisations and companies. I am not yet convinced that we should proceed with this clause at all, for two reasons. First, it will not address the real risk of unethical practice by people outside the UK. As the provision is not in the GDPR or equivalent frameworks in most other countries, only UK and Australian bodies or companies will be affected, which could lead to the migration of research teams and data entrepreneurs to Harvard, Paris and other sunny and sultry climes. Secondly, because it will become criminal in the UK to re-identify de-identified data—it is like saying “seashells on the seashore”—the clause could perversely increase the risk of data being re-identified and misused. It will limit the ability of researchers to show up the vulnerability of published datasets, which will make life easier for hackers and fraudsters—another perversity. For that reason, it may be wise to recognise the scope and value of modern privacy-enhancing technologies in ensuring the anonymous use of data somewhere in the Bill, which could perhaps be looked at.
I acknowledge that there are defences in Clause 162—so, if a person faces prosecution, they have a defence. However, in my experience, responsible organisations do not much like to rely on defences to criminal prohibitions, as they can be open to dispute. I am also grateful to the noble Lord, Lord Stevenson—I am so sorry about his voice, although it seems to be getting a bit better—for proposing an exemption in cases where re-identification relates to demonstrating how personal data can be re-identified or is vulnerable to attack. However, I am not sure that the clause and its wider ramifications have been thought through. I am a strong supporter of regulation to deal with proven harm, especially in the data and digital area, where we are still learning about the externalities. But it needs to be reasonable, balanced, costed, careful and thought through—and time needs to be taken for that purpose.
I very much hope that my noble friend the Minister can find a way through these problems but, if that is not possible, I believe that the Government should consider withdrawing the clause.
My Lords, I am grateful to all noble Lords who have spoken on this very important clause. I agree very much with the noble Lords, Lord Clement-Jones and Lord Stevenson, that these are important issues which we need to consider. The amendments seek to amend Clause 162, which introduces the offence of re-identifying data that has been de-identified. I will start by giving some background to this clause because, as noble Lords have mentioned, this is new to data protection legislation.
Pseudonymisation of datasets is increasingly commonplace in many organisations, both large and small. This is a very welcome development: where sensitive personal data is being processed in computerised files, it is important that people know that data controllers are taking cybersecurity seriously and that their records are kept confidential. Article 32 of the GDPR actively encourages controllers to implement technical and organisational measures to ensure an appropriate level of security, including, for example, through the pseudonymisation and encryption of personal data.
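Pseudonymisation in the article 32 sense can be as simple as replacing each direct identifier with a token derived under a secret key held by the controller. A minimal sketch, with an invented key and record layout:

```python
import hashlib
import hmac

# Hypothetical secret key, which the controller would store separately
# from the pseudonymised dataset.
SECRET_KEY = b"controller-held-secret"


def pseudonymise(record: dict) -> dict:
    """Replace the direct identifier with a keyed token (HMAC-SHA-256).

    Without the controller's key the token cannot feasibly be mapped back
    to the name, but the controller can re-link records at will -- which
    is exactly why the GDPR still treats such data as personal data
    rather than anonymous information.
    """
    token = hmac.new(SECRET_KEY, record["name"].encode(), hashlib.sha256)
    out = dict(record)
    out["name"] = token.hexdigest()[:16]
    return out


patient = {"name": "Jane Doe", "condition": "asthma"}
print(pseudonymise(patient))  # the name is replaced by an opaque token
```

The sketch also shows why the clause speaks of re-identification "without the consent of the controller": the controller, holding the key, can always reverse the process legitimately.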
As noble Lords will be aware, the rapid advancement of technology has opened many doors for innovation in these sectors. However, as we continue to be able to do more with technology, the risk of its misuse also grows. Online data hackers and scammers are a much more prominent and substantial threat than was posed in 1998, when the original Data Protection Act was passed. It is appropriate, therefore, that the Bill addresses the contemporary challenges posed by today’s digital world. This clause responds to concerns raised by the National Data Guardian for Health and Care and other stakeholders regarding the security of data kept in online files, and is particularly timely following the well-documented cyberattacks on public and private businesses over the last few years.
It is important to note that the Bill recognises that there might be legitimate reasons for re-identifying data without the consent of the controller who encrypted it. The clause includes a certain number of defences, as my noble friend Lady Neville-Rolfe mentioned. These can be relied on in certain circumstances, such as where re-identification is necessary for the purpose of preventing or detecting crime, to comply with a legal obligation or is otherwise necessary in the public interest. I am aware that some academic circles, including Imperial College London, have raised concerns that this clause will prohibit researchers testing the robustness of data security systems. However, I can confidently reassure noble Lords that, if such research is carried out with the consent of the controller or the data subjects, no offence is committed. Even if the research is for legitimate purposes but carried out without the consent of the controller who de-identified the data in the first place, as long as researchers act quickly and responsibly to notify the controller, or the Information Commissioner, of the breach, they will be able to rely on the public interest defences in Clause 162. Finally, it is only an offence to knowingly or recklessly re-identify data, not to accidentally re-identify it. Clause 162(1) is clear on that point.
I turn to the specific amendments that have been tabled in this group. Amendment 170CA seeks to change the wording in line 3 from “de-identified” to “anonymised”. The current clause provides a definition of de-identification which draws upon the definition of “pseudonymisation” in article 4 of the GDPR. We see no obvious benefit in switching to “anonymised”. Indeed, it may be actively more confusing, given that recital 26 appears to use the term “anonymous information” to refer to information that cannot be re-identified, whereas here we are talking about data that can be—and, indeed, has been—re-identified.
Amendment 170CB seeks to provide an exemption for re-identification for the purpose of demonstrating how the personal data can be re-identified or is vulnerable to attacks. The Bill currently provides a defence for re-identification where the activity was consented to, was necessary for the purpose of preventing or detecting crime, was justified as being in the public interest, or where the person charged reasonably believes the activity was, or would have been, consented to. So long as those re-identifying datasets can prove that their actions satisfy any of these conditions, they will not be guilty of an offence. In addition, we need to be careful here not to create defences so wide that they become open to abuse.
Amendment 170CC seeks to add to the definition of what constitutes re-identification. I agree with the noble Lord that current techniques for re-identification involve linking datasets. However, we risk making the offence too prescriptive if we start defining exactly how re-identification will occur. As noble Lords, including the noble Lord, Lord Clement-Jones, mentioned, as technology evolves and offenders find new ways to re-identify personal data, we want the offence to keep pace.
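The linkage technique referred to above is easily sketched: a "de-identified" dataset stripped of names can still be joined to a public dataset on shared quasi-identifiers such as date of birth and postcode. All records below are invented for illustration.

```python
# Hypothetical "de-identified" medical data: names removed, but
# quasi-identifiers (date of birth, postcode) retained.
medical = [
    {"dob": "1980-03-14", "postcode": "SW1A 1AA", "condition": "diabetes"},
    {"dob": "1975-11-02", "postcode": "EC1A 1BB", "condition": "asthma"},
]

# Hypothetical public dataset (e.g. an electoral roll) that carries names.
public = [
    {"name": "A. Smith", "dob": "1980-03-14", "postcode": "SW1A 1AA"},
    {"name": "B. Jones", "dob": "1990-07-21", "postcode": "N1 9GU"},
]


def link(medical_rows, public_rows):
    """Re-identify by joining the two datasets on shared quasi-identifiers."""
    index = {(p["dob"], p["postcode"]): p["name"] for p in public_rows}
    return [
        {**m, "name": index[(m["dob"], m["postcode"])]}
        for m in medical_rows
        if (m["dob"], m["postcode"]) in index
    ]


print(link(medical, public))
# One record links: the diabetes patient is revealed as A. Smith.
```

This is the point made in the debate about prescriptiveness: the offence should not turn on any one joining technique, since the same result can be achieved by many different means as technology evolves.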
Amendment 170E seeks to add an extra defence for persons who achieve approval for re-identifying de-identified personal data after the re-identification has taken place. The current clause provides a defence where a person acted in the reasonable belief that they would have had the consent of the controller or the data subject had they known about the circumstances of the re-identification. Retroactive approval for the re-identification could be relied on as evidence in support of that defence, so we believe that there is no need to provide a separate defence for retroactive consent.
My Lords, I think that the noble Lord is misreading the amendment. As I read my own amendment, I thought it was substitutional.
If we are talking about Amendment 170E, I am certainly prepared to look at that and address it.
That may have been the original intention, but perhaps it was never put properly into effect.
In which case, I will read Hansard, the noble Lord can do so and I am sure we will come to an arrangement. We can talk about that, if necessary.
Amendment 170F seeks to require the commissioner to produce a code of practice for the re-identification offence three months after Royal Assent. We can certainly explore with the commissioner what guidance is planned for this area and I would be happy to provide noble Lords with an update on that in due course. However, I would not like to tie the commissioner to providing guidance by a specific date on the face of the Bill. It is also worth mentioning here that, as we discussed on a previous day in Committee, the Secretary of State may by regulation require the commissioner to prepare additional codes of practice for the processing of personal data under Clause 124 and, given the issues that have been raised, we can certainly bear those powers in mind.
Finally, Amendments 170G and 170H would oblige the commissioner to set standards by which the controller is required to anonymise personal data and criminalise organisations which do not comply. I reassure noble Lords that much of this work is under way already and that the Information Commissioner’s Office has been working closely with government, data controllers and the National Cyber Security Centre to raise awareness about improving cybersecurity, including through the use of pseudonymisation of personal data.
It is important to point out that there is no weakening of the provisions contained in article 5 of the GDPR, which require organisations to ensure appropriate security of personal data. Failure to do so can, and will, be addressed by the Information Commissioner, including through the use of administrative penalties. Some have said that criminalising malicious re-identification would create complacency among data controllers. However, they still have every incentive to maintain security of their data. Theft is a criminal offence but I still lock my door at night. In addition, I am not convinced by the mechanism the noble Lord has chosen. In particular, criminalising failure to rely on guidance would risk uncertainty and unfairness, particularly if the guidance was wrong in law in any respect.
I accept that the issues noble Lords have raised are important but I hope that, in view of these reassurances, the amendment will be withdrawn, and that the House will accept that Clause 162 should stand part of the Bill. There are reasons for wanting to bring in this measure, and I can summarise them. These were recommendations in the review of data security, consent and opt-outs by the National Data Guardian, who called for the Government to introduce stronger sanctions to protect de-identified patient data. People are generally more willing to participate in medical research projects if they know that their data will be pseudonymised and held securely, and the Wellcome Trust, for example, is supportive of the clause. I hope that those reassurances will allow the noble Lord to withdraw his amendment and enable the clause to stand part of the Bill.
My Lords, in moving Amendment 183A I hope to astonish the Minister with my brevity. Clause 172 deals with the avoidance of certain contractual terms related to health records so that,
“A term or condition of a contract is void in so far as it purports to require an individual to supply another person with a record which — … (a) consists of the information contained in a health record, and … (b) has been or is to be obtained by a data subject in the exercise of a data subject access right”.
The NHS has committed to informing patients how their medical records are used. The legal protections in the Bill against an enforced subject access request on a medical record should also apply to such information about that record. Does this provide the required protection? I beg to move.
It is probably for the best that we are not doing a seventh day in Committee because the noble Lord, Lord Stevenson, has told us that his voice is going and I seem to have an infected eye. Slowly, we are falling by the way, so it is probably just as well that this is our last evening.
This amendment seeks to amend Clause 172, which concerns contractual terms relating to health records. As noble Lords are aware, the Bill will give people more control over use of their data, providing stronger access rights as well as new rights to move or delete personal data. Data subject access rights are intended to aid people in getting access to information held about them by organisations. While subject access provisions are present in current data protection law, the process will be simplified and streamlined under the new legal framework, reflecting the importance of data protection in today’s digital age.
There are, unfortunately, a minority of instances where service providers and employers seek to exploit the rights of data subjects, making it a condition of a contract that a person supplies to them health records obtained through use of their data subject access rights. It is with this in mind that Clause 172 was drafted, to protect data subjects from abuses of their rights. Organisations are able to use provisions in the Access to Medical Reports Act 1988 to gain access to a person’s health records for employment or insurance purposes, and so should not be unduly relying upon subject access rights to acquire such information.
Amendment 183A seeks to widen the clause to include prohibiting contractual terms from including a requirement to use subject access rights to supply a person with information “associated with” as well as “in” a health record. While I can see where the noble Lord is coming from with the amendment and appreciate the willingness further to protect data subjects from exploitation, we are not convinced that it is necessary to widen the scope of this clause. The Government believe that avoidance of contractual terms—that is to say, a restriction on parties’ freedom of contract—is not something that should be legislated for lightly. Our starting point must be that contractual terms are voided only where there is a known, rather than a hypothetical, abuse of them.
It is also important to point out that the clause has been carried over from the 1998 Act, which has served us well for many years and we are not aware of any issues with its scope. But I will certainly carefully read the noble Lord’s contribution in Hansard, and with this in mind I encourage the noble Lord to withdraw his amendment.
My Lords, I thank the Minister. She will not need to spend very long reading my contribution in Hansard, as she will appreciate, but I pledge to read what she had to say. The interplay with the Access to Medical Reports Act may be of some importance in this, but on both sides we may need to reflect a little further. The case being made is that, because the NHS is making more information available about the use of patient records, it may be appropriate to change the legislation, which, as the Minister said, may have been fit for purpose for a period of time but now, in the light of new circumstances, may need changing. Indeed, it may not be “hypothetical” any more, to use her word. I will reflect on what the Minister said, but if there is scope for further improvement of the clause, I hope that it might be considered at a future stage. In the meantime, I beg leave to withdraw the amendment.
My Lords, at earlier stages of the Bill, the Minister and others have been at pains to stress the need to ensure that, whatever we finally do, the Bill should help to build trust between those who operate and accept data and those who provide it—the data subjects. It is important that we look at all aspects of that trust relationship and think about what we can do to make sure that it fructifies. Amendment 184 tries to add to the Bill something that could be there, because it is provided for in the GDPR, but is not there. Will the Minister explain when he responds why article 80(2) of the GDPR is not translated into UK legislation, as could happen? The proposed new clause would provide that,
“a body or other organisation which meets the conditions set out in that Article has the right to lodge a complaint, or exercise the rights, independently of a data subject’s mandate”.
I will largely leave the noble Lord, Lord Clement-Jones, to introduce Amendment 185 because he has a new and brief style of introduction, which we like a lot.
It is certainly new to me. He may have been here a lot longer than I have and there have been other occasions where he has been less than fulsome in his contributions. But I am not in any sense criticising him because everything he says has fantastic precision and clarity, as befits a mere solicitor. It is important that we give him the chance to shine on this particular issue as well.
I mentioned what a pleasure it is to have the noble Baroness, Lady Neville-Rolfe, here today, particularly because she will speak very well to the fact that only a few happy months ago we worked on the Consumer Rights Bill, which is now an Act, in which a power was given to private enforcers to take civil action in courts to protect collective consumer rights via an enforcement order. The campaigning consumer body Which? is the designated private enforcer.
Also, in the financial sector, Which?, Citizens Advice, the Federation of Small Businesses and the Consumer Council for Northern Ireland have the power to present super-complaints to the FCA. The super-complainant system is working very well; one reason why the PPI mis-selling scandal was discovered was the work of Citizens Advice. These independent enforcers of consumer rights in the traditional consumer sector and in the consumer finance sector exist. Why is there no equivalent status for digital consumer enforcers? That is the question raised by the amendment.
The powers for independent action here are important in themselves and I am sure other noble Lords will speak to that point, but they are also really important at the start of this new regime we are bringing in. With the new Data Protection Bill we have a different arrangement. Far more people are involved and a lot more people are having to think harder about how their data is being used. It makes absolute sense to have a system that does not require too much knowledge or detail, aided and abetted by experts with experience in this area, such as Which? and others, and which would allow those who are a little fazed by the whole process of trying to raise an action and get things going to have a steady hand that they know will take it forward on their behalf.
The Government will probably argue that by implementing article 80(1) of the GDPR they are providing effectively the same service. That is a system under which an individual can have their case taken up by much the same bodies as would be available under article 80(2). However, when an individual complainant is working with a body such as Which?, we are probably talking about redress of the individual whose rights have been breached in some way and exacting from the company or companies concerned a penalty or some sort of remuneration. One can see in that sense that the linking between the individual and the body that might take that on is important and would be very helpful.
However, there are cases—recent ones come to mind such as TalkTalk, Equifax, Cash Converters and Uber—where data has gone missing and there has been a real worry about what information has escaped and is available out there. I do not think that in those cases we are talking about people wanting redress. What they want is action, such as making sure that their credit ratings are not affected by their data having come out and that they could perhaps get out of contracts. One of the issues that was raised with EE and TalkTalk was that people had lost confidence in the companies and wanted to be able to get out of their contracts. That is not a monetary penalty but a different form of arrangement. In some senses, just ongoing monitoring of the company with which one’s data is lodged might be a process. All that plays to a need to have in law in Britain the article 80(2) version of what is in the GDPR. I beg to move.
My Lords, I strongly support Amendment 184. The Minister will have noticed that Amendment 185 would simply import the same provisions into applied GDPR for this purpose. The rationale, which has been very well put forward by the noble Lord, Lord Stevenson, is precisely the same.
I do not know whether the Minister was choking over his breakfast this morning, but if he was reading the Daily Telegraph—he shakes his head. I am encouraged that he was not reading the Daily Telegraph, but he would have seen that a letter was written to his right honourable friend Matt Hancock, the Digital Minister, arguing that the legislation can and should contain the second limb that is contained in the GDPR but is not brought into the Bill. The letter was signed by Which?, Age UK, Privacy International and the Open Rights Group for all the reasons that the noble Lord, Lord Stevenson, put forward. The noble Lord mentioned a number of data breach cases, but the Uber breach came to light only last night. It was particularly egregious because Uber did not tell anybody about it for months and, as far as one can make out from the press reports, it was a pay-off. There is a very important role for such organisations to play on behalf of vulnerable consumers.
The Which? survey was particularly important in that respect because it showed that consumers have little understanding of the kind of redress that they may have following a data breach. A recent survey shows that almost one in five consumers say that they would not know how to claim redress for a data breach, and the same proportion do not know who would be responsible for helping them when data is lost. Therefore the equivalent of a super-complaint in these circumstances is very important. To add to that point, young people are often the target of advertising and analysis using their personal data. I think they would benefit particularly from having this kind of super-complaint process for a data breach.
I hope very much that the Government, who I believe are conducting some kind of review, although it is not entirely clear, will think about this again because it is definitely something we will need to bring back on Report.
My Lords, I support Amendment 184. As the noble Lord, Lord Stevenson, said, the GDPR does allow not-for-profit organisations to lodge complaints about suspected breaches of data protection without needing the authorisation of the individuals concerned. I really do not understand why this has been taken out; it is such an important piece of legislation that gives teeth to data protection. Most people do not have the time or the inclination to lodge complaints against data controllers. So many organisations are now holding data about us that it is ridiculous to suggest that individuals can become data detectives responsible for finding out who holds data on them and trying to work out whether that data is being processed in accordance with data protection rules.
I went through the hassle of getting my own subject access request from the Met police. It took a lot of form filling and cost me £10, which was absolutely not money well spent because the file, when I got it, was so redacted. I did ask for my money back but was not given it. That shows me that most of us will not know that data about us is being held—so the amendment is extremely valid.
Despite my opposition to some provisions in the Bill, I accept that it is very important. However, it is equally important that we get it right and that we do not have all these derogations which mean that it has less authority and power. Personally, I think that the amendment strengthens the data protection regime without any hassle for consumers. I hope that the Government will include it in the next iteration of the Bill.
My Lords, I am grateful to all noble Lords who have contributed—in particular my noble friend Lord Lucas, who was even briefer than the noble Lord, Lord Clement-Jones. He made his point very succinctly and well.
With the greatest respect to the noble Lords, Lord Stevenson and Lord Clement-Jones—and I do mean that sincerely—during the passage of the 443 amendments in Committee that we are rapidly approaching the end of, we have listened carefully to each other, but in this case I am afraid that we reject Amendments 184 and 185 as being unnecessary. We believe that they are not required because the Bill already provides sufficient recourse for data subjects by allowing them to give consent to a non-profit organisation to represent their interests.
Clause 173, in conjunction with article 80(1) of the GDPR, provides data subjects with the right to authorise a non-profit organisation which has statutory objectives in the public interest and which is active in the field of data protection to exercise the rights described in Clauses 156 to 160 of the Bill. Taken together with existing provision for collective redress, and the ability of individuals and organisations to independently complain to the Information Commissioner where they have concerns, groups of like-minded data subjects will have a variety of redress mechanisms from which to choose. It is not true that when we have large numbers of data subjects they are unable, or too ignorant of their rights, to combine. For example, it is worth noting that more than 5,000 data subjects have brought one such action which is currently proceeding through the courts.
Furthermore, we would argue that the amendment is premature. If we were to make provision for article 80(2), it would be imperative to analyse the effectiveness not only of Clause 173 and article 80(1) of the GDPR but of other similar provisions in UK law to ensure that they are operating in the interests of data subjects and not third parties. We would also need to assess, for example, how effective the existing law has been in dealing with issues such as aggregate damages, which cases brought under article 80(2) might be subject to.
More generally, the Bill seeks to empower data subjects and ensure that they receive the information they need to enforce their own rights, with assistance from non-profit organisations if they wish. The solution to a perceived lack of data subject engagement cannot be to cut them out of the enforcement process as well. Indeed, there is a real irony here. Let us consider briefly a claim against a controller who should have sought, but failed to get, proper consent for their processing. Are noble Lords really suggesting that an unrelated third party should be able to enforce a claim for not having sought consent without first seeking that same consent?
We should also remember that these not-for-profit organisations are active in the field of data subjects’ rights; indeed, the GDPR states that they have to be. While many—the noble Lord, Lord Clement-Jones, mentioned Which?—will no doubt have data subjects’ true interests at heart and will be acting in those best interests, others will have a professional interest in achieving a different outcome: raising their own profile, for example.
I know that these amendments are well intentioned and I do have some sympathy with the ambition of facilitating greater private enforcement to complement the work of the Information Commissioner. But, for the reasons I have set out, I am not convinced that they are the right solution to the problems identified by noble Lords, and I therefore urge the noble Lord to withdraw his amendment.
My Lords, I am baffled by the Minister’s response. The Government have taken on board huge swathes of the GDPR; in fact, they extol the virtues of the GDPR, which is coming into effect, as are many of its articles. Yet they are baulking at a very clear statement in article 80(2), which could not be clearer. Their prevarication is extravagant.
The noble Lord will admit that the GDPR allows member states to do that; otherwise, it would have been made compulsory in the GDPR. The derogations are there to allow member states to decide whether or not to do it.
To summarise, we have chosen not to adopt article 80(2) because the Bill is based on the premise of getting consent—but these amendments are saying that, regardless of what the data subject wants or whether they have given consent, other organisations should be able to act on their behalf without their consent. That is the Government’s position and I hope that noble Lords will feel able not to press their amendments.
My Lords, government Amendments 185A, 185B, 185C and 185D add four fairly substantial new clauses to the Bill on the last day of Committee. I can see the point made by the Minister when he moved the amendments, but it is disappointing that they were not included right at the start. Have the Government just thought about them as a good thing?
The Delegated Powers and Regulatory Reform Committee has not had time to look at these matters. I note that in Amendment 185A, the Government suggest that regulations be approved by Parliament under the negative procedure. I will look very carefully at anything that the committee wants to bring to the attention of the House when we look at these matters again on Report. I am sure the committee will have reported by then.
I will not oppose the amendments today, but that is not to say that I will not move some amendments on Report—particularly if the committee draws these matters to the House’s attention.
My Lords, I want to echo that point. There is time for reflection on this set of amendments and I sympathise with what the noble Lord, Lord Kennedy, said.
My Lords, I am grateful for those comments. We understand that the DPRRC will have to look at the powers under the clause. As usual, as we have done already, we take great note of what the committee says; no doubt it will opine soon. We will pay attention to that.
My Lords, looking at the amendments and new Schedule 18 is rather like looking for a needle in a haystack, but I hope that the Minister received some notice of what I was going to raise. If not, as ever, I hope that he will helpfully write to me. In paragraph 42 of new Schedule 18, there is a reference to an amendment to Section 77 of the Freedom of Information Act. It deletes any reference to,
“section 7 of the Data Protection Act 1998”.
That is a deletion of a summary offence, which is rather baffling to many of us. It is about not keeping records. Many of us thought that, since there have been very few or no prosecutions under that section of the Freedom of Information Act, the answer would perhaps have been to ratchet up the penalty. At the moment, it is only a summary offence. Therefore, there is a six-month time limit, and it is difficult to get the information to hand in that period. If it was made a more serious offence, it would be rather more straightforward to prosecute in those circumstances. The Government, however, seem to have swept this off the statute book, buried in new Schedule 18. I hope that the Minister when he writes will elucidate clearly and perhaps say that in another part of the forest a criminal offence still lurks.
My Lords, with so many codes of practice flying around it would not be hard to lose one in the crowd, but this one stands out. With this amendment, we are suggesting to the Government that there is a need at the top of the pyramid for a code of practice which looks at the whole question of data ethics and morality. We discussed this topic in earlier sittings of the Committee and I think we were of one mind that there was a gap in the overall architecture of the organisations supporting data processing, which concerned us, in the sense that there was a need for an expert body.
The body could be some sort of combination along the lines of the HFEA or the Committee on Climate Change. It would have a duty to look at the moral and ethical issues affecting data collection and use, and be able to do some blue-sky thinking and to provide a supervisory approach to the way in which thinking on these matters would have to go. We are all aware, as has been mentioned many times, that this is a fast-moving technology in an area full of change where people feel a bit concerned about where their data is and how it is being looked at. They are worried that they do not have sufficient control or understanding of the processes involved.
The amendment suggests to the Government a data ethics code of practice which I hope they will look at with some care. It would begin to provide a hand of support to individuals who are concerned about their data and how it has been processed. Under this code of practice the commissioner could set out the moral and ethical issues, rather than the practical day-to-day stuff. It would focus on duties of care and need to provide examples of where best practice can be found. It would increase the security of personal data and ensure that the access to its use and sharing were transparent, and that the purposes of data processing were communicated to data subjects.
Some codes of this type already exist. I think that the Royal Statistical Society has been behind a number of codes on the use of our overall statistics, such as that operated within the OSS. Having read that code, I was struck by how apposite it was to some of the issues faced in the data-processing community. Some of the wording of this amendment comes from that, while other wording comes from think tanks and others who are working in this field. It will also come as no surprise to the Committee that some of the detail in the code’s latter subsections about privacy settings, minimisation standards and the language of terms and conditions also featured in the proposed code recommended to the Committee by the noble Baroness, Lady Kidron, in relation to children’s use of the internet and how their data is treated. The amendment meets other interests and examples of activity. It seems to fulfil a need, which is becoming more pressing every day, and is ambitious in its attempt to try to make sure that whatever regulatory and statutory provisions are in place, there will also be a wider dimension employed, which I think we will increasingly be part of.
I do not expect the Government to accept the amendment tout court, because it needs a lot more work. I fully accept that the drafting is a bit rough at the edges, despite the fact that we spent a lot of time in the Public Bill Office trying to get it right. I have already explained that I am not very good at synthesising in the way that the Bill team obviously is. I have no doubt that when he responds the Minister will be able to encapsulate in a few choice words what I have been struggling to say over the past three or four sentences—he nods, so it is clearly going to hit me again. I hope that he will take away from this short debate that this is an issue that will not go away. It is an issue that we need to address, and it may be that the new body, which was, I think, generally accepted by the Committee as something that we should move to in short order, might take on this as its first task. I beg to move.
My Lords, the noble Lord, Lord Stevenson, is too modest about his drafting—I think that this is one of the most important amendments to the Bill that we have seen to date. I am just sorry that we were not quick enough off the mark to put our name to it. I do not know which hand the noble Lord, Lord Stevenson, is using—there seem to be a certain number of hands involved in this—but anybody who has read Jonathan Taplin’s Move Fast and Break Things, as I did over the weekend, would be utterly convinced of the need for a code of ethics in these circumstances. The increasing use of data in artificial intelligence and algorithms means that we need to be absolutely clear about the ethics involved in that application. The noble Lord, Lord Stevenson, mentioned a number of codes that he has based this amendment on, but what I like about it is that it does not predicate any particular code at this stage. It just talks about the desirable architecture of the code. That makes it a very robust amendment.
Like the noble Lord, I have looked at various other codes of ethics. For instance, the IEEE has rather a good code of ethics. This is all of a piece with the stewardship council, the data ethics body that we debated on the previous day in Committee. As the Royal Society said, the two go together. A code of ethics goes together with a stewardship council, data ethics committee or whatever one calls it. You cannot have one without the other. Going forward, whether or not we agree today on this amendment, it is very clear that we need to keep coming back to this issue because this is the future. We have to get it right, and we cannot prejudice the future by not having the right ethical framework.
My Lords, I support this amendment and identify myself totally with the remarks of the noble Lord, Lord Clement-Jones. I am trying to be practical, and I am possibly even pushing at an open door here. I have a facsimile of the 1931 Highway Code. The introduction by the then Minister says:
“By Section 45 of the Road Traffic Act, 1930, the Minister of Transport is directed to prepare a code of directions for the guidance of road users … During the passage of the Act through Parliament, the opinion was expressed almost universally … that much more could be done to ensure safety by the instruction and education of all road users as to their duties and obligations to one another and to the community as a whole”.
Those last few words are very important. This must be, in a sense, a citizens’ charter for users—a constantly updated notion—of the digital environment to be sure of their rights and of their rights of appeal against misuse. This is exactly where the Government have a duty of care to protect people from things they do not know about as we move into a very difficult, almost unknown digital environment. That was the thinking behind the 1931 Highway Code, and we could do a lot worse than do something similar. That is probably enough for now, but I will undoubtedly return to this on Report.
My Lords, I am very grateful to the noble Lord, Lord Stevenson, for tabling this amendment, which allows us to return to our discussions on data ethics, which were unfortunately curtailed on the last occasion. The noble Lord invited me to give him a few choice words to summarise his amendments. I can think of a few choice words for some of his other amendments, but today I agree with a lot of the sentiment behind this one. It is useful to discuss this very important issue, and I am sure we will return to it. The noble Lord, Lord Puttnam, brought the 1931 Highway Code into the discussion, which was apposite, as I think the present Highway Code is about to have a rewrite due to autonomous vehicles—it is absolutely right, as he mentioned, that these codes have to be future-proofed. If there is one thing we are certain of, it is that these issues are changing almost by the day and the week.
The noble Lord, Lord Stevenson, has rightly highlighted a number of times during our consideration of the Bill that the key issue is the need for trust between individuals and data controllers. If there is no trust in what is set up under the Bill, then there will not be any buy-in from the general public. The noble Lord is absolutely right on that. That is why the Government are committed to setting up an expert advisory body on data ethics. The noble Lord mentioned the HFEA and the Committee on Climate Change, which are interesting prior examples that we are considering. I mentioned during our last discussion that the Secretary of State was personally leading on this important matter. He is committed to ensuring that just such a body is set up, and in a timely manner.
However, although I agree with and share the intentions that the noble Lord has expressed through this amendment, which other noble Lords have agreed with, I cannot agree with the mechanism through which he has chosen to express them. When we previously debated this topic, I was clear that we needed to draw the line between the function of an advisory ethics body and the Information Commissioner. The proposed ethics code in this amendment is again straddling this boundary.
Our new data protection law as found in this Bill and the GDPR will already require data controllers to do many of the things found in this amendment. Securing personal data, transparency of processing, clear consent, and lawful sharing and use are all matters set out in the new law. The commissioner will produce guidance, for that is already one of her statutory functions and, where the law is broken, the commissioner will be well equipped with enforcement powers. The law will be clear in this area, so all this amendment will do is add a layer of complexity.
The Information Commissioner’s remit is to provide expert advice on applying data protection law. She is not a moral philosopher. It is not her role to consider whether data processing is addressing inequalities in society or whether there are public benefits in data processing. Her role is to help us comply with the law and to regulate its operation, which involves fairly handling complaints from data subjects about the processing of their personal data by controllers and processors, and penalising those found to be in breach. The amendment that the noble Lord has tabled would extend the commissioner’s remit far beyond what is required of her as a UK supervisory authority for data protection and, given the breadth of the code set out in his amendment, would essentially require the commissioner to become a regulator on a much more significant scale than at present.
This amendment would stretch the commissioner’s resources and divert from her core functions. We need to examine the ethics of how data is used, not just personal data. However, the priority for the commissioner is helping us to implement the new law to ensure that the UK has in place the comprehensive data protection regime that we need and to help to prepare the UK for our exit from the EU. These are massive tasks and we must not distract the commissioner from them.
There is of course a future role for the commissioner to work in partnership with the new expert group on ethics that we are creating. We will explore that further once we set out our plans shortly. It is also worth noting that the Bill is equipped to future-proof the commissioner to take on this role: under Clause 124, the Secretary of State may by regulation require the commissioner to produce appropriate codes of practice. While the amendment has an arbitrary shopping list, much of which the commissioner is tasked with already, the Bill allows for a targeted code to be developed as and when the need arises.
The Government recognise the need for further credible and expert advice on the broader issues of the ethical use of data. As I mentioned last week, it is important that the new advisory body has a clearly defined role focused on the ethics of data use and gaps in the regulatory landscape. The body will as a matter of necessity have strong relationships with the Information Commissioner and other bodies that have a role in this space. For the moment, with that in mind, I would be grateful if the noble Lord withdrew his amendment. As I say, we absolutely understand the reasons behind it and we have taken on board the views of all noble Lords in this debate.
My Lords, do the Minister or the Government yet have a clear idea of whether the power in the Bill to draw up a code will be invoked, or whether there will be some other mechanism?
At the moment, I do not think there is any anticipation for using that power in the near future, but it is there if necessary in the light of the broader discussions on data ethics.
So the Minister believes it is going to be the specially set-up data ethics body, not the powers under the Bill, that would actually do that?
I do not want to be prescriptive on this because the data ethics body has not been set up. We know where we think it is going, but it is still to be announced and the Secretary of State is working on this. The legal powers are in the Bill, and the data ethics body is more likely to be an advisory body.
I should notify the Committee that if Amendment 45B is agreed, I cannot call Amendments 46 to 50A by reason of pre-emption.
My Lords, the noble Earl, Lord Kinnoull, has clearly and knowledgeably introduced the amendment, which I strongly support. He made clear through his case studies the Bill’s potential impact on the insurance industry, and I very much hope that the Minister has taken them to heart. Processing special category data, including health data, is fundamental to calculating levels of risk, as the noble Earl explained, and to underwriting most retail insurance products. Such data is also needed for the administration of insurance policies, particularly claims handling.
The insurance industry has made the convincing case that if the implementation of the Bill does not provide a workable basis for insurers to process that data, it will interrupt the provision to UK consumers of retail insurance products such as health, life and travel insurance, and especially products with health-related consumer benefits, such as enhanced annuities. The noble Earl mentioned a number of impacts, but estimates suggest that, in the motor market alone, if this issue is not resolved, it could impact on about 27 million policies and see premiums rise by about 3% to 5%.
There is a need to process criminal conviction data for the purposes of underwriting insurance in, for instance, the motor insurance market. Insurers need to process data to assess risk and set the prices and terms for mainstream products such as motor, health and travel insurance.
The key issue of concern is that new GDPR standards for consent for special category data, including health, such as the right to withdraw consent without experiencing detriment, are incompatible with the uninterrupted provision of these products. As the noble Earl, Lord Kinnoull, has clearly stated, there is scope for a UK derogation represented by these amendments, which would be in the public interest, to allow processing of criminal conviction and special category data when it is necessary for arranging, underwriting and administering insurance and reinsurance policies and insurance and reinsurance policy claims. I very much hope that the Minister will take those arguments on board.
My Lords, the noble Earl, Lord Kinnoull, has done us a great favour in introducing with great skill these amendments, which get to the heart of problems with some of the language used in the Bill. We are grateful to him for going through and picking out the choices that were before the Government and the way their particular choices seem to roll back some of the advances made in the insurance industry in recent years. I look forward to the Minister’s response.
Our probing Amendment 47 in this group is on a slightly higher level. It is not quite as detailed—nor was it intended to be—as the one moved by the noble Earl. We were hoping to raise a more general question, to which I hope the Minister will be able to respond. Our concern, which meets the concerns raised by the noble Earl, Lord Kinnoull, and the noble Lord, Lord Clement-Jones, is where the Government want to get to on this. It must be true that insurance is one of the key problems facing many people in our country. It is the topic that will be discussed in the QSD in today’s dinner break as it bears heavily on financial inclusion issues. So many people in this country do not take out insurance, personal or otherwise, and suffer as a result. We have to be very careful as we take this forward as a social issue.
However, an open-ended derogation to allow those who wish to gather information to make a better insurance market surely also raises risks. If we are talking about highly personal profiling—we may not be because there are constraints in the noble Earl’s amendment—it would lead to a more efficient and cheaper insurance industry, but at what personal cost? For instance, if it is possible to pick up data from those who perhaps unadvisedly put on Facebook or Twitter how many times they get drunk—I am sure that is not unusual, particularly among the younger generation—information could be gathered for a profile that ought to be taken into account for their life, health or car insurance. I am not sure that we would be very happy with that.
Underlying our probing amendment is a request that the Minister respond—it may be possible by letter rather than today—on the protections the Government have in mind. What sort of stock points are there that we can rely on as we move forward in this area? As processing becomes more powerful and more data is available, pooled risks are beginning to look a little old-fashioned. The traditional model of insurance is that the more the pool is expanded, the more appropriately the risks are spread across everybody. The trouble is that the more we know, the more we will be identifying people who are perhaps more reckless, and therefore skewing the pooling arrangements. We have to be careful about that.
There is obviously a social objective in having a more efficient and effective insurance market but this ought to be counterbalanced to make sure that those people who are vulnerable are not excluded or uninsurable as a result. The state could step in, obviously, and has done so, as we have been reminded already in our Committee discussions about the difficulty of getting insurance for those who build on flood plains. However that is not the point here. This is about general insurance across the range of current market opportunities being affected by the fact that we are not ensuring that the data gathered is both proportionate and correct in terms of what it provides for the individual data subjects concerned.
My Lords, I am grateful to all noble Lords who have spoken and for the opportunity to speak to Schedule 1 in relation to an industry in which I spent many years. I accept many of the things that the noble Earl, Lord Kinnoull, described and completely understand many of his points—and, indeed, many of the points that other noble Lords have made. As the noble Lord, Lord Clement-Jones, said, I have taken the noble Earl’s examples to heart, and I absolutely accept the importance of the insurance industry. The Government have worked with the Association of British Insurers and others to ensure that the Bill strikes the right balance between safeguarding the rights of data subjects and processing data without consent when necessary for carrying on insurance business—and a balance it must be. The noble Lord, Lord Stevenson, alluded to some of those issues when he took us away from the technical detail of his amendment to a higher plane, as always.
The noble Earl, Lord Kinnoull, and the noble Lords, Lord Clement-Jones and Lord Stevenson, have proposed Amendments 45B, 46A, 47, 47A, 48A and 50A, which would amend or replace paragraphs 14 and 15 of Schedule 1, relating to insurance. These amendments would have the effect of providing a broad basis for processing sensitive types of personal data for insurance-related purposes. Amendment 45B, in particular, would replace the current processing conditions for insurance business set out in paragraphs 14 and 15 with a broad condition covering the arrangement, underwriting, performance or administration of a contract of insurance or reinsurance, but the amendment does not provide any safeguards for the data subject.
Amendment 47 would amend the processing condition relating to processing for insurance purposes in paragraph 14. This processing condition was imported from paragraph 5 of the 2000 order made under the Data Protection Act 1998. Removal of the term might lessen the safeguards for data subjects, because insurers could potentially rely on the provisions even where it was reasonable to obtain consent. I shall come to the opinions of the noble Earl, Lord Erroll, on consent in a minute.
Amendments 46A, 47A, 48A and 50A are less sweeping, but would also remove safeguards and widen the range of data that insurers could process to far beyond what the current law allows. The Bill already contains specific exemptions permitting the processing of family health data to underwrite the insured’s policy and data required for insurance policies on the life of another or group contract. We debated last week a third amendment to address the challenges of automatic renewals.
These processing conditions are made under the substantial public interest derogation. When setting out the grounds for such a derogation, the Government are limited—this partly addresses the point made by the noble Lord, Lord Stevenson—by the need to meet the “substantial public interest test” in the GDPR and the need to provide appropriate safeguards for the data subject. A personal or private economic or commercial benefit is insufficient: the benefits for individuals or society need to significantly outweigh the need of the data subject to have their data protected. On this basis, the Government consider it difficult to justify a single broad exemption. Taken together, the Government remain of the view that the package of targeted exemptions in the Bill is sufficient and achieves the same effect.
Nevertheless, noble Lords have raised some important matters and the Government believe that the processing necessary for compulsory insurance products must be allowed to proceed without the barriers that have been so helpfully described. The common thread in these concerns is how consent is sought and given. The noble Earl, Lord Kinnoull, referred to that and gave several examples. The Information Commissioner has published draft guidance on consent and the Government have been in discussions with her office on how the impact on business can be better managed. We will ensure that we resolve the issues raised.
I say to the noble Earl, Lord Erroll, that consent is important and the position taken by the GDPR is valid. We do not have a choice in this: the GDPR is directly applicable and when you are dealing with data, it is obviously extremely important to get consent, if you can. The GDPR makes that a first line of defence, although it provides others when consent is not possible. As I say, consent is important and it has to be meaningful consent, because we all know that you can have a pre-tick box and that is not what most people nowadays regard as consent. Going back to the noble Earl, Lord Kinnoull—
My Lords, I am sorry to interrupt. The Minister mentioned the guidance from the Information Commissioner. From what he said, I assume he knows that the insurance industry does not believe that the guidance is sufficient; it is inadequate for its purposes. Is he saying that a discussion is taking place on how that guidance might be changed to meet the purposes of the insurance industry? If it cannot be changed, will he therefore consider amendments on Report?
Of course, it is not for us to tell the Information Commissioner what guidance to issue. The guidance that has been issued is not in all respects completely helpful to the insurance industry.
I agree; I think I mentioned compulsory classes before. Going back to the guidance, we are having discussions. We have already had constructive discussions with the noble Earl, and we will have more discussions on this subject with the insurance industry, in which he has indicated that he would like to take part. I am grateful to him for coming to see me last week.
My Lords, I am sorry to interrupt the Minister again but he is dealing with important concepts. Right at the beginning of his speech he said he did not think this could be covered by the substantial public interest test. Surely the continuance of insurance in all those different areas, not just for small businesses but for the consumer, and right across the board in the retail market, is of substantial public interest. I do not quite understand why it does not meet that test.
I may have misled the noble Lord. I did not say that it does not meet the substantial public interest test but that we had to balance the need to meet the substantial public interest test in the GDPR and the need to provide appropriate safeguards for the data subject. I am not saying that those circumstances do not exist. There is clearly a substantial public interest in compulsory classes of insurance being able, as we discussed last week, to renew automatically in certain circumstances. I am sorry if I misled the noble Lord.
We realised that there are potentially some issues surrounding consent, particularly in the British way of handling insurance, where there are many intermediaries, which creates a problem. That may also arise in other countries, so the Information Commissioner will also look at how they address these issues, because there is meant to be a harmonised regime across Europe. The noble Earl has agreed to come and talk to us, and I hope that on the basis of further discussions he will withdraw his amendment.
We can break it down simply between compulsory and non-compulsory classes. Some classes may more easily fulfil the substantial public interest test than others. In balancing the needs, it goes too far to give a broad exemption for all insurance, so we are trying to create a balance. However, we accept that compulsory classes are important.
I am sure that the noble Earl, Lord Kinnoull, will come back at greater length on this. The issue that the Minister has outlined is difficult, partly because the Information Commissioner plays and will play such an important role in the interpretation of the Bill. When the Government consider the next steps and whether to table their own amendments or accept other amendments on Report, will they bring the Information Commissioner or her representative into the room? It seems that the guidance and the interaction of the guidance with the Bill—and, eventually, with the Act—will be of extreme importance.
I agree, which is why I mentioned the guidance that the Information Commissioner has already given. I am certainly willing to talk to her but it is not our place to order her into the room. However, we are constantly talking to her, and there is absolutely no reason why we would not do so on this important matter.
My Lords, if this amendment is agreed to, I cannot call Amendments 58 to 62 because of pre-emption.
I must say how delighted I am that on this occasion we had the noble Lord advocating his own amendment. I was nearly in the hot seat last week, but we have just avoided it. I was delighted at his powerful advocacy because of course the noble Lord is extraordinarily well informed on all matters to do with sport, and this goes to the heart of sport in terms of preventing cheats who prevent the rest of us enjoying what should be clean sport, however that may be defined. All I have to do is pick out one or two of the elements of what the noble Lord said in my supportive comments.
There is the fact that neither “doping” nor “sport” is defined in the Bill, as the noble Lord pointed out. There is no definition of the bodies to be covered by paragraph 21, which is extremely important. He also made an extraordinarily important point about UKAD. Naming UKAD in the Bill, as the amendment seeks to do, would add to its authority and allow it to carry out all the various functions that he outlined in his speech. If it is necessary to add other bodies, as he suggested, that should of course be considered.
The noble Lord’s reference to performance-enhancing substances, which again are mentioned in the amendment and included in the World Anti-Doping Code, ties the Bill together with that code and was very important as well. Finally, the point that he made about gender and the substances used in connection with gender change was bang up to the minute. That, too, must be covered by provisions such as this. So if the Minister is not already discussing these issues with the noble Lord, Lord Moynihan, I very much hope that he is about to and will certainly do so before Report.
My Lords, once again your Lordships’ House is very grateful to the noble Lord, Lord Moynihan, for raising this issue and, as the noble Lord, Lord Clement-Jones, said, for doing so in such a comprehensive way. It sits in the context of the much wider range of issues that the noble Lord, Lord Moynihan, has been pursuing regarding how sport, gambling and fairness all need to be taken together. We have been supporting him on those issues, which need legislation behind them.
Noble Lords may not be aware that we have been slightly accused of taking our time over the Bill. I resist that entirely because we are doing exactly what we should be doing in your Lordships’ House: going through line-by-line scrutiny and making sure that the Bill is as good as it can be before it leaves this House. We saw the noble Lord, Lord Moynihan, at the very beginning of Committee and he then dashed off to Australia to do various things, no doubt not unrelated to sport. He has had time to come back and introduce these amendments—but, meanwhile, the noble Lord, Lord Clement-Jones, and I were debating who was going to draw the short straw and have to introduce them. We were very lucky not to have to do so because they were introduced so well on this occasion.
Our amendment in this group is a probing amendment that picks up on some of the points already made. It raises the issue of why we are restricting this section of the Bill to “sport”—whatever that is. If we are concerned about performance enhancement, we have to look at other competitive arrangements where people gain an advantage because of a performance-enhancing activity such as taking drugs. For instance, in musical competitions, for which the prizes can be quite substantial, it is apparently possible to enhance one’s performance—perhaps in high trills on the violin or playing the piano more brilliantly—if you take performance-enhancing drugs. Is that not somehow seeking to subvert these arrangements? Since that is clearly not sport, is it not something that we ought to be thinking about having in the Bill as well? I say that because, although the narrow sections of the Bill that relate to sport are moving in the right direction, they do not go far enough. As a society, we are going to have to think more widely about this as we go forward.
My Lords, the noble Lord, Lord Stevenson, has raised some important points, which refer back to our labour over the Digital Economy Bill. One particular point occurs to me in relation to the questions that he asked: have we made any progress towards anonymisation in age verification, as we debated at some length during the passage of that Bill? As I recall, the Government’s point was that they did not think it necessary to include anything in the Bill because anonymisation would happen. The Minister should engage with that important issue. The other point that could be made is about whether the Government believe that the amendment of the noble Lord, Lord Lucas, would help us towards that goal.
My Lords, as we have heard, Part 3 of the Digital Economy Act 2017 requires online providers of pornographic material on a commercial basis to institute appropriate age verification controls. My noble friend’s Amendment 71ZA seeks to allow the age verification regulator to publish regulations relating to the protection of personal data processed for that purpose. The amendment aims to provide protection, choice and trust in respect of personal data processed for the purpose of compliance with Part 3 of the 2017 Act.
I think that I understand my noble friend’s aim. It is a concern I remember well from this House’s extensive deliberations on what became the Digital Economy Act, as referred to earlier. We now have before us a Bill for a new legal framework which is designed to ensure that protection, choice and trust are embedded in all data-processing practices, with stronger sanctions for malpractice. This partly answers my noble friend Lord Elton, who asked what we would produce to deal with this problem.
Personal data, particularly those concerning a data subject’s sex life or sexual orientation, as may be the case here, will be subject to rigorous new protections. For the reasons I have just mentioned, the Government do not consider it necessary to provide for separate standards relating exclusively and narrowly to age verification in the context of accessing online pornography. That is not to say that there will be a lack of guidance to firms subject to Part 3 of the 2017 Act on how best to implement their obligations. In particular, the age verification regulator is required to publish guidance about the types of arrangements for making pornographic material available that the regulator will treat as compliant.
As noble Lords will be aware, the British Board of Film Classification is the intended age verification regulator. I reassure noble Lords that in its preparations for taking on the role of age verification regulator, the BBFC has indicated that it will ensure that the guidance it issues promotes the highest data protection standards. As part of this, it has held regular discussions with the Information Commissioner’s Office and it will flag up any potential data protection concerns to that office. It will then be for the Information Commissioner to determine whether action or further investigation is needed, as is her role.
The noble Lord, Lord Clement-Jones, talked about anonymisation and the noble Lord, Lord Stevenson, asked for an update of where we actually were. I remember the discussions on anonymisation, which is an important issue. I do not have the details of exactly where we have got to on that subject—so, if it is okay, I will write to the noble Lord on that.
I can update the noble Lord, Lord Stevenson, to a certain extent. As I just said, the BBFC is in discussion with the Information Commissioner’s Office to ensure that best practice is observed. Age verification controls are already in place in other areas of internet content access; for example, licensed gambling sites are required to have them in place. They are also in place for UK-based video-on-demand services. The BBFC will be able to learn from how these operate, to ensure that effective systems are created—but the age verification regulator will not be endorsing a list of age verification technology providers. Rather, the regulator will be responsible for setting guidance and standards on robust age verification checks.
We continue to work with the BBFC in its engagement with the industry to establish the best technological solutions, which must be compliant with data protection law. We are aware that such solutions exist, focusing rightly on verification rather than identification—which I think was the point made by the noble Lord, Lord Clement-Jones. If I can provide any more detail in the follow-up letter that I send after each day of Committee, I will do so—but that is the general background.
Online age verification is a rapidly growing area and there will be much innovation and development in this field. Industry is rightly putting data privacy and security at the forefront of its design, and this will be underscored by the new requirements under the GDPR. In view of that explanation, I hope that my noble friend will be able to withdraw his amendment.
My Lords, in moving Amendment 74, I will also speak to Amendments 74A, 75, 77, 119, 133A, 134 and 183—I think I have encompassed them all; at least I hope I have. In a way this is an extension of the very interesting debate that we heard on Amendment 71A, but further down the pipeline, so to speak. This group contains a range of possible and desirable changes to the Bill relating to artificial intelligence and the use of algorithms.
Data has been described, not wholly accurately, as the oil of artificial intelligence. With the advent of AI and its active application to datasets, it is vital that we strike the right balance in protecting privacy and the use of personal data. Indeed, the Minister spoke about that balance in that debate. Above all, we need to be increasingly aware of unintended discrimination where an element of a decision involves an algorithm. If a particular system learns from a dataset that contains biases, such as associating female names with family roles and male names with careers, it is likely to reproduce them in its decisions. One way of helping to identify and rectify bias is to ensure that such algorithms are transparent, so that it is possible to see not only what data is being used but the steps being taken to process that data in coming to a particular conclusion.
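The mechanism described here can be sketched in a few lines. The corpus and names below are entirely invented for illustration; the point is only that a scorer estimated from biased data reproduces that bias mechanically in its outputs:

```python
# A deliberately tiny illustration (hypothetical data, not a real system):
# a scorer that "learns" word associations from a biased toy corpus will
# reproduce that bias in its decisions.
from collections import Counter

# Toy training corpus encoding the bias described above:
# male names co-occur with "career", female names with "family".
corpus = [
    ("james", "career"), ("john", "career"), ("robert", "career"),
    ("mary", "family"), ("linda", "family"), ("susan", "family"),
    ("james", "career"), ("mary", "family"),
]

pair_counts = Counter(corpus)
name_counts = Counter(name for name, _ in corpus)

def association(name: str, context: str) -> float:
    """P(context | name), estimated purely from the training corpus."""
    if name_counts[name] == 0:
        return 0.0
    return pair_counts[(name, context)] / name_counts[name]

# The "learned" model now rates male names as career-associated and
# female names as not -- an artefact of the data, not of the world.
print(association("james", "career"))  # 1.0
print(association("mary", "career"))   # 0.0
```

Nothing in the code mentions gender; the skew is carried entirely by the data, which is why transparency about both the data and the processing steps matters.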
In all this, there is the major risk that we do not challenge computer-aided decision-making. To some extent, this is recognised by article 22 of the GDPR, which at least gives a right to an explanation where there is fully automated decision-taking, and it is true that in certain respects, Clause 13 amplifies article 22. For instance, article 22 does not state what safeguards need to be in place; it talks just about proper safeguards. In the Bill, it is proposed that, after a decision has been made, the individual has to be informed of the outcome, which is better than what the GDPR currently offers. It also states that data subjects should have the right to ask that the decision be reconsidered or that the decision not be made by an algorithm. There is also the requirement, in certain circumstances, for companies and public bodies to undertake a data protection impact assessment under Clause 62. There are also new provisions in the GDPR for codes of conduct and certification, so that if an industry is moving forward on artificial intelligence in an application, the ICO can certify the approach that the industry is taking on fairness in automated decision-taking.
My Lords, I thank the noble Lord, Lord Clement-Jones, who introduced this interesting debate; of course, I recognise his authority and his newfound expertise in artificial intelligence from being chairman of the Select Committee on Artificial Intelligence. I am sure that he is an expert anyway, but it will only increase his expertise. I thank other noble Lords for their contributions, which raise important issues about the increasing use of automated decision-making, particularly in the online world. It is a broad category, including everything from personalised music playlists to quotes for home insurance and far beyond that.
The noble Lord, Lord Stevenson, before speaking to his amendments, warned about some of the things that we need to think about. He contrasted it with the position on human embryology and fertility research and the HFEA, which is not exactly parallel because, of course, the genie is out of the bottle in that respect, and things were prevented from happening at least until the matter had been debated. But I take what the noble Lord said and agree with the issues that he raised. I think that we will discuss in a later group some of the ideas about how we debate those broader issues.
The noble Baroness, Lady Jones, talked about how she hoped that the repressive bits would be removed from the Bill. I did not completely understand her point, as this Bill is actually about giving data subjects increased rights, both in the GDPR and the law enforcement directive. That will take direct effect, but we are also applying those GDPR rights to other areas not subject to EU jurisdiction. I shall come on to her amendment on the Human Rights Act in a minute—but we agree with her that human beings should be involved in significant decisions. That is exactly what the Bill tries to do. We realise that data subjects should have rights when they are confronted by significant decisions made about them by machines.
The Bill recognises the need to ensure that such processing is correctly regulated. That is why it includes safeguards, such as the right to be informed of automated processing as soon as reasonably practicable and the right to challenge an automated decision made by the controller. The noble Lord, Lord Clement-Jones, alluded to some of these things. We believe that Clauses 13, 47, 48, 94 and 95 provide adequate and proportionate safeguards to protect data subjects of all ages, adults as well as children. I can give some more examples, because it is important to recognise data rights. For example, Clause 47 is clear that individuals should not be subject to a decision based solely on automated processing if that decision significantly and adversely impacts on them, either legally or otherwise, unless required by law. If that decision is required by law, Clause 48 specifies the safeguards that controllers should apply to ensure the impact on the individual is minimised. Critically, that includes informing the data subject that a decision has been taken and providing them with 21 days within which to ask the controller to reconsider the decision or retake the decision with human intervention.
I turn to Amendments 74, 134 and 136, proposed by the noble Lord, Lord Clement-Jones, which seek to insert into Parts 2 and 3 of the Bill a definition of the term,
“based solely on automated processing”,
to provide that human intervention must be meaningful. I do not disagree with the meaning of the phrase put forward by the noble Lord. Indeed, I think that that is precisely the meaning that that phrase already has. The test here is what type of processing the decision having legal or significant effects is based on. Mere human presence or token human involvement will not be enough. The purported human involvement has to be meaningful; it has to address the basis for the decision. If a decision was based solely on automated processing, it could not have meaningful input by a natural person. On that basis, I am confident that there is no need to amend the Bill to clarify this definition further.
In relation to Amendments 74A and 133A, the intention here seems to be to prevent any automated decision-making that impacts on a child. By and large, the provisions of the GDPR and of the Bill, Clause 8 aside, apply equally to all data subjects, regardless of age. We are not persuaded of the case for different treatment here. The important point is that the stringent safeguards in the Bill apply equally to all ages. It seems odd to suggest that the NHS could, at some future point, use automated decision-making, with appropriate safeguards, to decide on the eligibility for a particular vaccine—
My Lords, I hesitate to interrupt the Minister, but it is written down in the recital that such a measure,
“should not concern a child”.
The whole of that recital is to do with automated processing, as it is called in the recital. The interpretation of that recital is going to be rather important.
My Lords, I was coming to recital 71. In the example I gave, it seems odd to suggest that the NHS could at some future point use automated decision-making with appropriate safeguards to decide on the eligibility for a particular vaccine of an 82 year-old, but not a two year-old.
The noble Lord referred to the rather odd wording of recital 71. On this point, we agree with the Article 29 working party—the group of European regulators—that it should be read as discouraging as a matter of best practice automated decision-making with significant effects on children. However, as I have already said, there can and will be cases where it is appropriate, and the Bill rightly makes provision for those.
Would the Minister like to give chapter and verse on how that distinction is made?
I think that “chapter and verse” implies “written”—and I will certainly do that because it is important to write to all noble Lords who have participated in this debate. As we have found in many of these areas, we need to get these things right. If I am to provide clarification, I will want to check—so I will take that back.
I apologise for interrupting again. This is a bit like a dialogue, in a funny sort of way. If the Minister’s notes do not refer to the Article 29 working party, and whether or not we will continue to take guidance from it, could he include that in his letter as well?
I will. I had some inspiration from elsewhere on that very subject—but it was then withdrawn, so I will take up the offer to write on that. However, I take the noble Lord’s point.
We do not think that Amendment 75 would work. It seeks to prevent any decision being taken on the basis of automated decision-making where the decision would “engage” the rights of the data subject under the Human Rights Act. Arguably, such a provision would wholly negate the provisions in respect of automated decision-making as it would be possible to argue that any decision based on automated decision-making at the very least engaged the data subject’s right to have their private life respected under Article 8 of the European Convention on Human Rights, even if it was entirely lawful. All decisions relating to the processing of personal data engage an individual’s human rights, so it would not be appropriate to exclude automated decisions on this basis. The purpose of the Bill is to ensure that we reflect processing in the digital age—and that includes automated processing. This will often be a legitimate form of processing, but it is right that the Bill should recognise the additional sensitivities that surround it. There must be sufficient checks and balances and the Bill achieves this in Clauses 13 and 48 by ensuring appropriate notification requirements and the right to have a decision reassessed by non-automated means.
My Lords, I rather hope that the Minister has not been able to persuade noble Lords opposite. Certainly, I have not felt myself persuaded. First, on the point about “solely”, in recruiting these days, when big companies need to reduce a couple of thousand applications to 100, the general practice is that you put everything into an automated process—you do not really know how it works—get a set of scores at the end and decide where the boundary lies according to how much time you have to interview people. Therefore, there is human intervention—of course there is. You are looking at the output and making the decision about who gets interviewed and who does not. That is a human decision, but it is based on the data coming out of the algorithm without understanding the algorithm. It is easy for an algorithm to be racist. I just googled “pictures of Europeans”. You get a page of black faces. Somewhere in the Google algorithm, a bit of compensation is going on. With a big algorithm like that, they have not checked what the result of that search would be, but it comes out that way. It has been equally possible to carry out searches, as at various times in the past, which were similarly off-beam with other groups in society.
When you compile an algorithm to work with applications, you start off, perhaps, by looking at, “Who succeeds in my company now? What are their characteristics?”. Then you go through and you say, “You are not allowed to look at whether the person is a man or a woman, or black or white”, but perhaps you are measuring other things that vary with those characteristics and which you have not noticed, or some combinations. An AI algorithm can be entirely unmappable. It is just a learning algorithm; there is no mental process that a human can track. It just learns from what is there. It says, “Give me a lot of data about your employees and how successful they are and I will find you people like that”.
At the end of the day, you need to be able to test these algorithms. The Minister may remember that I posed that challenge in a previous amendment to a previous Bill. I was told then that a report was coming out from the Royal Society that would look at how we should set about testing algorithms. I have not seen that report, but has the Minister seen it? Does he know when it is coming out or what lines of thinking the Royal Society is developing? We absolutely need something practical so that when I apply for a job and I think I have been hard done by, I have some way to do something about it. Somebody has to be able to test the algorithm. As a private individual, how do you get that done? How do you test a recruitment algorithm? Are you allowed to invent 100 fictitious characters to put through the system, or should the state take an interest in this and audit it?
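The paired-testing idea raised here can be illustrated with a toy sketch. Both the scorer and the applicant fields below are hypothetical stand-ins, with a bias planted deliberately so the audit has something to find; a real audit would probe the deployed system rather than a mock:

```python
# Hypothetical audit: probe an opaque scoring function with matched pairs
# of fictitious applicants that differ in only one attribute, and measure
# the gap in outcomes.
def opaque_scorer(applicant: dict) -> float:
    # Stand-in for an unmappable model; penalises one group via a proxy.
    score = applicant["years_experience"] * 10.0
    if applicant["first_name"] in {"mary", "linda"}:  # planted bias
        score -= 15.0
    return score

def paired_audit(scorer, base: dict, attribute: str, values) -> dict:
    """Score copies of `base` that vary only in `attribute`."""
    results = {}
    for v in values:
        candidate = dict(base, **{attribute: v})  # identical except for one field
        results[v] = scorer(candidate)
    return results

base_cv = {"first_name": "james", "years_experience": 5}
gaps = paired_audit(opaque_scorer, base_cv, "first_name", ["james", "mary"])
print(gaps["james"] - gaps["mary"])  # 15.0 -- evidence of disparate treatment
```

The technique needs no access to the algorithm’s internals, which is exactly why the question of who is permitted to submit fictitious applicants, the individual or an auditor acting for the state, matters.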
We have made so much effort in my lifetime, and we have got so much better at being equal, though of course we still have a fair way to go, continually doing our best to make things better with regard to discrimination. It is therefore important that we do not allow ourselves to go backwards because we do not understand what is going on inside a computer. So absolutely, there has to be significant human involvement for it to be regarded as a human decision. Generally, where there is not, there has to be a way to get a human challenge—a proper human review—not just the response, “We are sure that the system worked right”. There has to be a non-discriminatory way in which something is looked at to see whether it is working and whether it has gone right. We should not allow automation into bits of the system that affect the way we interact with each other in society. Therefore, it is important that we pursue this and I very much hope that noble Lords opposite will give us another chance to look at this area when we come to Report.
My Lords, I thank all noble Lords who spoke in the debate. It has been wide-ranging but extremely interesting, as evidenced by the fact that at one point three members of the Artificial Intelligence Select Committee were speaking. That demonstrates that currently we live, eat and breathe artificial intelligence, algorithms and all matters related to them. It is a highly engaged committee. Of course, whatever I put forward from these Benches is not—yet—part of the recommendations of that committee, which, no doubt, will report in due course in March.
I highlight that we do not disagree with that. I will study carefully what my noble friend Lord Lucas said. We agree that it is important that privacy rights continue to be protected, and we do not expect data subjects to have their lives run by computer alone. That is exactly why the Bill creates safeguards: to make sure that individuals can request not to be the subject of decisions made automatically if it might have a significant legal effect on them. They are also allowed to demand that a human being participate meaningfully in those decisions that affect them. I will look at what my noble friend said and include that in my write-round. However, as I said, we do not disagree with that. The illusion that we have got to a stage where our lives will be run unaccountably by computers is exactly what the Bill is trying to prevent.
My Lords, I would not want to give that impression. None of us are gloom merchants in this respect. We want to be able to harness the new technology in a way that is appropriate and beneficial for us, and we do that by setting the right framework in data protection, ethical behaviour and so on.
I am grateful to the Minister for engaging in the way he has on the amendments. It is extremely important to probe each of those areas of Clauses 13, 47 and 48. For instance, there are lacunae. The Minister talked about the right to be informed and the right to challenge, and so on, and said that these provided adequate and proportionate safeguards, but the right to explanation is not absolutely enshrined, even though it is mentioned in the GDPR. So in some areas we will probe on that.
My Lords, if it is mentioned in the GDPR, then it is there.
Yes, my Lords, but it is in the recital, so I think we come back again to whether the recitals form part of the Bill. That is what I believe to be the case. I may have to write to the Minister. Who knows? Anything is possible.
One of the key points—raised by the noble Lord, Lord Lucas—is the question of human intervention being meaningful. To me, “solely”, in the ordinary meaning of the word, means that human intervention is not there at all, and that is a real worry. The writ of the Article 29 working party may run until Brexit but, frankly, after Brexit we will not be part of the Article 29 working party, so what interpretation of the GDPR will we have when it is incorporated into UK domestic law? If those rights are not to be granted, the interpretation of “solely” with the absolute requirement of human involvement needs to be on the face of the Bill.
As far as recital 71 is concerned, I think that the Minister will write with his interpretation and about the impact of the Article 29 working party and whether we incorporate its views. If the Government are not prepared to accept that the rulings of the European Court of Justice will be effective in UK law after Brexit, I can only assume that the Article 29 working party will have no more impact. Therefore, there is a real issue there.
I take the Minister’s point about safeguards under the Equality Act. That is important and there are other aspects that we will no doubt wish to look at very carefully. I was not overly convinced by his answer to Amendment 75, spoken to by the noble Baroness, Lady Jones, and my noble friend Lady Hamwee, because he said, “Well, it’s all there anyway”. I do not think we would have had to incorporate those words unless we felt there was a gap in the way the clause operated.
I will not take the arguments any further but I am not quite as optimistic as the Minister about the impact of that part of the Bill, and we may well come back to various forms of this subject on Report. However, it would be helpful if the Minister indicated the guidance the ICO is adopting in respect of the issue raised in Amendment 153A. When he writes, perhaps he could direct us to those aspects of the guidance that will be applicable in order to help us decide whether to come back to Amendment 153A. In the meantime, I beg leave to withdraw.
My Lords, it always used to be said that reaching the end of your Lordships’ day was the graveyard slot. This is a bit of a vice slot. You are tempted by the growing number of people coming in to do a bit of grandstanding and to tell them what they are missing in this wonderful Bill that we are discussing. You are also conscious that the dinner hour approaches—and I blame the noble Baroness, Lady Hamwee, for that. All her talk of dining in L’Algorithme, where she almost certainly had a soup, a main course and a pudding, means that it is almost impossible to concentrate for the six minutes that we will be allowed—with perhaps a few minutes more if we can be indulged—to finish this very important group. It has only one amendment in it. If noble Lords did not know that, I bet that has cheered them up. I am happy to say that it is also a réchauffage, because we have already discussed most of the main issues, so I will be very brief in moving it.
It is quite clear from our discussion on the previous group that we need an ethics body to look at the issues that we were talking about either explicitly or implicitly in our debates on the previous three or four groups and to look also at moral and other issues relating to the work on data, data protection, automatics and robotics, and everything else that is going forward in this exciting field. The proposal in Amendment 78A comes with a terrific pedigree. It has been brought together by members of the Royal Society, the British Academy, the Royal Statistical Society and the Nuffield Trust. It is therefore untouchable in terms of its aspirations and its attempt to get to the heart of what should be in the contextual area around the new Bill.
I shall not go through the various points that we made in relation to people’s fears, but the key issue is trust. As I said on the previous group, if there is no trust in what is set up under the Bill, there will not be a buy-in by the general public. People will be concerned about it. The computer will be blamed for ills that are not down to it, in much the same way that earlier generations always blamed issues external to themselves for the way that their lives were being lived. Shakespeare’s Globe was built outside the city walls because it was felt that the terribly dangerous plays that were being put on there would upset the lieges. It is why penny dreadfuls were banned in the early part of the last century and why we had a fight about video nasties. It is that sort of approach and mentality that we want to get round to.
There is good—substantial good—to be found in the work on automation and robotics that we are now seeing. We want to protect that but in the Bill we are missing a place and a space within which the big issues of the day can be looked at. Some of the issues that we have already talked about could easily fit with the idea of an independent data ethics advisory board to monitor further technical advances in the use and management of personal data and the implications of that. I recommend this proposal to the Committee and beg to move.
My Lords, the noble Lord, Lord Stevenson, has been admirably brief in the pre-dinner minutes before us and I will be brief as well. This is a very important aspect of the debate and, despite the fact that we will be taking only a few minutes over it, I hope that we will return to it at a future date.
I note that the Conservative manifesto talked about a data ethics body, and this is not that far away from that concept. I think that the political world is coalescing around the idea of an ethics stewardship body of the kind recommended by the Royal Society and the British Academy. Whatever we call it—a rose by any other name—it will be of huge importance for the future, perhaps not as a regulator but certainly as a setter of principles and of an ethical context in which AI in particular moves forward.
The only sad thing about having to speed up the process today is that I am not able to take full advantage of the briefing put forward by the Royal Society. Crucially, it recommends two things. The first is:
“A set of high-level principles to help visibly shape all forms of data governance and ensure trustworthiness and trust in the management and use of data as a whole”.
The second is:
“A body to steward the evolution of the governance landscape as a whole. Such a stewardship body would be expected to conduct expert investigation into novel questions and issues, and enable new ways to anticipate the future consequences of today’s decisions”.
This is an idea whose time has come and I congratulate the noble Lords, Lord Stevenson and Lord Kennedy, on having tabled the amendment. I certainly think that this is the way forward.
My Lords, the noble Lord, Lord Stevenson, has raised the important issue of data ethics. I am grateful to everyone who has spoken on this issue tonight and has agreed that it is very important. I assure noble Lords that we agree with that. We had a debate the other day on this issue and I am sure we will have many more in the future. The noble Lord, Lord Puttnam, has been to see me to talk about this, and I tried to convince him then that we were taking it seriously. By the sound of it, I am not sure that I completely succeeded, but we are. We understand the points he makes, although I am possibly not as gloomy about things as he is.
We are fortunate in the UK to have the widely respected Information Commissioner to provide expert advice on data protection issues—I accept that that advice is just on data protection issues—but we recognise the need for further credible and expert advice on the broader issue of the ethical use of data. That is exactly why we committed to setting up an expert advisory data ethics body in the 2017 manifesto, which, I am glad to hear, the noble Lord, Lord Clement-Jones, read carefully.
We like to hold the Government to their manifesto commitments occasionally.
Tonight the noble Lord can because the Secretary of State is leading on this important matter. She is as committed as I am to ensuring that such a body is set up shortly. She has been consulting widely with civil society groups, industry and academia, some of which has been mentioned tonight, to refine the scope and functions of the body. It will work closely with the Information Commissioner and other regulators. As the noble Lords, Lord Clement-Jones and Lord Patel, mentioned, it will identify gaps in the regulatory landscape and provide Ministers with advice on addressing those gaps.
It is important that the new advisory body has a clearly defined role and a strong relationship to other bodies in this space, including the Information Commissioner. The Government’s proposals are for an advisory body which may have a broader remit than that suggested in the amendment. It will provide recommendations on the ethics of data use in areas where there are gaps in the regulatory landscape, as I have just said. For example, one fruitful area could be the ethics of exploiting aggregated anonymised datasets for social and commercial benefit, taking into account the importance of transparency and accountability. These aggregated datasets do not fall under the legal definition of personal data and would therefore be outside the scope of both the body proposed by the noble Lord and, I suspect, this Bill.
Technically, Amendment 78 needs to be more carefully drafted to avoid the risk of non-compliance with the GDPR and avoid conflict with the Information Commissioner. Article 51 of the GDPR requires each member state to appoint one or more independent public authorities to monitor and enforce the GDPR on its territory as a supervisory authority. Clause 113 makes the Information Commissioner the UK’s sole supervisory authority for data protection. The functions of any advisory data ethics body must not cut across the Information Commissioner’s performance of its functions under the GDPR.
The amendment proposes that the advisory board should,
“monitor further technical advances in the use and management of personal data”.
But one of the Information Commissioner’s key functions is to
“keep abreast of evolving technology”.
That is a potential conflict we must avoid. The noble Lord, Lord Patel, alluded to some of the conflicts.
Nevertheless, I agree with the importance that noble Lords place on the consideration of the ethics of data use, and I repeat that the Government are determined to make progress in this area. However, as I explained, I cannot agree to Amendment 78 tonight. Therefore, in the light of my explanation, I hope the noble Lord will feel able to withdraw it.
Lords Chamber
My Lords, this amendment arises from concerns, expressed by a number of organisations, notably techUK, about the narrowness of the derogations based on Article 89 of the GDPR for research, statistics and archiving. The argument is that there should be a derogation similar to Section 33 of the Data Protection Act 1998. That Act makes provision for exemptions for research and development where suitable safeguards are in place. The GDPR limits this to scientific and historical research, but member states are able to legislate for additional exemptions where safeguards are in place.
The organisation techUK and others believe that the Bill’s provision for scientific and historical research should be broadened, involving the same provisions as Section 33 of the Data Protection Act 1998, and that the definition of scientific and historical research needs clarification. For example, it is not clear whether it would include computer science engineering research. I very much hope that the Minister will be able to clarify that. I recognise that the amendment leads the line in this group but may not be followed in exactly the same way. I beg to move.
My Lords, I shall speak to Amendment 86BA, in my name. It concerns the application of data protection principles in the context of the law of trusts. The law has long recognised that a trustee is not obliged to disclose to a beneficiary the trustee’s confidential reasons for exercising or not exercising a discretionary power. This is known as the Londonderry principle, named after a case decided by the Court of Appeal and reported at [1965] Ch 918. The rationale of this principle was helpfully summarised by Mr Justice Briggs—recently elevated to the Supreme Court—in Breakspear v Ackland [2009] Ch 32, at paragraph 54.
The principle is that the exercise by trustees of their discretionary powers is confidential. It is in the interests of the beneficiaries, because it enables the trustees to make discreet but thorough inquiries as to the competing claims for consideration for benefit. Mr Justice Briggs added that such confidentiality also advances the proper interests of the administration of trusts, because it reduces the scope for litigation about how trustees have exercised their discretion, and encourages suitable people to accept office as trustees, undeterred by a concern that their discretionary deliberations might be challenged by disappointed or hostile beneficiaries and that they will be subject to litigation in the courts.
There is, of course, a public interest here, which is protected by the inherent jurisdiction of the court to supervise and, where appropriate, intervene in the administration of trusts, as the noble and learned Lord, Lord Walker of Gestingthorpe, stated for the Judicial Committee of the Privy Council in Schmidt v Rosewood Trust Ltd [2003] 2 AC 709.
The problem is that, as presently drafted, the Bill would confer a right on beneficiaries to see information about themselves unless a specific exemption is included. A recent Court of Appeal judgment in Dawson-Damer v Taylor Wessing [2017] EWCA Civ 74 drew attention to the general applicability of data protection law in this context unless a specific exemption is enacted.
My understanding, which is indirect—I declare an interest as a barrister, but this is not an area in which I normally practise—is that in other jurisdictions such as Jersey, the data protection legislation contains a statutory restriction on the rights of a data subject to make a subject access request where that would intrude on the trustees’ confidentiality under the Londonderry principle. Indeed, I am told that those who practise in this area are very concerned that offshore trustees and offshore professionals who provide trust services are already actively encouraging the transfer of trust business away from this jurisdiction because of the data protection rights which apply here, and which will apply under the Bill.
The irony is that the data protection law is driving trust business towards less transparent offshore jurisdictions and away from the better regulated English trust management businesses. I have received persuasive representations on this subject from the Trust Law Committee, a group of leading academics and practitioners, and I acknowledge the considerable assistance I have received on this matter from Simon Taube QC and James MacDougald.
This is plainly a very technical matter, but it is one of real public interest. I hope that the Minister will be able to consider this issue favourably before Report.
My Lords, I thank the Minister for that tour de force. This group is an extraordinary collection of different aspects such as research, trusts and professional privilege. He even shed light on some opaque amendments to opaque parts of the Bill in dealing with Amendments 86A, 86B and 86C. The noble Lord, Lord Griffiths, was manful in his description of what his amendments were designed to do. I lost the plot fairly early on.
I thank the Minister particularly for his approach to the research aspect. However, we are back again to the recitals. I would be grateful if he could give us chapter and verse on which recitals he is relying on. He said that without the provisions of the Bill that we find unsatisfactory, research would be crippled. There is a view that he is relying on some fair stretching of the correct interpretation of the words “scientific” and “historical”, especially if it is to cover the kinds of things that the noble Lord, Lord Lucas, has been talking about. Many others are concerned about other forms of research, such as cyber research. There are so many other aspects. TechUK does not take up cudgels unless it is convinced that there is an underlying problem. This brings us back, again, to the question of recitals not being part of the Bill—
I support the noble Lord on this. Coming back to his earlier example, if you were told a sandwich was solely made of vegetable, the Minister is saying that that means it has not got much meat in it. This is Brussels language. I do not think it is the way in which our courts will interpret these words when we have sole control of them. If, as I am delighted to learn, we are going to implement our 2017 manifesto in its better bits, including Brexit, this is something we will have to face up to. This appears to be another occasion where “scientific” does not bear the weight the Bill is trying to put on it. It is not scientific research which is happening with the NPD. It is research, but it is not scientific.
I agree with that. Again we are relying on the interpretation in whichever recital the Minister has in his briefing. It would be useful to have a letter from him on that score and a description of how it is going to be binding. How is that interpretation which he is praying in aid in the recitals going to be binding in future on our courts? The recitals are not part of the Bill. We probably talked about this on the first day.
This was included in the letter I was sent today. I am afraid the noble Lord has not got it. The noble Lord, Lord Kennedy, helpfully withdrew his amendment before I was able to say anything the other night but the EU withdrawal Bill will convert the full text of direct EU instruments into UK law. This includes recitals, which will retain their status as an interpretive aid.
My Lords, we will see if the EU withdrawal Bill gets passed, but that is a matter for another day.
I thank the Minister for his remarks. There are many aspects of his reply which Members around the House will wish to unpick.
Perhaps I may pursue this for a second. It is late in the evening and I am not moving fast enough in my brain, but the recitals have been discussed time and again and it is great that we are now getting a narrow understanding of where they go. I thought we were transposing the GDPR, after 25 May and after Brexit, through Schedule 6. However, Schedule 6 does not mention the recitals, so if the Minister can explain how this magic translation will happen I will be very grateful.
I knew I was slow. We are moving to applied GDPR; that is correct. The applied GDPR, as I read it in the book—that great wonderful dossier that I have forgotten to table; I am sure the box can supply it when we need it—does not contain the recitals.
My Lords, just to heap Pelion on Ossa, I assume that until 29 March the recitals are not part of UK law.
They will be part of UK law, because the withdrawal Bill will convert the full text into UK law. There will of course be a difference between the recitals and the articles; it will be like a statutory instrument, where the Explanatory Memorandum is part of the text of the instrument.
May I add to this fascinating debate? Does this not illustrate one of the problems of the withdrawal Bill—that in many areas, of which this is one, there will be two potentially conflicting sources of English law? There will be this Act, on data protection, and the direct implementation through the EU withdrawal Bill on the same subject. The two may conflict because this Act will not contain the recitals.
My Lords, all I can say is that I do not know how the legal profession will cope in the circumstances.
One thing we can all be certain of is that the legal profession will cope.
The Minister will be delighted to hear that I will speak only briefly to this amendment, because I do not want to steal my noble friend Lady Hamwee’s thunder. This amendment would remove the exemption from data subjects’ rights where personal data is being processed for the maintenance of effective immigration control, or for the investigation or detection of activities that would undermine it. The amendment would remove paragraph 4 of Schedule 2 in its entirety. There is no attempt to define this new objective; nowhere in the Bill or its Explanatory Notes are the notions of effective immigration control, or the activities requiring its maintenance, defined.
The immigration exemption is new in the Bill; there was no direct equivalent under the Data Protection Act 1998. It is a broad and wide-ranging exemption that is open to abuse. The exemption should be removed altogether, as there are other exemptions in the Bill on which the immigration authorities can, and should, rely for the processing of personal data in accordance with their statutory duties and functions. The current provision, under the heading “Immigration”, removes all rights from a data subject that the Home Office wishes it did not have. Such removals are not restricted to those who have been found guilty of immigration offences, but apply to every data subject, including those affected by Home Office clerical errors. It is exactly those errors that data protection regulates.
In particular, there is a concern that the application of the effective immigration control exemption will become an administrative device to disadvantage data subjects using the immigration appeals process. Since the exemption has nothing to do with crime, national security, public safety or the protection of sources, and no rational explanation for it has been given, such a prospect appears a distinct possibility. The immigration authorities should be able to justify the inclusion of this exemption on the basis of hard evidence. The Home Office should be able to provide examples of subject access requests where personal data were released to the detriment of the public interest.
This is not the first time the Government have attempted to limit data protection rights on immigration control grounds. Clause 28 of the Data Protection Bill 1983 had an identical aim, setting out broad exemptions to data subject rights on grounds of crime, national security and immigration control. The Data Protection Committee, then chaired by Sir Norman Lindop, said that the clause would be,
“a palpable fraud upon the public if … allowed to become law”,
because it allowed data acquired for one purpose to be processed for another. In the House of Lords, my late and much-missed noble friend Lord Avebury mounted a robust and ultimately successful opposition to Clause 28 in 1983. He raised concerns almost identical to those we raise today. His objections and those of several Members of the House have the same resonance now as they did then. I beg to move.
I thank my noble friend for that. In the meantime, I think my words should be reread, particularly my point about it not being a wholesale carve-out but quite a narrow exemption. I will write to noble Lords. I thought I might home in on one question that the noble Baroness, Lady Hamwee, asked about relying on this in the investigation, detection and prevention of crime. Of course, that is not always the correct and proportionate response to persons who are in the UK without lawful authority and may not be the correct remedy. I will write to noble Lords, and I hope that the noble Lord will feel happy to withdraw the amendment.
My Lords, I thank the Minister. For a Home Office Minister she has a wonderful ability to create a sense of reassurance, which is quite dangerous. I am afraid that for all her well-chosen words, these Benches are not convinced. In particular, I noticed that she started off by saying, “This is only a very limited measure; it does not set aside everything”. But paragraph 1 sets aside nine particular aspects, all of which are pretty important. This provision is not a pussycat; it is very important.
I thank all those who spoke, including the noble Baroness, Lady Jones, and the noble Lord, Lord Lucas. I thought the support from the noble Lord, Lord Kennedy, for this amendment—I called him the right name this time—was rather more equivocal, and I hope he has not been persuaded by the noble Baroness’s siren song this evening. This is a classic example of the Home Office taking off the shelf and dusting off a provision which it has been dying to put on the statute book for years. The other rather telling point is that the noble Baroness said there is express provision for such a derogation in the GDPR. But that is no reason to adopt it—just because it is possible, it is not necessarily desirable. But no, they say, let us adopt a nice derogation of this kind when it is actually not necessary.
As my noble friend pointed out, the Minister has not actually adduced any example which was not covered by existing exemptions, for instance, criminal offences. We will read with great care what the Minister has said, but I do not think that the “Why now?” question has really been answered this evening. In the meantime, I beg leave to withdraw the amendment.
Lords Chamber
My Lords, I beg to move Amendment 21A and also speak to Amendment 66A. I also support Amendments 41 and 44, but my noble friend Lord McNally will speak in support of those.
The issue in question is the need for a lawful basis for biometric data used in the context of identity verification and authentication to increase security. Biometric data changes its status under the GDPR and becomes a new category of sensitive data. That narrows the lawful basis on which companies can collect and use biometric data, and it makes this processing of data difficult or impossible because the only lawful basis available is consent, which is not appropriate or feasible in the circumstances.
Biometrics are increasingly being used in different sectors for identity verification and authentication, both as a security measure and to provide greater identity assurance. I am sure that anybody who has used the fingerprint security aspect of an iPad will be aware of that. Employers are also increasingly using biometric access controls for premises or parts of premises that require high security levels and access audit trails. Organisations using biometrics for additional security and assurance also need to keep their mechanisms up to date, and continually test and develop ways in which to prevent bad actors from hacking or gaming their systems. That research and development activity also requires biometric data processing and can involve AI or machine learning to train and test systems.
The Bill has a fraud prevention lawful basis for processing sensitive data, under a heading of “substantial public interest”. However, even assuming that the Bill is clarified and the fraud prevention lawful basis is available to use without having to satisfy an additional “substantial public interest” test, it is not suitable for the biometric uses described. The problem is the risk that necessary and desirable processing of biometric data will not be possible. Increased security benefits everyone, and it would not be desirable for the law protecting the use of personal data to be the barrier to organisations implementing better security for individuals.
The solution is that we acknowledge that the GDPR allows additional lawful bases for processing sensitive data. Specifically, Article 9(4) allows member states to add lawful bases for processing biometric, genetic or health data. The essence is that we use the option available under that article to add a lawful basis, as set out in the amendments. The amendments may not be technically perfect, but I hope that the Minister will agree that they are heading in the right direction. The proposed additional lawful basis covers three biometric data processing activities, described above. There are already safeguards for individuals in the GDPR regarding biometric data processing, as any large-scale processing of sensitive data is subject to a data protection impact assessment, which would be the case for identity verification or authentication as an integral and ongoing security or assurance feature of the service that the individual has chosen to use. The proposed amendment would also introduce this safeguard as a requirement for employee biometric access control processing. I beg to move.
I agree. I have the same. You have to put in your numerical password every so often just to check that you have still got the same finger. Technically, you might not have.
The amendments also seek to permit the processing of such data when biometric identification devices are installed by employers to allow employees to gain access to work premises or when the controller is using the data for internal purposes to improve ID verification mechanisms. I am grateful to the noble Lord for raising this important issue because the use of biometric verification devices is likely only to increase in the coming years. At the moment, our initial view is that, given the current range of processing conditions provided in Schedule 1 to the Bill, no further provision is needed to facilitate the activities to which the noble Lord referred. However, this is a technical issue and so I am happy to write to the noble Lord to set out our reasoning on that point. Of course, this may not be the case in relation to the application of future technology, and we have already discussed the need for delegated powers in the Bill to ensure that the law can keep pace. I think we will discuss that again in a later group.
On this basis, I hope I have tackled the noble Lord’s concerns, and I would be grateful if he will withdraw the amendment.
My Lords, as usual the noble Lord, Lord Maxton, has put his finger on the problem. If we have iris recognition, he will keep his eye on the matter.
I thank the Minister for his explanation of the multifarious amendments and welcome the maiden speech from the Front Bench by the noble Lord, Lord Griffiths. I do not think I can better my noble friend Lord McNally’s description of his ascent to greatness in this matter. I suspect that in essence it means that the noble Lord, Lord Griffiths, like me, picks up all the worst technical amendments which are the most difficult to explain in a short speech.
I thought the Minister rather short-changed some of the amendments, but I will rely on Hansard at a later date, and I am sure the Opposition Front Bench will do the same when we come to it. The particular area where he was disappointing was on what you might call the Thomson Reuters perspective, and I am sure that we will want to examine very carefully what the Minister had to say because it could be of considerable significance if there is no suitable exemption to allow that kind of fraud prevention to take place. Although he said he had an open mind, I was rather surprised by his approach to Amendments 45A and 64 which were tabled by the noble Baroness, Lady Neville-Jones. One will have to unpick carefully what he said.
The bulk of what I want to respond to is what the Minister said about biometrics. I took quite a lot of comfort from what he said because he did not start quoting chapter and verse at me, which I think means that nobody has quite yet worked out where this biometric data fits and where there might be suitable exemptions. There is a general feeling that somewhere in the Bill or the schedules we will find something that will cover it. I think that may be an overoptimistic view, but I look forward to receiving the Minister’s letter. In the meantime, I beg leave to withdraw the amendment.
My Lords, I have two sets of amendments in this group. The first ones are actually amendments to that of the noble Lord, Lord Arbuthnot, because, like him, I think it would be useful, given the range of delegated powers within the Bill, if we wrote the super-affirmative resolution into the Bill. If we do not succeed in greatly reducing the amount of delegated legislation that is permitted under the Bill—although I hope my noble friend Lord Stevenson and others do—we need to treat that delegated legislation when it is brought forward in a way that is more intensive, consultative and engaging than our normal simple affirmative resolutions.
So I support the principle of the amendment of the noble Lord, Lord Arbuthnot, and the noble Baroness, Lady Neville-Rolfe. My Amendments 182A to 182C would simply add an additional dimension. As I read the amendment at the moment, it is emphatic on getting the Government to identify the impact on industry, charities and public bodies. The main point that we are all concerned about is actually the impact on individuals, the data subjects, yet they are not explicitly referred to in the draft of the amendment before us. My three amendments would therefore effectively do two things: first, they would require the Minister to consult data subjects or organisations representing them, such as consumer organisations, as well as those stipulated in the amendment as it stands; and, secondly, they would ensure that the impact assessments related to the impact on individuals as well as on organisations. I hope that the noble Lord would agree to my amendments at whatever point he and the noble Baroness propose to put this to the vote, in which case I could fully support their amendment.
My Amendment 22A is a specific example of the themes that my noble friend Lord Stevenson and the noble Baroness, Lady Jones, have already spelled out. I will not repeat everything they said but it is a particularly egregious form in that it allows the Minister—the noble Baroness, Lady Jones, has already referred to this—to add, vary or omit any safeguard that is in Schedule 1. I particularly object to “omit”. That does not simply mean modifying or tinkering in order to keep up with the technology; rather, it means omitting a serious safeguard that has been put in the Bill during its passage through Parliament.
Since Schedule 1 is pretty wide ranging, this could include issues that related to legal proceedings, crime, taxation, insurance, banking, immigration, public health or indeed any aspect of the public interest. That is a huge range of potential removal of safeguards that would not be subject to the approval of this House through primary legislation. If the safeguards persist and are maintained through the Bill when it eventually emerges, the ability of Ministers to vary them so drastically should be curtailed. I understand that my amendment would be pre-empted if my noble friend Lord Stevenson’s amendments were carried—but if they are not we definitely need to alter that clause.
This is a complex Bill because of the technology and because of the juxtaposition between European legislation and the position we are currently in with regard to it. The Bill is also an exemplar of what we are going to go through in Brexit-related legislation in a much wider sense. We must get right how we deal with delegated legislation post Brexit, and we need to ensure that the Bill is an example and does not concede powers to Henry VIII or indeed to the Minister that we might regret when his successors make use of them later.
My Lords, I can be very brief. I have not yet quite got through the concept of the Minister as Henry VIII. There is a clear common theme coming through every speech in the House today. The issue is whether the Government’s arguments for the use of the powers contained in the various clauses that have been mentioned—my amendments from these Benches, Amendments 24 and 107, relate to Clauses 9 and 15, but there is a broader issue—are credible and whether their desire for flexibility is convincing. As many noble Lords have mentioned, the Delegated Powers Committee did not find them particularly credible and stated:
“We regard this as an insufficient and unconvincing explanation for such an important power”.
That applies to Clause 15, but we on these Benches believe that the power in Clause 9 should not be there in its present form, either.
We have tried to be constructive. We have put forward a suggestion, as has the noble Lord, Lord Arbuthnot, for the use of the super-affirmative power. That is extremely well known and is enshrined in legislation—so, unlike the noble Lord, we did not feel the need to spell out exactly what the procedure was because it is already contained in a piece of legislation that I will no doubt come across in my notes at some suitable moment. It is now an extremely common and useful way of giving the Government flexibility, while allowing sufficient consultation before any regulations come to the House by affirmative resolution. We recognise that this could be fast moving, so it may be appropriate that the Government have those powers, provided that they are governed by super-affirmative resolution.
(7 years, 1 month ago)
Lords Chamber
My Lords, that is not an unexpected question. I can assure the noble Lord that we are not putting this into the long grass. He is absolutely right that there was a six-week evidence-gathering session. The evidence gathered has convinced us of the need to take action and reduce the maximum FOBT stakes. However, it is a complex issue and not about stakes alone. We are therefore publishing today a package of measures to address the concerns. We must strike the right balance between the socially responsible growth of the industry and the protection of consumers and the communities they live in. Our position is that the maximum stake should be between £50 and £2. We are consulting on that specific issue. This has to be done with due process to avoid any further problems that might arise from doing it in too rushed a manner.
My Lords, Liberal Democrats have been calling for a £2 stake on these highly addictive machines, which have been a catalyst of problem gambling, social breakdown and serious crime in communities, for nearly a decade. We therefore give a qualified welcome to the review, but, rather like the noble Lord, Lord Griffiths, we are disappointed that a range of options rather than a firm recommendation is being given, and that we now have a 12-week consultation rather than action. Reducing the maximum stake to £50 would still mean that you could lose £750 in five minutes, or £300 if the stake was reduced to £20. I urge the Minister and his colleagues to resist Treasury pressure and move to take effective action by focusing on stake reduction to £2, which would put a clear and sensible limit on all high street machines. Can the Minister tell us what the role of the Gambling Commission has been and will be in the consultation? It has a duty to minimise gambling-related harm and protect children and the vulnerable. Will the Government act on that advice? Will the review examine the proliferation of betting shops on the high street and the self-referral or exclusion system, which is so ineffective? As well as reducing the maximum stake, will it look at limiting the spin rate? Finally, will the consultation address stakes in online equivalents to these games, such as blackjack?
My Lords, the noble Lord makes a predictable comment about Treasury pressure, of which there was none. The decision on stakes will come from DCMS and not from the Treasury—although it will take into account fiscal implications, as it does for any government policy. The Gambling Commission is involved in the consultation because it is involved also in the other package of measures covered by it. The consultation is not just on the stakes but on other matters such as tougher licence conditions. The noble Lord referred to spin rates. What one can lose where higher stakes are concerned depends on the spin rate. I can confirm that that will be included in the consultation. I urge the noble Lord and the noble Lord, Lord Griffiths, to contribute to the consultation and make their views known.
Lords Chamber
My Lords, I remind the Committee that this is an intensely practical issue. We have managed to lure many of our learned noble Lords from their chambers today—so clearly it has been a fairly expensive afternoon. I am only a humble solicitor and I tend to focus on what is practical and necessary for those whom we advise. The fundamental basis of these amendments is the concern in many sectors—manufacturing, retail, health, information technology and financial services in particular—that the free flow of data between ourselves and the EU continues post Brexit with minimum disruption. With an increasingly digital economy, this is critical for international trade.
We have been briefed by techUK, TheCityUK, the ABI, our own Lords EU affairs sub-committee, and the UK Information Commissioner herself. They have persuasively argued that we need to ensure that our data protection legislation is ruled as adequate for the purposes of permitting cross-border data flow into and out of the EU post Brexit. The first question that arises is: will the Government, even before any transition period, start the process needed to obtain an adequacy decision from the EU before we arrive at the status of a third country for EU data adequacy purposes?
However, as the Committee has heard today, if an adequacy ruling is to be sought, a major obstacle has been erected by the Government themselves in the European Union (Withdrawal) Bill, which makes it clear that the European Charter of Fundamental Rights will not become part of UK law as part of the replication process. Many noble Lords have spoken of their fears about the interaction with Article 8 of the charter, yet this article, relating to the protection of personal data, underpins the GDPR. How will we secure adequacy without adhering to the charter? Will the Government separately state that they will adhere to Article 8? We are not trying today to confer “special status”, in the words of the noble Lord, Lord Faulks, on Article 8. The wording of the amendment reflects Article 8, but it is designed to create certainty, post Brexit, for the sectors of business which I mentioned earlier.
Let us not forget that the EU Select Committee heard from witnesses who highlighted the ongoing role of the European Court of Justice and the continued relevance of the Charter of Fundamental Rights in relation to adequacy decisions. The amendment is not frivolous: it is essential to underpin an adequacy decision by the EU post Brexit. Does the House really want to put that decision at risk? I am sure that it does not. Whether now or in the future, we need to pass this kind of amendment. I look forward to hearing what the Minister has to say, which will determine whether or not the House divides.
My Lords, when I came into the Chamber, I had not the faintest intention of speaking in this debate. I do so, above all, for one reason: not because I am opposed to the amendment, although I am, very substantially, for the reasons given by the noble Lord, Lord Pannick. I do so because, in my experience, it is very unusual nowadays to vote at the outset of Committee stage on so fundamental a question as that raised by the amendment. It is surely yet more unusual—spectacularly so—to do so on a manuscript amendment filed this morning, which none of us has had sufficient time to deal with, on a very tricky area of the law, which so fundamentally alters the original amendment. As we have heard, that amendment was completely hopeless. The noble Lord, Lord Lester, described it as “constitutionally illiterate”. At least this one tries to introduce the concept of a balanced right which previously was missing.
It is true that I come from a different tradition where you do not vote on anything or decide anything unless you have heard the arguments. I rather gather that there may be a whipped vote on the other side, so the amendment is going to be voted on by noble Lords who have not heard the arguments of the noble Lords, Lord Pannick, Lord Faulks and Lord Lester, and who do not recognise the difficulties and the fundamental importance of this amendment. I seriously urge that it is not pressed to a Division today.
Lords Chamber
My Lords, in moving Amendment 5, I will also speak to Amendment 6. Both are in my name. I will respond later to Amendment 115, which is in the same group but was tabled by other noble Lords. Amendments 5 and 6 are probing amendments to try to tease out what appears to be a change of definition between various parts of the Act.
Amendment 5 relates to page 3 and Clause 3(1), (2) and (3) in Chapter 1, which raise concerns about what exactly is happening with the arrangements. It is easier if I read out the two subsections concerned. Clause 3(2) states that:
“Chapter 2 of this Part … applies to the types of processing of personal data to which the GDPR applies by virtue of Article 2 of the GDPR”.
That is the question I want to pursue, because later in the Bill, on page 11, Clause 19(1)(a) refers to activities which operate. This amendment is a probing one to try to tease out an answer that we can read in Hansard so as to know what exactly we are talking about. It may appear to be a narrow difference or nitpicking, but “an activity” is a very broad term for anything in relation to data processing and contrasts with the narrow way in which Clause 3(2)(a) talks about “types of processing”. Are these the same? If they are not, what differentiates the two? If they are different, why do we have different wording in different parts of the Bill?
Amendment 6 relates to page 3, line 31. This question of definition has come up in relation to Chapter 3 of the part. I understand this to be more of a recital, if I may use that word, than a particular piece of statute and it may not have normative effect, if that is the correct terminology. Clause 3(3)(b) says that the part to which this applies,
“makes provision for a regime broadly equivalent to the GDPR to apply to such processing”.
What is “broadly” in this context? Maybe I am obsessed with the use of English words that have common meanings, but again it would be helpful to have a bit more information on the definition from the Minister when he responds.
Unlike the “quite” used in response to an earlier amendment, this has no transatlantic resonances, but it is important in questions of adequacy in any agreement we might seek with the EU in the future. “Broadly equivalent” carries echoes of an adequacy agreement, which would assert that the arrangements of the two parties concerned—the EU on the one hand and the third country on the other—were sufficiently equivalent to allow for future reliance on the processes in the third country to be treated as appropriate for the transfer of data into and out of it in relation to future industrial processes.
We are aware that an element of legal decision-making arises, which might change that “broadly equivalent” to a higher bar of requirement in the sense that the court is beginning to think in terms of “essentially equivalent”, which is very different from “broadly equivalent”. Again, I would be grateful if the Minister could respond to that. I beg to move.
I will speak to Amendment 115 in this splendidly and creatively grouped set of amendments. The Government appear to have removed some of the extraterritorial elements in the GDPR in applying derogations in the Bill. Paragraph 9(d) of Schedule 6 removes all mention of “representative” from the Bill. This could have major consequences for data subjects.
Article 3 of the GDPR extends its provisions to the processing of personal data of data subjects in the European Union by a controller not established in the European Union. This happens when a controller is offering goods or services into the European Union. In such circumstances, article 27 requires a representative to be appointed in a member state, if a controller is not in the Union. This article is removed by paragraph 23 of Schedule 6.
Recital 80 of the GDPR explains the role of the representative:
“The representative should act on behalf of the controller or the processor and may be addressed by any supervisory authority … including cooperating with the competent supervisory authorities … to any action taken to ensure compliance with this Regulation. The designated representative should be subject to enforcement proceedings in the event of non-compliance by the controller or processor”.
Suppose that a company incorporated in the USA does not have a place of permanent establishment in the UK but still falls within article 3. Such a company could use its USA website to offer services to UK citizens without being caught by the Bill. Can the Minister reassure us that there is a solution to this problem?
My Lords, I am glad that the noble Lord, Lord Stevenson, has raised the question of the meaning of “broadly equivalent”. It encapsulates a difficulty I have found throughout the Bill: the language of the GDPR and of the law enforcement directive is more narrative and descriptive than language to which we are accustomed in UK legislation. Though one might say we should just apply a bit of common sense, that is not always the first thing to apply in interpreting UK legislation.
In this clause, there is another issue apart from the fact that “broadly equivalent” gives a lot of scope for variation. Although Clause 3 is an introduction to the part, if there are problems of interpretation later in Part 2, one might be tempted to go back to Clause 3 to find out what the part is about and be further misled or confused.
My Lords, I thank the Minister for that interesting exposition, which ranged from now into the future. He has given a vision of the post-Brexit shape of our data protection legislation. Extraterritoriality will apply even though the language used may be that of the applied GDPR as opposed to the GDPR itself—just to be confusing, perhaps as much as the Minister confused us.
I want to be absolutely clear that we are not derogating from the GDPR in extraterritoriality. That seems to be the nub of it. The Bill makes changes to the applied GDPR—I would like to read in Hansard exactly what the Minister said about the applied GDPR because I did not quite get the full logic of it—but there is no derogation in the GDPR on extraterritoriality. It would be helpful if he could be absolutely clear on that point.
Perhaps the Minister will respond to that because I, too, am troubled about the same point. If I am right, and I will read Hansard to make sure I am not misreading or mishearing what was said, the situation until such time as we leave through Brexit is covered by the GDPR. The extraterritorial—I cannot say it but you know what I am going to say—is still in place. Therefore, as suggested by the noble Lord, Lord Clement-Jones, a company operating out of a foreign country which was selling goods and services within the UK would have to have a representative, and that representative could be attached should there be a requirement to do so. It is strange that we are not doing that in the applied GDPR because, despite the great improvement that will come from better language, the issue is still the same. If there is someone that our laws cannot attach, there is obviously an issue. Perhaps the Minister would like to respond.
My Lords, I thank the noble Baroness for that accolade. I rise to speak to Amendment 170, which is a small contribution to perfecting Amendment 169. It struck me as rather strange that Amendment 152 has a reference to charities, but not Amendment 169. For charities, this is just as big an issue so I wanted to enlarge slightly on that. This is a huge change that is overtaking charities. How they are preparing for it and the issues that need to be addressed are of great concern to them. The Institute of Fundraising recently surveyed more than 300 charities of all sizes on how they are preparing for the GDPR, and used the results to identify a number of areas where it thought support was needed.
The majority of charities, especially the larger ones, are aware of the GDPR and are taking action to get ready for May 2018, but the survey also highlighted areas where charities need additional advice, guidance and support. Some 22% of the charities surveyed said that they have yet to do anything to prepare for the changes, and 95% of those yet to take any preparatory action are the smaller charities. Some 72% said that there was a lack of clear available guidance. Almost half the charities report that they do not feel they have the right level of skills or expertise on data protection, and 38% report that they have found limits in their administration or database systems, or the costs of upgrading these, a real challenge. That mirrors very much what small businesses are finding as well. Bodies such as the IoF have been working to increase the amount of support and guidance on offer. The IoF runs a number of events, but more support is needed.
A targeted intervention is needed to help charities as much as it is needed for small business. This needs to be supported by government—perhaps through a temporary extension of the existing subsidised fundraising skills training, including an additional training programme on how to comply with GDPR changes; or a targeted support scheme, directly funded or working with other funding bodies and foundations, to help the smallest charities most in need to upgrade their administrative or database systems. Charities welcome the recently announced telephone service from the ICO offering help on the GDPR, which they can access, but it is accessible only to organisations employing under 250 people and it is only a telephone service.
There are issues there, and I hope the Minister will be able to respond, in particular by recognising that charities are very much part of the infrastructure of smaller organisations that will certainly need support in complying with the GDPR.
My Lords, I broadly support what these interesting amendments are trying to do. I declare my interest as a member of the board of the Centre for Acceleration of Social Technology. Substantially, what it does is advise normally larger charities on how to best take advantage of digital to solve some of their problems.
Clearly, I support ensuring that small businesses, small charities and parish councils, as mentioned, are advised of the implications of this Act. If she has the opportunity, I ask the noble Baroness, Lady Neville-Rolfe, to explain why she chose staff size as the measure. I accept that hers is a probing amendment and she may think there are reasons not to go with staff size. The cliché is that when Instagram was sold to Facebook for $1 billion it had 13 members of staff. That would not come within the scope of the amendment, but there are plenty of digital businesses that can achieve an awful lot with very few staff. As it stands, my worry is that this opens up a huge loophole.
My Lords, I will be brief on this group but I have two points to make. One is a question in respect of Amendment 51, where I congratulate the insurance industry on its lobbying. Within proposed new paragraph 15A(1)(b) it says,
“if … the controller has taken reasonable steps to obtain the data subject’s consent”.
Can the Minister clarify, or give some sense of, what “reasonable” means in this context? It would help us to understand whether that means an email, which might go into spam and not be read. Would there be a letter or a phone call to try to obtain consent? What could we as citizens reasonably expect insurance companies to do to get our consent?
Assuming that we do not have a stand part debate on Clause 4, how are the Government getting on with thinking about simplifying the language of the Bill? The noble Baroness, Lady Lane-Fox, is temporarily not in her place, but she made some good points at Second Reading about simplification. Clause 4 is quite confusing to read. It is possible to understand it once you have read it a few times, but subsection (2) says, for example, that,
“the reference to a term’s meaning in the GDPR is to its meaning in the GDPR read with any provision of Chapter 2 which modifies the term’s meaning for the purposes of the GDPR”.
That sort of sentence is quite difficult for most people to understand, and I will be interested to hear of the Government’s progress.
My Lords, I thank the noble Baroness for introducing these amendments in not too heavy a style, but this is an opportunity to ask a couple of questions in relation to them. We may have had since 20 October to digest them; nevertheless, that does not make them any more digestible. We will be able to see how they really operate only once they are incorporated into the Bill. Perhaps we might have a look at how they operate on Report.
The Bill is clearly a work in progress, and this is an extraordinary number of amendments even at this stage. It begs the question as to whether the Government are still engaged in discussions with outside bodies. Personally, I welcome that there has been dialogue with the insurance industry—a very important industry for us. We obviously have to make sure that the consumer is protected while it carries out an important part of its business. I know that the industry has raised other matters relating to third parties and so on. There have also been matters raised by those in the financial services industry who are keen to ensure that fraud is prevented. Even though they are private organisations, they are also keen to ensure that they are caught under the umbrella of the exemptions in the Bill. Can the noble Baroness tell us a little about what further discussions are taking place? It is important that we make sure that when the Bill finally hits the deck, so to speak, it is right for all the different sectors that will be subject to it.
My Lords, I thank my noble friend Lord Knight and the noble Lord, Lord Clement-Jones, for raising points that I would otherwise have made. I endorse the points they made. It is important that those points are picked up, and I look forward to having the responses.
I had picked up that the Clause 4(2) definition of terms is probably a recital rather than a normative issue, and therefore my noble friend Lord Knight’s point is probably not as worrying as it might otherwise have been. But like him, I found that it was tending towards the Alice in Wonderland side. Subsection (1) says:
“Terms used in Chapter 2 and in the GDPR have the same meaning in Chapter 2 as they have in the GDPR”.
I sort of get that, but it seems slightly unnecessary to say that, unless there is something that we are not picking up. I may be asking a negative: “There’s nothing in here that we ought to be alerted to, is there?”. I do not expect a response, but that is what we are left with at the end of this debate.
I have one substantial point relating to government Amendment 8. In the descriptions we had—this was taken from the letter—this is a technical amendment to ensure that there is clarity and that the definition of health professional in Clause 183 applies to Part 2 of the Bill. I do not think that many noble Lords will have followed this through, but it happens to pick up on a point which we will come back to on a later amendment: the question of certain responsibilities and exceptions applying to health professionals. There was therefore a concern in the back of my mind about how these would have been defined.
My point is that the definition that appears in the Bill, and which is signposted by the way that this amendment lies, points us to a list of professionals but does not go back into what those professionals do. I had understood from the context within which this part of the Bill is framed that the purpose of having health professionals in that position was that they were the people of whom it could be said that they had a duty of care to their patients. They could therefore by definition, and by the fact of the posts they occupied, have an additional responsibility attached to them through the nature of their qualifications and work. We are not getting that out of this government amendment. Can the Minister explain why polishing that amendment does or does not affect how that approach might be taken?
My Lords, I suspect that if you scratched half the Members of this House, they would have to declare an interest. I will just add a bit of non-Oxford variety as chair of the council of Queen Mary University of London. I express Front Bench support for my noble friend’s amendment and that of the noble Baroness, Lady Royall.
There is no doubt about the interaction of article 6 and the unfortunate inclusion of universities in the Freedom of Information Act definition, and there is no reason that I can see—we have heard about the alumni issues and the importance of fundraising to universities—why universities should not be put on all fours with charities, which can take advantage of the exemption in article 6. I very much hope that the Minister, who was nodding vigorously throughout most of the speeches, is prepared to state that he will come forward with an amendment, or accept this one, which would be gratefully received.
My Lords, perhaps I may say a word on behalf of the victims. I very much hope that we will be given the right to ask the college to cross our name off.
I very much enjoyed my time at Oxford. It took Oxford 37 years to cotton on to the idea that, having spent three years doing physics there, perhaps I was interested in physics and it might offer me something in continued involvement other than students being pestered into asking me for money twice a year. That is not a relationship; that is not a community; that is a one-way suck. It is a Dyson vacuum cleaner designed to hoover money in on the basis of creating some sort of obligation. It was a contract 40 years ago, for goodness’ sake: create something now or keep something going.
Fundamentally, I have very little sympathy with the idea—
My Lords, I shall also speak to Amendments 13, 15 and 21. It is slightly putting the cart before the horse to deal with Amendment 11 first. I will do so since it comes earlier in the order, but it covers a rather less general issue than the other amendments in the group.
Under the current Data Protection Act, controllers need a Schedule 2 legal basis to process personal data. Schedule 2 lists six main groupings and the controller has to select at least one from the list. If the controller does not have a legal basis for processing, then the controller cannot process the personal data. So it is surprising to discover that Clause 7, through the use of the word “includes”, can legitimise public sector processing of personal data on a ground not listed in the Bill. Such a ground might be, for instance, one that is not necessary for the controller’s statutory functions, and that is why I seek the Minister’s reassurance.
There is all the difference between setting out the bases in an exhaustive way and a non-exhaustive way. In looking at how the position is reached, one needs to look at Clause 7, which states:
“In Article 6(1) of the GDPR (lawfulness of processing), the reference in point (e) to processing of personal data that is necessary for the performance of a task carried out in the public interest or in the exercise of the controller’s official authority includes processing of personal data that is necessary for … administration of justice”,
and so on until (d),
“the exercise of a function of the Crown, a Minister of the Crown or a government department”.
It can be seen by comparison with Schedule 2 of the DPA that the only missing basis for processing is,
“the exercise of any other functions of a public nature exercised in the public interest by any person”.
The Explanatory Notes to Clause 7 state:
“Article 6(2) of the GDPR enables Member States to, amongst other things, set out more specific provisions in respect of Article 6(1)(c) and (e). This clause provides a non-exhaustive list of examples of processing under Article 6(1)(e)”.
That seems slightly paradoxical; it says it is going to be more specific but the Explanatory Notes say it is going to be non-exhaustive. The note continues:
“This includes processing of personal data that is necessary for the administration of justice”,
and so on. The section on Clause 7 concludes:
“The list is similar to that contained in paragraph 5 of Schedule 2 to the 1998 Act”.
So the intent, as explained in paragraphs 85 and 86 of the Explanatory Notes, is for the Government to use the flexibility set out in Article 6(1)(c) and (e) to take an exhaustive list of legal bases for the processing of personal data and actually create a non-exhaustive list of grounds that public bodies can use in Clause 7. How paradoxical can you get?
My Lords, this is a rather unusual occasion, in that normally noble Lords say that they are going to read very carefully what the Minister has said in Hansard. In this case, I am certainly going to have to read carefully what the noble Lord, Lord Clement-Jones, said, in Hansard. This is a complicated matter and I thought that I was following it and then thought that I did not—and then I thought that I did again. I shall set out what I think should be the answer to his remarks, but when we have both read Hansard we may have to get together again before Report on this matter.
I am glad that we have this opportunity to set out the approach taken in the Bill to processing that is in the public interest and the substantial public interest. Neither term is new; both appeared before 1998, as the noble Lord, Lord Stevenson, said, in the 1995 data protection directive, in the same sense as they are used in the GDPR and the Bill. That is to say, “substantial public interest” is one of the bases for the processing of special categories of personal data, and this is a stricter test than the public interest test that applies in connection with the processing of all categories of personal data. I think the noble Lord, Lord Clement-Jones, was wrong to suggest that the list provided in the 1998 Act in relation to public interest was genuinely exhaustive. As he said himself, the effect of paragraph 5(d) of Schedule 2 was to make that list non-exhaustive.
In keeping with the approach taken under the 1998 Act, the Government have not limited the public interest general processing condition. The list in Clause 7 is therefore non-exhaustive. This is intentional, and enables organisations which undertake legitimate public interest tasks to continue to process general data. Noble Lords may recall that the Government committed after Second Reading to update the Explanatory Notes to provide reassurance that Clause 7 should be interpreted broadly. Universities, museums and many other organisations carrying out important work for the benefit of society all rely on this processing condition. For much the same reason, “public interest” has not historically been defined in statute, recognising that the public interest will change over time and according to the circumstances of each situation. This flexibility is important, and I would not wish to start down the slippery slope of attempting to define it further.
The Government have, however, chosen to set out in Part 2 of Schedule 1 an exhaustive list of types of processing which they consider constitute, or could constitute, processing in the substantial public interest. That reflects the increased risks for data subjects when their sensitive personal data is processed. Again, this approach replicates that taken in the 1998 Act. Where the Government consider that processing meeting a condition in that part will sometimes, but not necessarily, meet the substantial public interest test, a sub-condition to that effect is included. This ensures that the exemption remains targeted on those processing activities in the substantial public interest. A similar approach was taken in secondary legislation made under the 1998 Act. The Government intend to keep Part 2 of Schedule 1 under review, and have proposed a regulation-making power in Clause 9 that would allow Schedule 1 to be updated or refined in a timelier manner than would be the case if primary legislation were required. We will of course return to that issue in a later group.
Amendment 15 seeks to make clear that the public interest test referred to in Clause 7 is not restricted by the substantial public interest test referred to in Part 2 of Schedule 1. Having described the purposes of both these elements of the Bill, I hope that noble Lords can see that these are two separate tests. The different wording used would mean that these would be interpreted as different tests, and there is no need to amend the Bill to clarify that further.
Amendment 154 would require the Information Commissioner to develop a code of practice in relation to the processing of personal data in the public interest and substantial public interest. As we have already touched on, the Information Commissioner is developing relevant guidance to support the implementation of the new data protection framework. Should there later prove a need to formalise this guidance as a code of practice, Clause 124 provides the Secretary of State with the power to direct the Information Commissioner to make such a code. There is no need to make further provision.
I hope that that explanation satisfies noble Lords for tonight, and I urge the noble Lord to withdraw his amendment. However, in this complicated matter, I am certainly prepared to meet noble Lords to discuss this further, if they so require.
With this amendment, I feel somewhat caught between the noble Lord, Lord Patel, and a very hard place. Clearly, he wants flexibility in a public interest test, and I can well understand that. But there are issues to which we shall need to return. The idea of a specific code seems the way forward; the way forward is not by granting overmighty powers to the Government to change the definitions according to the circumstances. I think that that was the phrase that the Minister used—they wish to have that flexibility so that the public interest test could be varied according to circumstances. If there is a power to change, it has to be pretty circumscribed. Obviously, we will come back to that in a later group. In the meantime, I beg leave to withdraw the amendment.
Lords Chamber
My Lords, that provokes me to add something. I am not entirely clear whether we are talking about something that is too narrow within the GDPR, or whether it is a lack of a suitably wide derogation on the part of the Government as part of the Bill. For all the reasons that the two noble Lords have mentioned, it seems extraordinary that the beneficial activities that they are discussing are not included as exemptions, whether explicitly or implicitly. It may be that the Minister can give us greater comfort on that, but I am not clear what is giving rise to the problems. As we heard in earlier groupings, I am a fan of having something more explicit, if anything, in the Bill, which is particular perhaps to medical research and other forms of research in that sort of area. But it is not clear whether that is going to be permissible under the GDPR or whether the Government can actually derogate from it in those circumstances.
I shall respond to some of the points raised. First, on the research ethics committee, we established through legislation—and I remember the debates that we had—a national Research Ethics Committee to deal with all applications for biomedical research, but particularly research involving patient data and transfer of data. If I as a clinician want to do a trial, I have to apply to that committee with a full protocol as to what consent procedures and actual research there will be, and what will be the closing time of that consent. If I subsequently found the information that I had could lead to further research, or that the research that I had carried out had suddenly thrown up a next phase of research, I would have to go back to the committee and it would have to say, “Yes, that’s part of the original consent, which is satisfactory to progress with the further research”. It is a robust, nationally driven, independently chaired national ethics committee, apart from the local ethics committee that each trust will run. So the national ethics committee is the guardian.
Furthermore, there is a separate ethics committee for the 500,000 genomes project, run by the Wellcome Trust and other researchers. It deals specifically with that project: the consents it obtains, the information given at the time when the subject gives consent, and how the data can be used in future. The genomes project aims to sequence all 500,000 genomes, and to link that genome sequence data with the lifestyles people had and the diseases they developed, to identify the genes that we can subsequently use for future diagnosis and treatment—and to develop diagnostic tests that will provide early diagnosis of cancers, for instance. The future is in the diagnostic tests. Eventually we will find them for diseases which have not yet developed but which have a likelihood of developing. Those diagnostic tests will identify the early expression of a protein from a gene, so that we can find a treatment to suppress that expression well before the disease develops, rather than waiting until the cancer develops and then treating it.
All this is based on the data originally collected. At this stage, it is impossible to know where that research will lead—that is the history—apart from the clinical trials, which are much more specific and for which you get consent. I realise that there is a limit to how much the text of the Bill can deviate from the GDPR, unless it is dealing with specific issues for which the GDPR permits member states to provide derogations. I realise that, post exit, the UK will need an adequacy agreement, or some equivalent mutual recognition of data protection regimes between the UK and the EU. We need that for the transfer of data. For instance, the noble Baroness, Lady Neville-Jones, has talked about extremely rare diseases, which require the exchange of data across many countries because their incidence is low and no one country could possibly have enough information on that group of patients.
The research exemption does not undermine agreement on Clause 7—which is what the noble Lord, Lord Clement-Jones, was leading up to when he asked about the ethics committee. The noble Baroness, Lady Neville-Rolfe, suggested that medical research should be possible through the research exemption, but that exemption has to be drawn widely enough, and not so specific that it cannot encompass the wider exemptions needed. I hope that the Minister will manage that trick in an amendment which he might bring forward: one that is not restrictive, yet protects the patient’s personal interest.
There is a research exemption for processing special categories of data, including health data. The legal basis for this is through article 9 of the GDPR, referred to in Part 1 of Schedule 1 to the Bill. However, all processing of personal data also needs an article 6 legal basis: research is not exempt from needing this. I am arguing today that research needs that basis, defined in wide enough terms. For processing special categories, you need both an article 6 and an article 9 legal basis. We need to have provision for both in the Bill. One of the article 6 legal bases is consent, and I have explained why this is not suitable for much research. The other feasible route for universities and other public bodies processing personal data for research is public interest. This is why it is so important to be clear on what processing can use this legal basis.
There was serious concern about the likely impact of the GDPR on research as it was being drafted. However, this was successfully resolved and it provides the necessary flexibility for the UK to create a data protection regime that is supportive of research in the public interest. The Government, and other UK organisations, worked hard to make sure that this was the case. The provision is there: it is now for the Government to act on it. It is also important to seek an adequacy agreement post Brexit: we will have to have one. It will be vital to consider the need to retain, post Brexit, cross-border transfers of data for research. I give the same example of rare diseases as the noble Baroness, Lady Neville-Jones, used. The Government have recognised the value of retaining a data protection regime consistent with the EU, but the research community would welcome knowing whether it will seek a status of adequacy as a third country or an equivalent agreement.
The plea I make is that unless we include a provision, with the necessary exemptions written into the Bill in the required form, we will not be able to carry out much of this research. A question was asked about the life sciences industrial strategy. It is a key pillar of the Government’s industrial strategy Green Paper. It relies on the data that the NHS collects and the data that the science community collects, and on marrying up the two to produce treatments and develop technologies in which we lead the world. If we are not able to do this, the whole thing will be unworkable.
My Lords, the Minister gave the impression that medical research of the type described by the noble Lord, Lord Patel, was encompassed, or allowable, by the GDPR. Can he give chapter and verse on where in the mixture of article 6 and article 9 that occurs? That would be extremely helpful. I understand that obviously the Minister was also agreeing to look further in case those articles did not cover the situation, but it would be good to know which articles he is referring to.
I re-emphasise to the noble Lord that we think these tasks are in the public interest. However, I understand his desire for even more clarity than that. It would be sensible if I wrote to him and to other noble Lords taking part in the debate. I want to make sure that I get the legal basis right rather than just doing it on the hoof, so I agree to write to him and to all noble Lords who have spoken tonight. Again, as I say, we will work towards what I hope will be a more acceptable solution for everyone. Fundamentally, we do not want to impede medical research that is for the public good.