Data Protection Bill [HL]

Lord Clement-Jones Excerpts
Monday 13th November 2017

Lords Chamber

Lord Clement-Jones (LD)

My Lords, the noble Lord, Lord Stevenson, has raised some important points, which refer back to our labour over the Digital Economy Bill. One particular point occurs to me in relation to the questions that he asked: have we made any progress towards anonymisation in age verification, as we debated at some length during the passage of that Bill? As I recall, the Government’s point was that they did not think it necessary to include anything in the Bill because anonymisation would happen. The Minister should engage with that important issue. The other point that could be made is about whether the Government believe that the amendment of the noble Lord, Lord Lucas, would help us towards that goal.

Lord Ashton of Hyde

My Lords, as we have heard, Part 3 of the Digital Economy Act 2017 requires online providers of pornographic material on a commercial basis to institute appropriate age verification controls. My noble friend’s Amendment 71ZA seeks to allow the age verification regulator to publish regulations relating to the protection of personal data processed for that purpose. The amendment aims to provide protection, choice and trust in respect of personal data processed for the purpose of compliance with Part 3 of the 2017 Act.

I think that I understand my noble friend’s aim. It is a concern I remember well from this House’s extensive deliberations on what became the Digital Economy Act, as referred to earlier. We now have before us a Bill for a new legal framework which is designed to ensure that protection, choice and trust are embedded in all data-processing practices, with stronger sanctions for malpractice. This partly answers my noble friend Lord Elton, who asked what we would produce to deal with this problem.

Personal data, particularly those concerning a data subject’s sex life or sexual orientation, as may be the case here, will be subject to rigorous new protections. For the reasons I have just mentioned, the Government do not consider it necessary to provide for separate standards relating exclusively and narrowly to age verification in the context of accessing online pornography. That is not to say that there will be a lack of guidance to firms subject to Part 3 of the 2017 Act on how best to implement their obligations. In particular, the age verification regulator is required to publish guidance about the types of arrangements for making pornographic material available that the regulator will treat as compliant.

As noble Lords will be aware, the British Board of Film Classification is the intended age verification regulator. I reassure noble Lords that in its preparations for taking on the role of age verification regulator, the BBFC has indicated that it will ensure that the guidance it issues promotes the highest data protection standards. As part of this, it has held regular discussions with the Information Commissioner’s Office and it will flag up any potential data protection concerns to that office. It will then be for the Information Commissioner to determine whether action or further investigation is needed, as is her role.

The noble Lord, Lord Clement-Jones, talked about anonymisation and the noble Lord, Lord Stevenson, asked for an update on where we actually are. I remember the discussions on anonymisation, which is an important issue. I do not have the details of exactly where we have got to on that subject—so, if it is okay, I will write to the noble Lord on that.

I can update the noble Lord, Lord Stevenson, to a certain extent. As I just said, the BBFC is in discussion with the Information Commissioner’s Office to ensure that best practice is observed. Age verification controls are already in place in other areas of internet content access; for example, licensed gambling sites are required to have them in place. They are also in place for UK-based video-on-demand services. The BBFC will be able to learn from how these operate, to ensure that effective systems are created—but the age verification regulator will not be endorsing a list of age verification technology providers. Rather, the regulator will be responsible for setting guidance and standards on robust age verification checks.

We continue to work with the BBFC in its engagement with the industry to establish the best technological solutions, which must be compliant with data protection law. We are aware that such solutions exist, focusing rightly on verification rather than identification—which I think was the point made by the noble Lord, Lord Clement-Jones. If I can provide any more detail in the follow-up letter that I send after each day of Committee, I will do so—but that is the general background.
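
As an illustration of that verification-rather-than-identification point, here is a minimal Python sketch. The names and the flow are invented for the example and do not describe the BBFC's or any provider's actual scheme; the idea is simply that the relying site receives a bare over-18 attribute, never the underlying identity evidence.

```python
import secrets
from dataclasses import dataclass
from datetime import date

@dataclass
class AgeAttestation:
    """What the relying site receives: a yes/no claim and an opaque token.
    Name, date of birth and document details are deliberately absent."""
    over_18: bool
    token: str  # opaque audit reference, not linkable to identity

def verify_age(document_dob_iso: str, threshold: int = 18) -> AgeAttestation:
    """Hypothetical third-party check: consumes identity evidence and
    returns only the minimal attribute the relying site needs."""
    born = date.fromisoformat(document_dob_iso)
    today = date.today()
    age = today.year - born.year - ((today.month, today.day) < (born.month, born.day))
    # The identity evidence is used here and then discarded; only the
    # boolean attribute crosses the boundary to the relying site.
    return AgeAttestation(over_18=age >= threshold, token=secrets.token_hex(8))

# The site sees only the attestation, never the document:
attestation = verify_age("1990-05-01")
print(attestation.over_18, attestation.token)
```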

Online age verification is a rapidly growing area and there will be much innovation and development in this field. Industry is rightly putting data privacy and security at the forefront of its design, and this will be underscored by the new requirements under the GDPR. In view of that explanation, I hope that my noble friend will be able to withdraw his amendment.

--- Later in debate ---
Moved by
74: Clause 13, page 7, line 11, at end insert—
“( ) A decision is “based solely on automated processing” for the purposes of this section if, in relation to a data subject, there is no meaningful input by a natural person in the decision-making process.”
Lord Clement-Jones

My Lords, in moving Amendment 74, I will also speak to Amendments 74A, 75, 77, 119, 133A, 134 and 183—I think I have encompassed them all; at least I hope I have. In a way this is an extension of the very interesting debate that we heard on Amendment 71A, but further down the pipeline, so to speak. This group contains a range of possible and desirable changes to the Bill relating to artificial intelligence and the use of algorithms.

Data has been described, not wholly accurately, as the oil of artificial intelligence. With the advent of AI and its active application to datasets, it is vital that we strike the right balance in protecting privacy and the use of personal data. Indeed, the Minister spoke about that balance in that debate. Above all, we need to be increasingly aware of unintended discrimination where an element of a decision involves an algorithm. If a particular system learns from a dataset that contains biases, such as associating female names with family roles and male names with careers, it is likely to reproduce them in its decisions. One way of helping to identify and rectify bias is to ensure that such algorithms are transparent, so that it is possible to see not only what data is being used but the steps being taken to process that data in coming to a particular conclusion.
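
The gendered-association example can be made concrete with a short, hedged sketch. The word vectors below are invented for illustration; a real system would learn them from a large text corpus, which is where the bias originates, but the cosine-similarity test mirrors how such associations are commonly measured.

```python
import numpy as np

# Toy word vectors, invented for this example; a real system would load
# embeddings learned from a large corpus, which is where bias creeps in.
embedding = {
    "mary":   np.array([0.9, 0.1, 0.2]),
    "john":   np.array([0.1, 0.9, 0.2]),
    "family": np.array([0.8, 0.2, 0.1]),
    "career": np.array([0.2, 0.8, 0.1]),
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity: how closely two word vectors point the same way."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Association test: does a name sit closer to "career" or to "family"?
for name in ("mary", "john"):
    gap = (cosine(embedding[name], embedding["career"])
           - cosine(embedding[name], embedding["family"]))
    print(f"{name}: career-vs-family association gap = {gap:+.2f}")
# A systematic gap by gendered name is the learned bias described above;
# any downstream decision built on these vectors inherits it.
```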

In all this, there is the major risk that we do not challenge computer-aided decision-making. To some extent, this is recognised by article 22 of the GDPR, which at least gives the right of explanation where there is fully automated decision-taking, and it is true that in certain respects Clause 13 amplifies article 22. For instance, article 22 does not state what safeguards need to be in place; it talks just about proper safeguards. In the Bill, it is proposed that, after a decision has been made, the individual has to be informed of the outcome, which is better than what the GDPR currently offers. It also states that data subjects should have the right to ask that the decision be reconsidered or that the decision not be made by an algorithm. There is also the requirement, in certain circumstances, for companies and public bodies to undertake a data protection impact assessment under Clause 62. There are also new provisions in the GDPR for codes of conduct and certification, so that if an industry is moving forward on artificial intelligence in an application, the ICO can certify the approach that the industry is taking on fairness in automated decision-taking.

--- Later in debate ---
Lord Ashton of Hyde

My Lords, I thank the noble Lord, Lord Clement-Jones, who introduced this interesting debate; of course, I recognise his authority and his newfound expertise in artificial intelligence from being chairman of the Select Committee on Artificial Intelligence. I am sure that he is an expert anyway, but it will only increase his expertise. I thank other noble Lords for their contributions, which raise important issues about the increasing use of automated decision-making, particularly in the online world. It is a broad category, including everything from personalised music playlists to quotes for home insurance and far beyond that.

The noble Lord, Lord Stevenson, before speaking to his amendments, warned about some of the things that we need to think about. He drew a contrast with human embryology and fertility research and the HFEA, which is not an exact parallel because, of course, the genie is already out of the bottle in this case, whereas there things were prevented from happening at least until the matter had been debated. But I take what the noble Lord said and agree with the issues that he raised. I think that we will discuss in a later group some of the ideas about how we debate those broader issues.

The noble Baroness, Lady Jones, talked about how she hoped that the repressive bits would be removed from the Bill. I did not completely understand her point, as this Bill is actually about giving data subjects increased rights, under both the GDPR and the law enforcement directive. The GDPR will take direct effect, but we are also applying those GDPR rights to other areas not subject to EU jurisdiction. I shall come on to her amendment on the Human Rights Act in a minute—but we agree with her that human beings should be involved in significant decisions. That is exactly what the Bill tries to do. We realise that data subjects should have rights when they are confronted by significant decisions made about them by machines.

The Bill recognises the need to ensure that such processing is correctly regulated. That is why it includes safeguards, such as the right to be informed of automated processing as soon as reasonably practicable and the right to challenge an automated decision made by the controller. The noble Lord, Lord Clement-Jones, alluded to some of these things. We believe that Clauses 13, 47, 48, 94 and 95 provide adequate and proportionate safeguards to protect data subjects of all ages, adults as well as children. I can give some more examples, because it is important to recognise data rights. For example, Clause 47 is clear that individuals should not be subject to a decision based solely on automated processing if that decision significantly and adversely impacts on them, either legally or otherwise, unless required by law. If that decision is required by law, Clause 48 specifies the safeguards that controllers should apply to ensure the impact on the individual is minimised. Critically, that includes informing the data subject that a decision has been taken and providing them with 21 days within which to ask the controller to reconsider the decision or retake the decision with human intervention.

I turn to Amendments 74, 134 and 136, proposed by the noble Lord, Lord Clement-Jones, which seek to insert into Parts 2 and 3 of the Bill a definition of the term,

“based solely on automated processing”,

to provide that human intervention must be meaningful. I do not disagree with the meaning of the phrase put forward by the noble Lord. Indeed, I think that that is precisely the meaning that the phrase already has. The test here is the type of processing on which a decision having legal or significant effects is based. Mere human presence or token human involvement will not be enough. The purported human involvement has to be meaningful; it has to address the basis for the decision. If a decision was based solely on automated processing, it could not have meaningful input by a natural person. On that basis, I am confident that there is no need to amend the Bill to clarify this definition further.

In relation to Amendments 74A and 133A, the intention here seems to be to prevent any automated decision-making that impacts on a child. By and large, the provisions of the GDPR and of the Bill, Clause 8 aside, apply equally to all data subjects, regardless of age. We are not persuaded of the case for different treatment here. The important point is that the stringent safeguards in the Bill apply equally to all ages. It seems odd to suggest that the NHS could, at some future point, use automated decision-making, with appropriate safeguards, to decide on the eligibility for a particular vaccine—

Lord Clement-Jones

My Lords, I hesitate to interrupt the Minister, but it is written down in the recital that such a measure,

“should not concern a child”.

The whole of that recital is to do with automated processing, as it is called in the recital. The interpretation of that recital is going to be rather important.

Lord Ashton of Hyde

My Lords, I was coming to recital 71. In the example I gave, it seems odd to suggest that the NHS could at some future point use automated decision-making with appropriate safeguards to decide on the eligibility for a particular vaccine of an 82 year-old, but not a two year-old.

The noble Lord referred to the rather odd wording of recital 71. On this point, we agree with the Article 29 working party—the group of European regulators—that it should be read as discouraging, as a matter of best practice, automated decision-making with significant effects on children. However, as I have already said, there can and will be cases where it is appropriate, and the Bill rightly makes provision for those.

Lord Clement-Jones

Would the Minister like to give chapter and verse on how that distinction is made?

Lord Ashton of Hyde

I think that “chapter and verse” implies “written”—and I will certainly do that because it is important to write to all noble Lords who have participated in this debate. As we have found in many of these areas, we need to get these things right. If I am to provide clarification, I will want to check—so I will take that back.

Lord Clement-Jones

I apologise for interrupting again. This is a bit like a dialogue, in a funny sort of way. If the Minister’s notes do not refer to the Article 29 working party, and whether or not we will continue to take guidance from it, could he include that in his letter as well?

Lord Ashton of Hyde

I will. I had some inspiration from elsewhere on that very subject—but it was then withdrawn, so I will take up the offer to write on that. However, I take the noble Lord’s point.

We do not think that Amendment 75 would work. It seeks to prevent any decision being taken on the basis of automated decision-making where the decision would “engage” the rights of the data subject under the Human Rights Act. Arguably, such a provision would wholly negate the provisions in respect of automated decision-making as it would be possible to argue that any decision based on automated decision-making at the very least engaged the data subject’s right to have their private life respected under Article 8 of the European Convention on Human Rights, even if it was entirely lawful. All decisions relating to the processing of personal data engage an individual’s human rights, so it would not be appropriate to exclude automated decisions on this basis. The purpose of the Bill is to ensure that we reflect processing in the digital age—and that includes automated processing. This will often be a legitimate form of processing, but it is right that the Bill should recognise the additional sensitivities that surround it. There must be sufficient checks and balances and the Bill achieves this in Clauses 13 and 48 by ensuring appropriate notification requirements and the right to have a decision reassessed by non-automated means.

--- Later in debate ---
Lord Lucas

My Lords, I rather hope that the Minister has not been able to persuade noble Lords opposite. Certainly, I have not felt myself persuaded. First, on the point about “solely”: in recruiting these days, when big companies need to reduce a couple of thousand applications to 100, the general practice is that you put everything into an automated process—you do not really know how it works—get a set of scores at the end and decide where the boundary lies according to how much time you have to interview people. Therefore, there is human intervention—of course there is. You are looking at the output and making the decision about who gets interviewed and who does not. That is a human decision, but it is based on the data coming out of the algorithm without any understanding of the algorithm. It is easy for an algorithm to be racist. I just googled “pictures of Europeans”. You get a page of black faces. Somewhere in the Google algorithm, a bit of compensation is going on. With a big algorithm like that, they have not checked what the result of that search would be, but it comes out that way. At various times in the past, it has been equally possible to carry out searches that were similarly off-beam with regard to other groups in society.

When you compile an algorithm to work with applications, you start off, perhaps, by looking at, “Who succeeds in my company now? What are their characteristics?”. Then you go through and you say, “You are not allowed to look at whether the person is a man or a woman, or black or white”, but perhaps you are measuring other things that vary with those characteristics and which you have not noticed, or some combinations. An AI algorithm can be entirely unmappable. It is just a learning algorithm; there is no mental process that a human can track. It just learns from what is there. It says, “Give me a lot of data about your employees and how successful they are and I will find you people like that”.
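
A hedged sketch of that proxy effect, using synthetic data rather than any real recruitment system: the protected attribute is withheld from the model, yet a correlated stand-in feature lets the model reproduce the bias baked into the historical decisions it learns from.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000

# Synthetic applicants: a protected attribute the model never sees...
protected = rng.integers(0, 2, n)
# ...a feature that happens to correlate with it (a hobby, a postcode),
# and genuine skill.
proxy = protected + rng.normal(0, 0.5, n)
skill = rng.normal(0, 1, n)

# Historical hiring decisions that were themselves biased.
hired = (skill + 1.5 * protected + rng.normal(0, 0.5, n)) > 1

# Train only on the features the company believes are "neutral".
X = np.column_stack([proxy, skill])
model = LogisticRegression().fit(X, hired)

# The model's scores still split by the attribute it never saw.
scores = model.predict_proba(X)[:, 1]
for g in (0, 1):
    print(f"group {g}: mean score {scores[protected == g].mean():.2f}")
```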

At the end of the day, you need to be able to test these algorithms. The Minister may remember that I posed that challenge in a previous amendment to a previous Bill. I was told then that a report was coming out from the Royal Society that would look at how we should set about testing algorithms. I have not seen that report, but has the Minister seen it? Does he know when it is coming out or what lines of thinking the Royal Society is developing? We absolutely need something practical so that when I apply for a job and I think I have been hard done by, I have some way to do something about it. Somebody has to be able to test the algorithm. As a private individual, how do you get that done? How do you test a recruitment algorithm? Are you allowed to invent 100 fictitious characters to put through the system, or should the state take an interest in this and audit it?
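
The paired audit asked about here could, in outline, look like the sketch below. The scoring function is an invented stand-in for the opaque system under test, not any real product; the method is simply to submit matched fictitious applicants that differ in a single attribute and measure the score gap.

```python
import statistics

def score_applicant(applicant: dict) -> float:
    """Stand-in for the opaque model under audit; in a real audit this
    would be the recruiter's system, queried as a black box."""
    # Invented scoring rule with a hidden bias, for demonstration only.
    return applicant["experience"] + (0.5 if applicant["name_style"] == "A" else 0.0)

def paired_audit(n_pairs: int = 100) -> float:
    """Submit matched pairs of fictitious CVs differing only in the
    tested attribute; report the average score gap between them."""
    gaps = []
    for i in range(n_pairs):
        base = {"experience": (i % 10) + 1}
        cv_a = {**base, "name_style": "A"}
        cv_b = {**base, "name_style": "B"}
        gaps.append(score_applicant(cv_a) - score_applicant(cv_b))
    return statistics.mean(gaps)

print(f"average score gap between matched pairs: {paired_audit():+.2f}")
# A persistent non-zero gap on otherwise identical CVs is evidence that
# the system treats the tested attribute, or a proxy for it, as a signal.
```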

We have made so much effort in my lifetime, and we have got so much better at being equal—of course, we still have a fair way to go—continually doing our best to make things better with regard to discrimination. It is therefore important that we do not allow ourselves to go backwards because we do not understand what is going on inside a computer. So, absolutely, there has to be significant human involvement for it to be regarded as a human decision. Generally, where there is not, there has to be a way to get a human challenge—a proper human review—not just the response, “We are sure that the system worked right”. There has to be a way round that is not itself discriminatory, in which something is looked at to see whether it is working and whether it has gone right. Otherwise, we should not allow automation into bits of the system that affect the way we interact with each other in society. Therefore, it is important that we pursue this, and I very much hope that noble Lords opposite will give us another chance to look at this area when we come to Report.

Lord Clement-Jones

My Lords, I thank all noble Lords who spoke in the debate. It has been wide-ranging but extremely interesting, as evidenced by the fact that at one point three members of the Artificial Intelligence Select Committee were speaking. That demonstrates that currently we live, eat and breathe artificial intelligence, algorithms and all matters related to them. It is a highly engaged committee. Of course, whatever I put forward from these Benches is not—yet—part of the recommendations of that committee, which, no doubt, will report in due course in March.

--- Later in debate ---
Lord Ashton of Hyde

I highlight that we do not disagree with that. I will study carefully what my noble friend Lord Lucas said. We agree that it is important that privacy rights continue to be protected, and we do not expect data subjects to have their lives run by computer alone. That is exactly why the Bill creates safeguards: to make sure that individuals can request not to be the subject of decisions made automatically if they might have a significant legal effect on them. They are also allowed to demand that a human being participate meaningfully in those decisions that affect them. I will look at what my noble friend said and include that in my write-round. However, as I said, we do not disagree with that. A state of affairs in which our lives are run unaccountably by computers is exactly what the Bill is trying to prevent.

Lord Clement-Jones

My Lords, I would not want to give that impression. None of us are gloom merchants in this respect. We want to be able to harness the new technology in a way that is appropriate and beneficial for us, and we do that by setting the right framework in data protection, ethical behaviour and so on.

I am grateful to the Minister for engaging in the way he has on the amendments. It is extremely important to probe each of those areas of Clauses 13, 47 and 48, for there are lacunae. The Minister talked about the right to be informed and the right to challenge, and so on, and said that these provided adequate and proportionate safeguards, but the right to explanation is not absolutely enshrined, even though it is mentioned in the GDPR. So in some areas we will probe further on that.

Lord Ashton of Hyde

My Lords, if it is mentioned in the GDPR, then it is there.

Lord Clement-Jones

Yes, my Lords, but it is in the recital, so I think we come back again to whether the recitals form part of the Bill. That is what I believe to be the case. I may have to write to the Minister. Who knows? Anything is possible.

One of the key points—raised by the noble Lord, Lord Lucas—is the question of human intervention being meaningful. To me, “solely”, in the ordinary meaning of the word, does not mean that human intervention is there at all, and that is a real worry. The writ of the Article 29 working party may run until Brexit but, frankly, after Brexit we will not be part of the Article 29 working party, so what interpretation of the GDPR will we have when it is incorporated into UK domestic law? If those rights are not to be granted, the interpretation of “solely”, with the absolute requirement of human involvement, needs to be on the face of the Bill.

As far as recital 71 is concerned, I think that the Minister will write with his interpretation and about the impact of the Article 29 working party and whether we incorporate its views. If the Government are not prepared to accept that the rulings of the European Court of Justice will be effective in UK law after Brexit, I can only assume that the Article 29 working party will have no more impact. Therefore, there is a real issue there.

I take the Minister’s point about safeguards under the Equality Act. That is important and there are other aspects that we will no doubt wish to look at very carefully. I was not overly convinced by his answer to Amendment 75, spoken to by the noble Baroness, Lady Jones, and my noble friend Lady Hamwee, because he said, “Well, it’s all there anyway”. I do not think we would have had to incorporate those words unless we felt there was a gap in the way the clause operated.

I will not take the arguments any further but I am not quite as optimistic as the Minister about the impact of that part of the Bill, and we may well come back to various forms of this subject on Report. However, it would be helpful if the Minister indicated the guidance the ICO is adopting in respect of the issue raised in Amendment 153A. When he writes, perhaps he could direct us to those aspects of the guidance that will be applicable in order to help us decide whether to come back to Amendment 153A. In the meantime, I beg leave to withdraw.

Amendment 74 withdrawn.
--- Later in debate ---
Lord Stevenson of Balmacara

My Lords, it always used to be said that reaching the end of your Lordships’ day was the graveyard slot. This is a bit of a vice slot. You are tempted by the growing number of people coming in to do a bit of grandstanding and to tell them what they are missing in this wonderful Bill that we are discussing. You are also conscious that the dinner hour approaches—and I blame the noble Baroness, Lady Hamwee, for that. All her talk of dining in L’Algorithme, where she almost certainly had a soup, a main course and a pudding, means that it is almost impossible to concentrate for the six minutes that we will be allowed—with perhaps a few minutes more if we can be indulged—to finish this very important group. It has only one amendment in it. If noble Lords did not know that, I bet that has cheered them up. I am happy to say that it is also a réchauffage, because we have already discussed most of the main issues, so I will be very brief in moving it.

It is quite clear from our discussion on the previous group that we need an ethics body to look at the issues that we were talking about, either explicitly or implicitly, in our debates on the previous three or four groups, and to look also at moral and other issues relating to the work on data, data protection, automation and robotics, and everything else that is going forward in this exciting field. The proposal in Amendment 78A comes with a terrific pedigree. It has been brought together by members of the Royal Society, the British Academy, the Royal Statistical Society and the Nuffield Trust. It is therefore untouchable in terms of its aspirations and its attempt to get to the heart of what should be in the contextual area around the new Bill.

I shall not go through the various points that we made in relation to people’s fears, but the key issue is trust. As I said on the previous group, if there is no trust in what is set up under the Bill, there will not be a buy-in by the general public. People will be concerned about it. The computer will be blamed for ills that are not down to it, in much the same way that earlier generations always blamed issues external to themselves for the way that their lives were being lived. Shakespeare’s Globe was built outside the city walls because it was felt that the terribly dangerous plays that were being put on there would upset the lieges. It is why penny dreadfuls were banned in the early part of the last century and why we had a fight about video nasties. It is that sort of approach and mentality that we want to get round.

There is good—substantial good—to be found in the work on automation and robotics that we are now seeing. We want to protect that but in the Bill we are missing a place and a space within which the big issues of the day can be looked at. Some of the issues that we have already talked about could easily fit with the idea of an independent data ethics advisory board to monitor further technical advances in the use and management of personal data and the implications of that. I recommend this proposal to the Committee and beg to move.

Lord Clement-Jones

My Lords, the noble Lord, Lord Stevenson, has been admirably brief in the pre-dinner minutes before us and I will be brief as well. This is a very important aspect of the debate and, despite the fact that we will be taking only a few minutes over it, I hope that we will return to it at a future date.

I note that the Conservative manifesto talked about a data ethics body, and this is not that far away from that concept. I think that the political world is coalescing around the idea of an ethics stewardship body of the kind recommended by the Royal Society and the British Academy. Whatever we call it—a rose by any other name—it will be of huge importance for the future, perhaps not as a regulator but certainly as a setter of principles and of an ethical context in which AI in particular moves forward.

The only sad thing about having to speed up the process today is that I am not able to take full advantage of the briefing put forward by the Royal Society. Crucially, it recommends two things. The first is:

“A set of high-level principles to help visibly shape all forms of data governance and ensure trustworthiness and trust in the management and use of data as a whole”.

The second is:

“A body to steward the evolution of the governance landscape as a whole. Such a stewardship body would be expected to conduct expert investigation into novel questions and issues, and enable new ways to anticipate the future consequences of today’s decisions”.

This is an idea whose time has come and I congratulate the noble Lords, Lord Stevenson and Lord Kennedy, on having tabled the amendment. I certainly think that this is the way forward.

--- Later in debate ---
Lord Ashton of Hyde

My Lords, the noble Lord, Lord Stevenson, has raised the important issue of data ethics. I am grateful to everyone who has spoken on this issue tonight and has agreed that it is very important. I assure noble Lords that we agree with that. We had a debate the other day on this issue and I am sure we will have many more in the future. The noble Lord, Lord Puttnam, has been to see me to talk about this, and I tried to convince him then that we were taking it seriously. By the sound of it, I am not sure that I completely succeeded, but we are. We understand the points he makes, although I am possibly not as gloomy about things as he is.

We are fortunate in the UK to have the widely respected Information Commissioner to provide expert advice on data protection issues—I accept that that advice is just on data protection issues—but we recognise the need for further credible and expert advice on the broader issue of the ethical use of data. That is exactly why we committed to setting up an expert advisory data ethics body in the 2017 manifesto, which, I am glad to hear, the noble Lord, Lord Clement-Jones, read carefully.

Lord Clement-Jones

We like to hold the Government to their manifesto commitments occasionally.

Lord Ashton of Hyde

Tonight the noble Lord can do so, because the Secretary of State is leading on this important matter. She is as committed as I am to ensuring that such a body is set up shortly. She has been consulting widely with civil society groups, industry and academia, some of which have been mentioned tonight, to refine the scope and functions of the body. It will work closely with the Information Commissioner and other regulators. As the noble Lords, Lord Clement-Jones and Lord Patel, mentioned, it will identify gaps in the regulatory landscape and provide Ministers with advice on addressing those gaps.

It is important that the new advisory body has a clearly defined role and a strong relationship with other bodies in this space, including the Information Commissioner. The Government’s proposals are for an advisory body which may have a broader remit than that suggested in the amendment. It will provide recommendations on the ethics of data use in the gaps in the regulatory landscape, as I have just said. For example, one fruitful area could be the ethics of exploiting aggregated anonymised datasets for social and commercial benefit, taking into account the importance of transparency and accountability. These aggregated datasets do not fall under the legal definition of personal data and would therefore be outside the scope of both the body proposed by the noble Lord and, I suspect, this Bill.

Technically, Amendment 78 needs to be more carefully drafted to avoid the risk of non-compliance with the GDPR and avoid conflict with the Information Commissioner. Article 51 of the GDPR requires each member state to appoint one or more independent public authorities to monitor and enforce the GDPR on its territory as a supervisory authority. Clause 113 makes the Information Commissioner the UK’s sole supervisory authority for data protection. The functions of any advisory data ethics body must not cut across the Information Commissioner’s performance of its functions under the GDPR.

The amendment proposes that the advisory board should,

“monitor further technical advances in the use and management of personal data”.

But one of the Information Commissioner’s key functions is to

“keep abreast of evolving technology”.

That is a potential conflict we must avoid. The noble Lord, Lord Patel, alluded to some of the conflicts.

Nevertheless, I agree with the importance that noble Lords place on the consideration of the ethics of data use, and I repeat that the Government are determined to make progress in this area. However, as I explained, I cannot agree to Amendment 78 tonight. Therefore, in the light of my explanation, I hope the noble Lord will feel able to withdraw it.