Monday 13th November 2017

Lords Chamber
Committee (3rd Day) (Continued)
17:43
Amendment 63A
Moved by
63A: Schedule 1, page 121, line 11, at end insert—
“Legal proceedings, legal advice, legal rights and judicial acts
(1) This condition is met if the processing—
(a) is necessary for the purpose of, or in connection with, any legal proceedings (including prospective legal proceedings),
(b) is necessary for the purpose of obtaining legal advice, or
(c) is otherwise necessary for the purposes of establishing, exercising or defending legal rights.
(2) This condition is met if the processing is necessary when a court is acting in its judicial capacity.”
Baroness Hamwee (LD)

My Lords, these amendments, in my name and those of the noble Baroness, Lady Neville-Rolfe, and the noble Lord, Lord Arbuthnot, may not be the most difficult or most significant that we will come to, but they are important and they deal with an issue brought to us by the Bar Council. I am aware that members of the Bar Council met officials and I believe that some of the matters throughout the Bill that they discussed were left with officials to consider—and, no doubt, with the Bar Council as well. I am not aware that this matter has been settled. The amendment would move the paragraph from Part 3 of this schedule to Part 2 and would extend the exemption, recognising practicalities. Briefly, the issue is the term “legal claims”.

The Bar Council makes the point that this phrase does not adequately describe all the work that lawyers and all parts of the profession undertake on behalf of their clients. There is a risk, therefore, that legal professionals will not be able to process special categories of personal data when undertaking legal advice relating to prosecutions, defences to prosecutions and criminal appeals, family and child protection proceedings and so on, or—noble Lords may think that this should not come within this category—legal advice relating to tax or a proposed transaction. The Bar Council is rightly concerned, of course, to ensure that legal professionals can process such data when undertaking activity which is squarely within the scope of their normal work but beyond what might be described by the narrow term “legal claims”. The amendment includes wording which is about to be put to the Committee in the form of government amendments that have already been debated, and it brings the legal activity listed in the new clause and the government amendments into Part 2 of Schedule 1. I beg to move.

Lord Griffiths of Burry Port (Lab)

My Lords, if the House will indulge me, having heard someone who described herself earlier as a foot soldier in her army of volunteers, I can now identify her as a beaver in the battalion of dam building. It seems that by broadening all that falls under the term “legal claims”, on the advice of the Bar Council, of course, some common sense is being applied here, and therefore we have no hesitation in joining our forces to the arguments we have heard so ably expressed.

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Ashton of Hyde) (Con)

My Lords, I am grateful to the noble Baroness for making her debut in the Committee stage and to the noble Lord for his comments. By way of background, because I find it quite complicated, it is worth reminding ourselves that article 9 of the GDPR provides processing conditions for special categories of data. In particular, the processing necessary for,

“the establishment, exercise or defence of legal claims”,

is permitted by article 9(2)(f). It is directly applicable and does not allow any discretion to derogate from it in any way. Article 10 of the GDPR, which relates to criminal convictions and offences data, takes a different approach. It requires member states to set out in their law conditions relating to the processing of said criminal convictions and offences data in order to enable many organisations to process it. Paragraph 26 of Schedule 1 therefore seeks to maintain the status quo by replicating in relation to criminal convictions data the processing condition for the special categories of personal data contained in article 9(2)(f).

Government Amendment 65, referred to by the noble Baroness, responds to a request we have had from stakeholders to anglicise the language currently used in that paragraph. The Government strongly agree about the importance of ensuring that data protection law does not accidentally undermine the proper conduct of legal proceedings, which is why we have made this provision. We submit that Amendments 63A and 64A are unnecessary. They are predicated on the false premise that government Amendment 65 in some way changes the scope of paragraph 26. It does not; it simply anglicises it. However, even if different wording were to be used in Amendment 63A from that used in Amendment 65, we are certain that the Commission would take a dim view of member states attempting to use article 9(2)(g), the substantial public interest processing condition, to expand article 9(2)(f) in the way that Amendment 63A proposes. In the light of that explanation, I would be grateful if in this case the noble Baroness would withdraw her amendment.

Baroness Hamwee

My Lords, I am still processing the compliment that has been paid to me. If I were standing for election, the noble Lord might find himself being quoted.

The Minister says that the amendment is unnecessary but then goes on to say that it is wrong. The main point is not so much the five or so lines of wording as what is required or precluded by the articles of the GDPR that he has quoted. I will not attempt to respond today because I could not do his arguments justice, but I suspect that others will try to do so. As I say, his officials have met with representatives of the Bar Council. I am sure that he will be happy for that dialogue to continue, and if necessary for it to extend to some of us who might come along and listen to what the officials are saying and give it a rubber stamp in an effort to progress the argument. There is a real concern about where this exemption should lie and how it should apply, so I will beg leave to withdraw the amendment, not because I am convinced but because there is still more discussion to be had.

Amendment 63A withdrawn.
Amendments 64 and 64A not moved.
Amendments 65 and 66
Moved by
65: Schedule 1, page 121, line 36, leave out from “processing” to end of line 38 and insert “—
(a) is necessary for the purpose of, or in connection with, any legal proceedings (including prospective legal proceedings),
(b) is necessary for the purpose of obtaining legal advice, or
(c) is otherwise necessary for the purposes of establishing, exercising or defending legal rights.”
66: Schedule 1, page 121, line 38, at end insert—
“26A This condition is met if the processing is necessary when a court is acting in its judicial capacity.”
Amendments 65 and 66 agreed.
Amendments 66A to 68 not moved.
Schedule 1, as amended, agreed.
Clause 10: Special categories of personal data etc: supplementary
Amendment 69
Moved by
69: Clause 10, page 6, line 12, leave out “supervision” and insert “responsibility”
Amendment 69 agreed.
Amendment 70 not moved.
Amendment 71
Moved by
71: Clause 10, page 6, line 16, leave out “this section” and insert “section 9”
Amendment 71 agreed.
Clause 10, as amended, agreed.
Amendment 71ZA
Moved by
71ZA: After Clause 10, insert the following new Clause—
“Regulations relating to the processing of personal data under Part 3 of the Digital Economy Act 2017
(1) Subject to the following provisions of this section, the age-verification regulator under section 16 of the Digital Economy Act 2017 may publish, and revise from time to time, regulations relating to the processing of personal data for purposes of age verification under types of arrangements for making pornographic material available not prohibited by section 14 of the Digital Economy Act 2017 in order to—
(a) provide appropriate protection, choice and trust in respect of personal data processed as part of any such arrangements; and
(b) create any technical obligations necessary to achieve the aims set out in subsection (1)(a).
(2) Once the regulator has prepared a draft of regulations it proposes to publish under subsection (1), it must submit the draft to the Secretary of State.
(3) When draft regulations are submitted to the Secretary of State under subsection (2), the Secretary of State must lay those draft regulations before both Houses of Parliament.
(4) If, within the period of 40 days beginning with the day on which draft regulations are laid before Parliament under subsection (3), either House resolves not to approve those draft regulations, the age-verification regulator must not publish those regulations in the form of that draft.
(5) If no such resolution is made within that period, the age-verification regulator must publish the regulations in the form of the draft laid before Parliament.
(6) But subsection (8) applies, instead of subsections (4) and (5), in a case falling within subsection (7).
(7) The cases falling within this subsection are those where draft regulations are laid before Parliament under subsection (3) and no previous regulations have been published under subsection (1) by the age-verification regulator.
(8) The regulator must not publish regulations in the form of the draft laid before Parliament unless the draft has been approved by a resolution of each House of Parliament.
(9) Subsection (4) does not prevent new draft regulations from being laid before Parliament.
(10) For the purposes of subsection (4)—
(a) where draft regulations are laid before each House of Parliament on different days, the later day is to be taken as the day on which it was laid before both Houses, and
(b) in reckoning any period of 40 days, no account is to be taken of any time during which Parliament is dissolved or prorogued or during which both Houses are adjourned for more than 4 days.
(11) References in this section to regulations and draft regulations include references to revised regulations and draft revised regulations.”
Lord Lucas (Con)

My Lords, I thank the Open Rights Group for pushing for this amendment, and particularly the Public Bill Office for getting it into a form that is acceptable in the Bill. This amendment addresses age verification for accessing pornography; currently there are no specific safeguards. However, sexual preferences are very sensitive, so this amendment allows—it does not compel—regulation at a higher level than is currently the case. The pornography industry has a woeful record of regular, large-scale breaches of data security and I do not believe that we should trust it. Even if we think we might trust the industry, we ought to be in a position where we do not have to. Our young people deserve proper protection regarding some very sensitive data.

I believe that we should take this seriously—my experience of young boys of 14 and 15 is that they are being exposed to high-grade pornography on a large scale, something that we may want to think about carefully in the context of their later relationships with women. Therefore, surely we should take the opportunity to give ourselves the powers to take action, should we decide that that is necessary, rather than having to come back to primary legislation with all the time and delay that that involves. We can anticipate this difficulty—we can see it coming down the tracks—so let us prepare for it. I beg to move.

Lord Stevenson of Balmacara (Lab)

My Lords, I am completely discombobulated because the noble Lord, Lord Lucas, has hidden himself on the far right-hand side of the Chamber, which makes it very difficult to engage with him—but I am sure we can get over it. He is also incredibly skilful to have got an amendment of this type into the Bill, because we were looking at this issue as well but could not find a way through. I would like a tutorial with him afterwards about how to get inside the interstices of this rather complicated legislative framework.

I must say that I have read his amendment several times and still cannot quite get it. I shall therefore use my usual strategy, which is to come in from an aerial height on a rarefied intellectual plane and ask the Minister to sum up in a way that I can understand—but under the radar I will ask for three things. First, we spent a lot of time on this in the Digital Economy Act. It is an important area and it is therefore important that we get it right. It would be quite helpful to the Committee, and would inform us for the future, if we could have a statement from the Dispatch Box or a letter saying where we have got to on age verification.

I hear rumours that the system envisaged at the time when the Digital Economy Act was going through has not been successful in practice. I think that we have heard from the Minister and others in earlier groups, in relation to similar topics, that in practice the envisaged age verification system is not being implemented as it stands. What is happening instead is that the work of cleaning up this area and making sure that age verification is in place is being carried out on a voluntary basis by those who run credit card and banking services for the companies involved, for whom a simple letter from the regulator, in this case the BBFC, is sufficient to cause them to cease processing any moneys to the sites concerned; as a result, that is what is happening in the pornography industry. That may or may not be a good thing—it is probably too early to say—but it was not the intention of the Bill, which was to have a system dependent on proper age verification and to make the process open and transparent. If it is different, we ought to know that before we start considering these areas.

My third point is that we would rely on Ministers to let us know whether it is necessary to return to this issue in the sense of the information that we hope will be provided. It is only at that level that we can respond carefully to what the noble Lord said—although I have no doubt that it is a very important area.

Lord Elton (Con)

My Lords, perhaps I may intervene between the two Front Benches. I wish to ask my noble friend on the Front Bench not to say—should he be tempted to—that this simply will not work, even if he explains why in great detail, but to say whether what the amendment tries to do is worth doing and, if so, how it can be achieved.

18:00
Lord Clement-Jones (LD)

My Lords, the noble Lord, Lord Stevenson, has raised some important points, which refer back to our labour over the Digital Economy Bill. One particular point occurs to me in relation to the questions that he asked: have we made any progress towards anonymisation in age verification, as we debated at some length during the passage of that Bill? As I recall, the Government’s point was that they did not think it necessary to include anything in the Bill because anonymisation would happen. The Minister should engage with that important issue. The other point that could be made is about whether the Government believe that the amendment of the noble Lord, Lord Lucas, would help us towards that goal.

Lord Ashton of Hyde

My Lords, as we have heard, Part 3 of the Digital Economy Act 2017 requires online providers of pornographic material on a commercial basis to institute appropriate age verification controls. My noble friend’s Amendment 71ZA seeks to allow the age verification regulator to publish regulations relating to the protection of personal data processed for that purpose. The amendment aims to provide protection, choice and trust in respect of personal data processed for the purpose of compliance with Part 3 of the 2017 Act.

I think that I understand my noble friend’s aim. It is a concern I remember well from this House’s extensive deliberations on what became the Digital Economy Act, as referred to earlier. We now have before us a Bill for a new legal framework which is designed to ensure that protection, choice and trust are embedded in all data-processing practices, with stronger sanctions for malpractice. This partly answers my noble friend Lord Elton, who asked what we would produce to deal with this problem.

Personal data, particularly those concerning a data subject’s sex life or sexual orientation, as may be the case here, will be subject to rigorous new protections. For the reasons I have just mentioned, the Government do not consider it necessary to provide for separate standards relating exclusively and narrowly to age verification in the context of accessing online pornography. That is not to say that there will be a lack of guidance to firms subject to Part 3 of the 2017 Act on how best to implement their obligations. In particular, the age verification regulator is required to publish guidance about the types of arrangements for making pornographic material available that the regulator will treat as compliant.

As noble Lords will be aware, the British Board of Film Classification is the intended age verification regulator. I reassure noble Lords that in its preparations for taking on the role of age verification regulator, the BBFC has indicated that it will ensure that the guidance it issues promotes the highest data protection standards. As part of this, it has held regular discussions with the Information Commissioner’s Office and it will flag up any potential data protection concerns to that office. It will then be for the Information Commissioner to determine whether action or further investigation is needed, as is her role.

The noble Lord, Lord Clement-Jones, talked about anonymisation and the noble Lord, Lord Stevenson, asked for an update on where we actually are. I remember the discussions on anonymisation, which is an important issue. I do not have the details of exactly where we have got to on that subject—so, if it is okay, I will write to the noble Lord on that.

I can update the noble Lord, Lord Stevenson, to a certain extent. As I just said, the BBFC is in discussion with the Information Commissioner’s Office to ensure that best practice is observed. Age verification controls are already in place in other areas of internet content access; for example, licensed gambling sites are required to have them. They are also in place for UK-based video-on-demand services. The BBFC will be able to learn from how these operate, to ensure that effective systems are created—but the age verification regulator will not be endorsing a list of age verification technology providers. Rather, the regulator will be responsible for setting guidance and standards on robust age verification checks.

We continue to work with the BBFC in its engagement with the industry to establish the best technological solutions, which must be compliant with data protection law. We are aware that such solutions exist, focusing rightly on verification rather than identification—which I think was the point made by the noble Lord, Lord Clement-Jones. If I can provide any more detail in the follow-up letter that I send after each day of Committee, I will do so—but that is the general background.

Online age verification is a rapidly growing area and there will be much innovation and development in this field. Industry is rightly putting data privacy and security at the forefront of its design, and this will be underscored by the new requirements under the GDPR. In view of that explanation, I hope that my noble friend will be able to withdraw his amendment.

Lord Lucas

My Lords, I am very grateful for my noble friend’s reply. With his leave, I will digest it overnight and tomorrow. I look forward to the letter that he promised—but if, at the end of that, I still think that there is something worth discussing, I hope that his ever-open door will be open even to that.

Lord Ashton of Hyde

I believe that during our previous day in Committee, I offered to meet my noble friend.

Lord Lucas

I am very grateful and I beg leave to withdraw the amendment.

Amendment 71ZA withdrawn.
Clause 11 agreed.
Amendment 71A
Moved by
71A: After Clause 11, insert the following new Clause—
“Right to be informed of the commercial exploitation of personal data
(1) Data controllers must notify data subjects of all intended or actual commercial exploitation of their personal data.
(2) The notification under subsection (1) must be made—
(a) at the time when the data subject consents to their personal data being processed by the data controller,
(b) before commercial exploitation takes place, if this is more than six months after the notification in paragraph (a), and
(c) every six months thereafter if the commercial exploitation is ongoing.
(3) Notifications under this section must include—
(a) the primary uses to which the personal data will be put, and
(b) the gross revenues the data controller expects to receive through the exploitation of that personal data.”
Lord Stevenson of Balmacara

My Lords, I was not referring to this amendment specifically in commenting on Amendment 71ZA, but we had difficulty getting this amendment into scope in a form that was in line with our aspirations and what we wanted to discuss today.

Amendment 71A would introduce an individual right for data subjects to be informed by data controllers when there is an actual or intended commercial exploitation of their personal data. Machine learning will allow data companies to get a lot of value out of people’s data—indeed, it already does. It will allow greater and more valuable targeting of advertisements and services on a vast scale, given the way that modern data platforms work. This skews further the balance of power between those companies and the individuals whose data is being exploited.

One could probably describe the current relationship between people and the data companies to whom they give their data as rather unsophisticated. People hand it over for a very low value, as in a bartering service or crude exchange—and, as in a barter economy, it cannot be efficient. This amendment will test whether we can get more power into the hands of the people who make the exchange to make the market function better. The companies’ position is completely the reverse: it is almost that of a monopsony, although as a technical term monopsonies are those situations in which dominant companies set a price for the market, whereas in this case there is no price. It is interesting to follow that line of thought a little further because, where there are monopsonies, the normal remedy put forward by those involved is to publish a standard price list. That improves choice to the point that people are not exploited on the price they pay; it is just a question of choice on quality or service, rather than the price. That at least protects individuals to some extent against the dominant company exploiting control.

The essence of this amendment is an attempt to try to give power back to the people whose data is being used. We are talking about very significant sums of money. I gather from a recent article in the Guardian that the top price you can get for your data—although I am not sure whether “price” is the right word here; “value” might be better—is about $14 each quarter for a company such as Facebook. If you compare that across the world, in the Asia-Pacific region it is worth only about $2. There is a variation, and the reason is the ability to exploit some form of advertising revenue from individual data, so the US, where the highest prices are going to be available, was worth about $2.8 billion in advertising revenue to Facebook last quarter while the second-biggest Facebook market, Europe, was worth only about $1.4 billion, which is about half. You can see how the prices would follow through in terms of the data. We are talking about quite a lot of resource here in terms of how this money flows and how it works.

The process of trying to seek the money has already started. Some companies are now trying to reverse the direction of travel. They go to individuals through the web and offer them the chance to connect all their data together across the social media companies in which they already have it. The companies then value it and try to sell it on behalf of the individuals to the companies concerned. That is obviously the beginning of a market approach to this, which is where this amendment is centred.

I mentioned that I had difficulty getting what I wanted in the scope of the Bill. I think I have mentioned this before, but it seems to us that we do not yet have the right sense of what people’s data represent in relation to the companies that seek to use it. One suggestion we have had is that we might look to the creative industries—not inappropriately, since this is a DCMS Bill—and think of it as some form of copyright. If it were a copyright—and it may or may not be possible to establish one’s personal data in a copyright mode—we would immediately be in a world where the data transferring from the individual to the company would be not sold but licensed, and therefore there would be a continuing sense of ownership in the process in which the data is transferred. It would also mean that there would have to be continuing reporting back to the licence holder for the use of the data, and we could go further and expect to follow the creative industries down the track they currently follow. The personal copyright would then have value to the company and there is a waterfall, as they call it, of revenue exploitation, so that those who hold the copyright might expect to earn a small but not insignificant amount from it. We begin to see a commercial system, more obviously found in other areas of the marketplace, but one that relates to the way in which individuals would have a value in relation to their data, and there might even be a way in which that money could be returned.

If you were in that happy situation, what would you do with the money? One would hope that it would be useful to some people, but it might also be possible to accumulate it, perhaps through a collecting society, and see it invested in educational work or improving people’s security in relation to their data, for instance. There are many choices around that.

Having said all that about copyright, I am not particularly wedded to it as a concept because there are downsides to copyright, but it is an issue worth exploring. The essence of the amendment is to try to restore equality of arms between the individual and the companies to which the data is transferred. I beg to move.

Lord Ashton of Hyde

My Lords, I am grateful to the noble Lord, Lord Stevenson, for raising this important subject. I recall the questions that he posed at Second Reading about whether data subjects had sufficient support in relation to the power of companies that wanted to access, use and monetise their data, and I recognise the intention behind his amendment, which he carefully explained. I also agree wholeheartedly with him that these are questions worthy of debate, not only during the passage of this Bill, but over the coming months and years as the digital economy continues to develop. Later in Committee, we may discuss suitable forums where this could take place. These are big questions of data rights and how they are monetised, if they are, versus the growth of the digital economy for public benefit.

18:15
Through the evolution of the GDPR, we attempted to wrestle with these questions and to reach an appropriate balance between protecting the rights of data subjects and facilitating growth and innovation in the digital economy. Much of this Bill is about balance between rights and about where those rights should or should not be applied. The Government’s view is that, on the whole, the GDPR was ultimately successful in achieving that balance. In particular, I reassure the Committee that there are already mechanisms in the new regime which will support individuals better to understand what data controllers are doing with their data for commercial purposes. For example, data controllers will be required, when obtaining personal data from an individual, to inform that person of: the purposes for which their personal data is being processed; the period for which their data will be stored, to the extent that this is possible; their right, where applicable, to withdraw consent for their data to be used; and their right to lodge a complaint with the supervisory authority. That is not an exhaustive list but is illustrative of the protections that will be put in place. Such information must also be updated if the controller intends to process the personal data for any new purpose.
I take this opportunity to add that the current statutory guidance from the Information Commissioner in relation to direct marketing states that, even if consent is not explicitly withdrawn, it will become harder for organisations legally to rely on that consent as time passes. On that basis, I am confident that the substance of the protections that the noble Lord is seeking to achieve through his amendment is already provided for.
In terms of the form that these protections take, the Government are concerned about the burdens that the noble Lord’s approach would impose on businesses, particularly small and micro-enterprises. Many of his remarks were addressed to the large companies that we all know about. This is particularly true in respect of the final part of the noble Lord’s amendment, which would require organisations to notify an individual of the gross revenues that they expected to receive through the commercial use of their data.
The Government have sought in this Bill to minimise burdens on business. The Bill enables processing to support scientific research, journalism and many other areas. Where appropriate, it preserves the conditions and exemptions of the Data Protection Act, allowing business processing to continue. We want to support businesses to implement the new law, though we are in no doubt that updating processes and systems is not a trivial task. We believe that Amendment 71A is a burden that business does not want, and overregulation of this sort carries a high economic risk, since knowledge and data-driven industries can move easily to a more favourable regulatory environment.
The Government’s view is that, through this Bill, we are already establishing mechanisms which will empower individuals to make informed decisions about the use of their data. These measures will ensure that when people give their consent to an organisation which is using their data for commercial purposes, they do so on the basis of a shared understanding that any consent thereby given can be withdrawn at any time if they no longer wish their data to be used for certain purposes or to be monetised in certain ways.
I accept the broader issues that the noble Lord has raised. I think they are worthy of debate but I hope that, given my explanations on the specific areas that his amendment addresses, he feels able to withdraw it.
Lord Stevenson of Balmacara

My Lords, I am very grateful to the Minister for engaging with the issues and for responding so positively to some of the ideas that underlie the amendment. This is an issue that we will need to come back to, but I take the point that the level of detail in the amendment and the impact it may have may not be appropriate at this time, in terms of our understanding of and knowledge about where we are trying to get to.

As the Minister said, there may be opportunities to discuss the way this might be taken forward, including the possibility of the data ethics group. Should the Bill be amended in that way, that would be a base on which this could come forward.

Having said that, this was clearly a probing amendment and I was not expecting a detailed response. The noble Lord was careful to make sure that we were aware of the problems concerning some of the issues, but I put it to him that the technology we are already experiencing—and there is a lot more to come—allows those who have our data to almost magically know things about us, which results in us getting birthday greetings, targeted adverts and everything else. They are already on to us on this, and I do not think we need to worry too much about the burden that might be placed on these poor companies. But I take the point and beg leave to withdraw the amendment.

Amendment 71A withdrawn.
Clause 12: Obligations of credit reference agencies
Debate on whether Clause 12 should stand part of the Bill.
Lord Stevenson of Balmacara

My Lords, Clause 12 deals primarily with credit reference agencies. It is not an area that I think we want to go through in complete detail, but in comparing the current version of the Bill with the provisions in the Data Protection Act 1998, in particular Section 9(2), we wondered whether the updating of that provision was entirely correct and thought it would be helpful to give the Minister a chance to respond to that point.

The question that underlies the suggestion that the clause should not stand part is whether Clause 12 constitutes a restriction on a data subject’s access rights. It can be read as a presumption that a data subject in this area is asking only about their financial standing, and not for other data that the credit reference agency might have. The provision therefore might be said to run contrary to the underpinning rationale behind the GDPR that data controllers should be transparent and that data subjects should not be put in the position of having to guess what data is held about them in order to ask for it.

I am sorry to have to refer again to a recital, but recital 63, which the Minister might be aware of, specifies that among other purposes, the right of access is to allow a data subject to be aware of the data held about them so as to be able to,

“verify … the lawfulness of the processing”

that is taking place. This is different from the wording in Clause 12, in that the trigger appears to be based on the quantity of data rather than the type of controller. There is also no presumption about the nature of the data that the data subject wants. I think I have said enough to suggest that there is possibly an issue behind this and I would be grateful if the Minister could respond to that point.

Baroness Chisholm of Owlpen (Con)

My Lords, as your Lordships know, before giving somebody credit, lenders such as banks, loan companies and shops want to be confident that the person can repay the money they lend. To help them do this, they may look at the information held by credit reference agencies.

Credit reference agencies give lenders a range of information about potential borrowers, which lenders use to make decisions about whether or not to offer a person credit. It is safe to say that the three main credit reference agencies in the UK—Equifax, Experian and Callcredit—are likely to hold certain information about most adults in the country. Most of the information held by the credit reference agencies relates to how a person has maintained their credit and their service and utility accounts. It also includes details of people’s previous addresses and information from public sources such as the electoral roll, public records including county court judgments, and bankruptcy and insolvency data.

The information held by the credit reference agencies is also used to verify the identity, age and residency of individuals, to identify and track fraud, to combat money laundering and to help recover payment of debts. Government bodies may also access this credit data to check that individuals are entitled to certain benefits and to recover unpaid taxes and similar debts. Credit reference agencies are licensed by the Financial Conduct Authority.

As noble Lords may be aware, anyone can write to a credit reference agency to request a copy of their credit reference file. Given the sheer volume of requests that such agencies receive, Section 9 of the Data Protection Act 1998 provides that a subject access request made under Section 7 of the Act will be taken to mean a request for information about the person’s financial standing, unless the person makes it clear that he or she is seeking different information. Very importantly, when responding to such a request, Section 9(3) of the 1998 Act requires the credit reference agencies to provide the person with details about how he or she can go about correcting any wrong information held by the agencies. The process for doing so is set out in Section 159 of the Consumer Credit Act 1974, and the 1998 Act makes reference to it. If personal information held about someone is incorrect or out of date, noble Lords will appreciate that it could lead to that person being unfairly refused credit.

Clause 12 of the Bill simply replicates the provisions in Section 9 of the DPA in relation to the handling of subject access requests made under article 15 of the GDPR. If it were omitted without anything being put in its place, this could create uncertainty for credit reference agencies about how they should respond to a subject access request. It would create uncertainty for data subjects, who would no longer be supplied with guidance on how to update details in their file that were wrong or misleading. As far as we are aware, these provisions have worked well over the last 20 years and we can see no reason why they should be omitted from the Bill.

On that basis, I respectfully invite the noble Lord to accept that Clause 12 should stand part of the Bill.

Lord Stevenson of Balmacara

I am grateful to the Minister for her response. I think we agree that any impact on one’s credit standing is a major issue and that it is really important that we get this right. Although she did not specifically say so, I take it that all the big companies involved in this field were consulted before this measure was put forward. One notices, but does not make any comment, that Equifax is one of the companies concerned—and look what happened to it.

The message coming through is that the DPA 1998 provisions are being reproduced here: there is no intention to change them and people should not be concerned about this. On that basis, I will not object to Clause 12 standing part of the Bill.

Clause 12 agreed.
Clause 13: Automated decision-making authorised by law: safeguards
Amendments 72 and 73
Moved by
72: Clause 13, page 7, line 9, leave out “prohibition on taking” and insert “Article 22(1) of the GDPR for”
73: Clause 13, page 7, line 10, leave out “for decisions”
Amendments 72 and 73 agreed.
Amendment 74
Moved by
74: Clause 13, page 7, line 11, at end insert—
“( ) A decision is “based solely on automated processing” for the purposes of this section if, in relation to a data subject, there is no meaningful input by a natural person in the decision-making process.”
Lord Clement-Jones

My Lords, in moving Amendment 74, I will also speak to Amendments 74A, 75, 77, 119, 133A, 134 and 183—I think I have encompassed them all; at least I hope I have. In a way this is an extension of the very interesting debate that we heard on Amendment 71A, but further down the pipeline, so to speak. This group contains a range of possible and desirable changes to the Bill relating to artificial intelligence and the use of algorithms.

Data has been described, not wholly accurately, as the oil of artificial intelligence. With the advent of AI and its active application to datasets, it is vital that we strike the right balance in protecting privacy and the use of personal data. Indeed, the Minister spoke about that balance in that debate. Above all, we need to be increasingly aware of unintended discrimination where an element of a decision involves an algorithm. If a particular system learns from a dataset that contains biases, such as associating female names with family roles and male names with careers, it is likely to reproduce them in its decisions. One way of helping to identify and rectify bias is to ensure that such algorithms are transparent, so that it is possible to see not only what data is being used but the steps being taken to process that data in coming to a particular conclusion.

In all this, there is the major risk that we do not challenge computer-aided decision-making. To some extent, this is recognised by article 22 of the GDPR, which at least gives the right of explanation where there is fully automated decision-taking, and it is true that in certain respects, Clause 13 amplifies article 22. For instance, article 22 does not state what safeguards need to be in place; it talks just about proper safeguards. In the Bill, it is proposed that, after a decision has been made, the individual has to be informed of the outcome, which is better than what the GDPR currently offers. It also states that data subjects should have the right to ask that the decision be reconsidered or that the decision not be made by an algorithm. There is also the requirement, in certain circumstances, for companies and public bodies to undertake data protection impact assessment under Clause 62. There are also new provisions in the GDPR for codes of conduct and certification, so that if an industry is moving forward on artificial intelligence in an application, the ICO can certify the approach that the industry is taking on fairness in automated decision-taking.

18:30
However, the automated decision safeguards in the GDPR place too much emphasis on the requirement for a decision to be fully automated and significant before they apply. Few decisions are fully automated. Should there not also be the right to an explanation of systems where AI is only one part of the final decision in certain key circumstances—for instance, where policing, justice, health, personal welfare or finance is concerned? This could be an explanation in advance of the AI or algorithm being used—transparency by design—or, if the decision-making process is not known in advance, an obligation to test the AI’s performance in the same circumstances.
The automated decision safeguards in the GDPR should be amended explicitly to protect individuals against unfair and non-transparent semi-autonomous AI systems that they may face in their day-to-day lives. For example, a provision in the recent Digital Republic Act in France treats semi-autonomous algorithms as requiring explanation.
To really ingratiate myself with the Minister—I may not succeed, but it is worth a try—I shall quote from a speech by Matt Hancock to the Leverhulme centre last July. He said that,
“we need to identify and understand the ethical and governance challenges posed by uses of data, now and into the future, where they go beyond current regulation, and then determine how best to identify appropriate rules … establish new norms, and where necessary regulations ... Unfair discrimination will still be unfair. Using AI to make some decisions may make those decisions more difficult to unpack. But it won’t make fairness less important”.
That is a very important paragraph in that speech to the Leverhulme centre.
On Amendments 74, 77 and 136, clarification is needed, as it is unclear whether the UK will consider the article 29 working party opinions after we leave the European Union, despite the central role of the ICO in crafting them. This is particularly relevant as the recently published draft guidelines on profiling by the article 29 working party state:
“The controller cannot avoid the Article 22 provisions by fabricating human involvement. For example, if someone routinely applies automatically generated profiles to individuals without any actual influence on the result, this would still be a decision based solely on automated processing.
To qualify as human intervention, the controller must ensure that any oversight of the decision is meaningful, rather than just a token gesture. It should be carried out by someone who has the authority and competence to change the decision. As part of the analysis, they should consider all the available input and output data”.
For the purpose of clarity of obligations imposed on controllers, it is important that this explanation is included in the Bill.
The effect of Amendment 77 would be that:
“A decision is a ‘significant decision’ for the purposes of this section if, in relation to a data subject, it—
(a) produces legal effects concerning the data subject, or
(b) significantly affects the data subject”—
or,
“a group sharing a protected characteristic, within the meaning of the Equality Act 2010, to which the data subject belongs”.
Take the example of an advertisement targeting people based on race. An example of this was discovered by a black Harvard computer science professor, Latanya Sweeney, who investigated why “Have you been to prison and need legal help?” adverts appeared online when googling “black-sounding” names rather than “white-sounding” names. Did the decision affect her? Unlikely: she is, as are many investigators, in a privileged position. But the amendment allows for people to take action on discriminatory systems even when they themselves might not be significantly affected at an individual level. This would usefully increase protection and explicitly define that a “significant” effect can be significant to a group of which an individual is part. This is similarly acknowledged by the recent article 29 working party guidance, which states:
“Processing that might have little impact on individuals generally may in fact have a significant effect on certain groups of society, such as minority groups or vulnerable adults”.
Amendment 75 would clarify that the exemption from the prohibition on taking significant decisions based solely on automated processing must not apply to purely automated decisions that engage an individual’s human rights. In relation to general automated processing in Clause 13, the explicit protection of human rights would protect individuals from being subjected to automated decisions that could engage their fundamental rights: for example, by unfairly discriminating against them. A recent study claimed that a facial recognition tool was able to detect individuals’ sexuality based on their photographs taken from online dating sites with greater accuracy than humans. Another recent study claimed that a machine learning tool was able to diagnose depression by scanning individuals’ photographs posted on the social media platform Instagram with greater accuracy than the average doctor.
The rapidly growing field of machine learning and algorithmic decision-making clearly presents new risks. As a minimum, individuals’ basic rights must be explicitly protected at all times and regarded as paramount.
On Amendment 183, personal data is of course already defined as data relating to a data subject which makes him or her identified or identifiable. The administrative decision then becomes one concerning him or her. This is a clarification of what “meaningful information” involves. There is evidence from both the article 29 committee and the ICO’s consultation that some tweaking of “solely” appears compatible with the GDPR. Under the Equality Act 2010, there is a public sector equality duty, and public agencies have an obligation to ensure that their actions, including the generation, purchase or use of machine learning models, do not have discriminatory effects.
On Amendment 119, where automated decisions are made under article 22, data subjects are permitted minimum safeguards. Recital 71 and paragraph 115 of the Government’s own Explanatory Notes suggest that this includes a right to explanation. However, as UK law has not traditionally used recitals—we heard previously that they do not form part of the Bill—it is unclear how they will be interpreted after they are migrated into domestic law as a potential consequence of the European Union (Withdrawal) Bill. This provision should be included in the main text of the Bill. Unless such an amendment is passed, the Explanatory Notes will be incorrect in communicating the effect of the Bill.
I turn finally to Amendments 74A and 133A. These amendments are derived from a reading of recital 71, and the amendments themselves might be somewhat defective because they might give the impression that any safeguards are being deleted where children are involved. However, recital 71, when it applies to article 22, states that such measures should not concern a child. As I read that—a Minister may be able to clarify—the provisions related to automated decision-taking should not be allowable in connection with children. That requires clarification. In particular, not only is that rider to recital 71 in the recitals, there is a further recital in the GDPR, recital 38, which states:
“Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data”.
That all adds up to quite a number of areas where Clause 13 has either not properly transposed article 22 or could, by some tweaking and clarification of definitions, be vastly improved. I beg to move, and I look forward to the Minister’s reply.
Lord Stevenson of Balmacara

My Lords, we have a number of amendments in this group and I want to associate myself with many of the points made on the other amendments by the noble Lord, Lord Clement-Jones. I was only sorry that we did not get round to signing up to more of them in time to get some of the glory, because he has picked up a lot of very interesting points.

We will come to later groups of amendments that deal with a broader concern of effects and moral issues in relation to this Bill. It has been growing on me for a number of weeks now, but one of the most irritating things about the Bill, apart from the fact that it does not have the main clauses in it that one wants to discuss, is that every now and again we come up against a brick wall where there is suddenly a big intellectual jump between where we have got to and where we might want to get to through the Bill, and this is one of them.

This whole idea of automated data and how it operates is very interesting indeed. One of the people with whom I have been having conversations around this suggested that, in processing this Bill, we are in danger of not learning from history in your Lordships’ House and indeed Parliament as a whole, in relation to other areas in which deep moral issues are raised. The point was made, which is a good one, that when Parliament was looking at the Human Fertilisation and Embryology Act 1990 there had been four or five years, perhaps slightly longer, of pre-discussion in which all the big issues had been thrashed out both in public and in private—in both Houses and in the newspapers, and in private Bills. There were loads of attempts to try to get to the heart of the issue. We are missing that now, in a way that suggests that it will become a lot clearer when we have discussions later about a data ethics body. I am sure that they will be good and appropriate discussions.

Having said that, the issue here is extremely worrying. We are at the very start of a rich and very interesting development in how computers operate and how machines take away from us a lot of the work that we currently regard as being human work. It is already happening in the world go championship. A computer played the human go champion and beat them easily. Deep Blue, the IBM computer, beat Garry Kasparov the chess player a few years ago. The point is not so much that these things were happening as that nobody could understand what the machines were doing in relation to the results they were achieving. It is that apparent ability to exceed human understanding that is the great worry behind what people say. Of course, it is quite a narrow area and not one that we need to be too concerned about in terms of a broader approach. But in a world where people say with a resigned shrug that the computer has said no to a request they have made to some website, it is a baleful reflection of the helplessness we all feel when we do not understand what computers are doing to us. Automated processing is one facet of that, and we have to be careful.

We have to think of people’s fears. If they have fears, they will not engage. If they will not engage, the benefits that should flow from this terrific new initiative, new thinking and new way of doing things will be that we do not get the productivity or the changes that will help society as we move forward. We have to think of future circumstances in a reflective way. In a deliberative way we have to think about technical development and public attitudes. It again plays back to the work that was done by Mary Warnock and her team when they were trying to introduce the HFEA. She said, importantly, that reason and sentiment are not necessarily opposed to each other. It is that issue we are trying to grapple with today. The amendments that have been so well introduced by the noble Lord, Lord Clement-Jones, cover that.

The regulatory and legal framework may not be sufficient. Companies obviously have natural duties only to their shareholders. Parliament will have to set rules that make people in those companies take account of public fears, as well as shareholder interests. That approach is not well exemplified in this Bill yet. We need to think about how to allow companies to bring forward new initiatives and push back the boundaries of what they are doing, while retaining public confidence. That is the sort of work that was done on the HFEA and that is where we have to go.

Our Amendment 74 has already been spoken to by the noble Lord, Lord Clement-Jones. It is an important one. There is an issue about whether or not an individual—“a natural person”, as the amendment has it—is involved “in the decision-making process”. We should know that.

18:45
Amendment 77A would ensure that data controllers must,
“provide meaningful information … significance and legal consequences”,
of the processing they are doing. Amendment 77B states that:
“A data subject affected by”,
automated decision-making,
“retains the right to lodge a complaint to the”,
ICO.
These are all consequences of the overall approach we are taking. I look forward to further debates and the Minister’s response.
Baroness Jones of Moulsecoomb (GP)

My Lords, I speak to Amendment 75 in particular, but the whole issue of automated decision-making is extremely worrying.

As we have gone through this Bill, I have been desperately hoping that some of the most repressive bits are a negotiating tactic on the Government’s part, and that before Report they will say, “We’ll take out this really nasty bit if you let us leave in this not really quite so nasty bit”. I feel that this issue is one of the really nasty bits.

I thank Liberty, which has worked incredibly hard on this Bill and drawn out the really nasty bits. Under the Data Protection Act 1998, individuals have a qualified right not to be subject to purely automated decision-making and, to the extent that automated decision-making is permitted, they have a right to access information relating to such decisions made about them. The GDPR clarifies and extends these rights to the point that automated decisions that engage a person’s human rights are not permissible.

This could include being subjected to unfair discrimination. The noble Lord, Lord Clement-Jones, used the phrase, “unintended discrimination”—for example, detecting sexuality or diagnosing depression. The rapidly growing field of machine learning and algorithmic decision-making presents some new and very serious risks to our right to a private life and to freedom of expression and assembly. Such automated decision-making is deeply worrying when done by law enforcement agencies or the intelligence services because the decisions could have adverse legal effects. Such processing should inform rather than determine officers’ decisions.

We must have that vital safeguard for human rights: the requirement of human involvement. After the automated decision-making result has come out, there has to be a human who says whether or not it is reasonable.

Baroness Hamwee

My Lords, I too want to say a word about Amendment 75. The Human Rights Act trumps everything. To put it another way, the fundamental rights it deals with are incorporated into UK law, and they trump everything.

Like the noble Baroness, I believe that it is quite right that those who are responsible—humans—stop and think whether fundamental human rights are engaged. The right not to be subject to unfair discrimination has been referred to. Both the Bill and the GDPR recognise that as an issue in the provisions on profiling, but we need this overarching provision. Like other noble Lords, I find it so unsettling to be faced with what are clearly algorithmic decisions.

When I was on holiday I went to a restaurant in France called L’Algorithme, which was very worrying, but I was allowed to choose my own meal. If this work continues in the industry, perhaps I will not be allowed to do so next year. I wondered about the practicalities of this, and whether through this amendment we are seeking something difficult to implement—but I do not think so. Under a later part of the Bill, law enforcement agencies may not make significant decisions adversely affecting a data subject, so judgments of this sort must already be practicable. That was a concern in my mind, and I thought that I would articulate my dismissal of that concern.

Lord Whitty (Lab)

My Lords, my name is attached to two of these amendments. This is a very difficult subject in that we are all getting used to algorithmic decisions; not many people call them that, but they in effect decide major issues in people’s lives and entice people into areas where they did not previously choose to be. A person’s profile, based on a number of inter-related algorithms, suggests that they may be interested in a particular commercial product or lifestyle move. That is quite difficult for those of my generation to grasp, and difficult also for the legislative process to grasp. So some of these amendments go back to first principles. The noble Baroness, Lady Hamwee, said that the issue of human rights trumps everything. Of course, we all agree with that, but human rights do not work unless you have methods of enforcing them.

In other walks of life, there are precedents. You may not be able to identify exactly who took a decision that, for example, women in a workforce should be paid significantly less than men for what were broadly equivalent jobs; it had probably gone on for decades. There was no clear paper trail to establish that discrimination took place but, nevertheless, the outcome was discriminatory. With algorithms, it is clear that some of the outcomes may be discriminatory, but you would not be able to put your finger on why they were discriminatory, let alone on who or what decided that that discrimination should take place. Nevertheless, if the outcome is discriminatory, you need a way of redressing it. That is why the amendments to which I have added my name effectively say that the data subject should be made aware of the use to which their data is being put, and that they should have the right of appeal to the Information Commissioner and the right of redress, as you would in a human-based decision-making process that was obscure in its origin but clear in relation to its outcome. That may be a slightly simplistic way in which to approach the issue, but it is a logical one that needs to be reflected in the Bill, and I hope that the Government take the amendments seriously.

Lord Ashton of Hyde

My Lords, I thank the noble Lord, Lord Clement-Jones, who introduced this interesting debate; of course, I recognise his authority and his newfound expertise in artificial intelligence from being chairman of the Select Committee on Artificial Intelligence. I am sure that he is an expert anyway, but it will only increase his expertise. I thank other noble Lords for their contributions, which raise important issues about the increasing use of automated decision-making, particularly in the online world. It is a broad category, including everything from personalised music playlists to quotes for home insurance and far beyond that.

The noble Lord, Lord Stevenson, before speaking to his amendments, warned about some of the things that we need to think about. He drew a contrast with human embryology and fertility research and the HFEA. The parallel is not exact because, in that case, things were prevented from happening at least until the matter had been debated, whereas here, of course, the genie is already out of the bottle. But I take what the noble Lord said and agree with the issues that he raised. I think that we will discuss in a later group some of the ideas about how we debate those broader issues.

The noble Baroness, Lady Jones, talked about how she hoped that the repressive bits would be removed from the Bill. I did not completely understand her point, as this Bill is actually about giving data subjects increased rights, both in the GDPR and the law enforcement directive. That will take direct effect, but we are also applying those GDPR rights to other areas not subject to EU jurisdiction. I shall come on to her amendment on the Human Rights Act in a minute—but we agree with her that human beings should be involved in significant decisions. That is exactly what the Bill tries to do. We realise that data subjects should have rights when they are confronted by significant decisions made about them by machines.

The Bill recognises the need to ensure that such processing is correctly regulated. That is why it includes safeguards, such as the right to be informed of automated processing as soon as reasonably practicable and the right to challenge an automated decision made by the controller. The noble Lord, Lord Clement-Jones, alluded to some of these things. We believe that Clauses 13, 47, 48, 94 and 95 provide adequate and proportionate safeguards to protect data subjects of all ages, adults as well as children. I can give some more examples, because it is important to recognise data rights. For example, Clause 47 is clear that individuals should not be subject to a decision based solely on automated processing if that decision significantly and adversely impacts on them, either legally or otherwise, unless required by law. If that decision is required by law, Clause 48 specifies the safeguards that controllers should apply to ensure the impact on the individual is minimised. Critically, that includes informing the data subject that a decision has been taken and providing them with 21 days within which to ask the controller to reconsider the decision or retake it with human intervention.

I turn to Amendments 74, 134 and 136, proposed by the noble Lord, Lord Clement-Jones, which seek to insert into Parts 2 and 3 of the Bill a definition of the term,

“based solely on automated processing”,

to provide that human intervention must be meaningful. I do not disagree with the meaning of the phrase put forward by the noble Lord. Indeed, I think that that is precisely the meaning that that phrase already has. The test here is what type of processing the decision having legal or significant effects is based on. Mere human presence or token human involvement will not be enough. The purported human involvement has to be meaningful; it has to address the basis for the decision. If a decision was based solely on automated processing, it could not have meaningful input by a natural person. On that basis, I am confident that there is no need to amend the Bill to clarify this definition further.

In relation to Amendments 74A and 133A, the intention here seems to be to prevent any automated decision-making that impacts on a child. By and large, the provisions of the GDPR and of the Bill, Clause 8 aside, apply equally to all data subjects, regardless of age. We are not persuaded of the case for different treatment here. The important point is that the stringent safeguards in the Bill apply equally to all ages. It seems odd to suggest that the NHS could, at some future point, use automated decision-making, with appropriate safeguards, to decide on the eligibility for a particular vaccine—

Lord Clement-Jones

My Lords, I hesitate to interrupt the Minister, but it is written down in the recital that such a measure,

“should not concern a child”.

The whole of that recital is to do with automated processing, as it is called in the recital. The interpretation of that recital is going to be rather important.

19:00
Lord Ashton of Hyde

My Lords, I was coming to recital 71. In the example I gave, it seems odd to suggest that the NHS could at some future point use automated decision-making with appropriate safeguards to decide on the eligibility for a particular vaccine of an 82 year-old, but not a two year-old.

The noble Lord referred to the rather odd wording of recital 71. On this point, we agree with the Article 29 working party—the group of European regulators—that it should be read as discouraging, as a matter of best practice, automated decision-making with significant effects on children. However, as I have already said, there can and will be cases where it is appropriate, and the Bill rightly makes provision for those.

Lord Clement-Jones

Would the Minister like to give chapter and verse on how that distinction is made?

Lord Ashton of Hyde

I think that “chapter and verse” implies “written”—and I will certainly do that because it is important to write to all noble Lords who have participated in this debate. As we have found in many of these areas, we need to get these things right. If I am to provide clarification, I will want to check—so I will take that back.

Lord Clement-Jones

I apologise for interrupting again. This is a bit like a dialogue, in a funny sort of way. If the Minister’s notes do not refer to the Article 29 working party, and whether or not we will continue to take guidance from it, could he include that in his letter as well?

Lord Ashton of Hyde

I will. I had some inspiration from elsewhere on that very subject—but it was then withdrawn, so I will take up the offer to write on that. However, I take the noble Lord’s point.

We do not think that Amendment 75 would work. It seeks to prevent any decision being taken on the basis of automated decision-making where the decision would “engage” the rights of the data subject under the Human Rights Act. Arguably, such a provision would wholly negate the provisions in respect of automated decision-making as it would be possible to argue that any decision based on automated decision-making at the very least engaged the data subject’s right to have their private life respected under Article 8 of the European Convention on Human Rights, even if it was entirely lawful. All decisions relating to the processing of personal data engage an individual’s human rights, so it would not be appropriate to exclude automated decisions on this basis. The purpose of the Bill is to ensure that we reflect processing in the digital age—and that includes automated processing. This will often be a legitimate form of processing, but it is right that the Bill should recognise the additional sensitivities that surround it. There must be sufficient checks and balances and the Bill achieves this in Clauses 13 and 48 by ensuring appropriate notification requirements and the right to have a decision reassessed by non-automated means.

Baroness Hamwee

As the Minister may be about to move on from that, I think he is saying that the phrase, “engages an individual’s rights” is problematic. Are the Government satisfied that the provisions the Minister has just mentioned adequately protect those rights—I am searching for the right verb—and that automated decision-making is not in danger of infringing the rights that are, as he says, always engaged?

Lord Ashton of Hyde

Automated processing could do that. However, with the appropriate safeguards we have put in the Bill, we do not think that it will.

Amendment 77 seeks to define a significant decision as including a decision that has legal or similar effects for the data subject or a group sharing one of the nine protected characteristics under the Equality Act 2010 to which the data subject belongs.

We agree that all forms of discrimination, including discriminatory profiling via the use of algorithms and automated processing, are fundamentally wrong. However, we note that the Equality Act already provides a safeguard for individuals against being profiled on the basis of a particular protected characteristic they possess. Furthermore, recital 71 of the GDPR states that data controllers must ensure that they use appropriate mathematical or statistical procedures to ensure that factors which result in inaccuracies are minimised, and to prevent discriminatory effects on individuals,

“on the basis of racial or ethnic origin, political opinion, religion or beliefs, trade union membership, genetic or health status or sexual orientation”.

We therefore do not feel that further provision is needed at this stage.

Amendment 77A, in the name of the noble Lord, Lord Stevenson, seeks to require a data controller who makes a significant decision based on automated processing to provide meaningful information about the logic involved and the significance and legal consequences of the processing. Amendment 119, as I understand it, speaks to a similar goal, with the added complication of driving a wedge between the requirements of the GDPR and the applied GDPR. Articles 13 and 14 of the GDPR, replicated in the applied GDPR, already require data controllers to provide data subjects with this same information at the point the data is collected, and whenever it is processed for a new purpose. We are not convinced that there is much to be gained from requiring data controllers to repeat such an exercise, other than adding regulatory burden. In fact, the GDPR requires the information earlier, which allows the data subject to take action earlier.

Similarly, Amendment 77B seeks to ensure that data subjects who are the subject of automated decision-making retain the right to make a complaint to the commissioner and to access judicial remedies. Again, this provision is not required in the Bill, as data subjects retain the right to make a complaint to the commissioner or access judicial remedies for any infringement of data protection law.

Amendment 78 would confer powers on the Secretary of State to review the operational effectiveness of article 22 of the GDPR within three years, and lay a report on the review before Parliament. This amendment is not required because all new primary legislation is subject to post-legislative scrutiny within three to five years of receiving Royal Assent. Any review of the Act will necessarily also cover the GDPR. Not only that, but the Information Commissioner will keep the operation of the Act and the GDPR under review and will no doubt flag up any issues that arise in this or other areas.

Amendment 153A would place a requirement on the Information Commissioner to investigate, keep under review and publish guidance on several matters relating to the use of automated data in the health and social care sector in respect of the terms on which enterprises gain consent to the disclosure of the personal data of vulnerable adults. I recognise and share noble Lords’ concern. These are areas where there is a particular value in monitoring the application of a new regime and where further clarity may be beneficial. I reassure noble Lords that the Information Commissioner has already contributed significantly to GDPR guidance being developed by the health sector and continues to work closely with the Government to identify appropriate areas requiring further guidance. Adding additional prescriptive requirements in the Bill is unlikely to help them shape that work in a way that maximises its impact.

As we have heard, Amendment 183 would insert a new clause before Clause 171 stating that public bodies who profile a data subject should inform the data subject of their decision. This is unnecessary as Clauses 13 and 48 state that when a data controller has taken a decision based solely on automated processing, they must inform the data subject in writing that they have done so. This includes profiling. Furthermore, Clauses 13 and 48 confer powers on the Secretary of State to make further provisions to provide suitable measures to safeguard a data subject’s rights and freedoms.

I thank noble Lords for raising these important issues, which deserve to be debated. I hope that, as a result of the explanation in response to these amendments, I have been able to persuade them that there are sufficient safeguards in relation to automated decision-making in the GDPR and Parts 2 to 4 of the Bill, and that their amendments are therefore unnecessary. On that basis, I invite noble Lords not to press their amendments.

Lord Lucas

My Lords, I rather hope that the Minister has not been able to persuade noble Lords opposite. Certainly, I have not felt myself persuaded. First, on the point about “solely”: in recruiting these days, when big companies need to reduce a couple of thousand applications to 100, the general practice is that you put everything into an automated process—you do not really know how it works—get a set of scores at the end and decide where the boundary lies according to how much time you have to interview people. Therefore, there is human intervention—of course there is. You are looking at the output and making the decision about who gets interviewed and who does not. That is a human decision, but it is based on the data coming out of the algorithm, without understanding the algorithm. It is easy for an algorithm to be racist. I just googled “pictures of Europeans”. You get a page of black faces. Somewhere in the Google algorithm, a bit of compensation is going on. With a big algorithm like that, they have not checked what the result of that search would be, but it comes out that way. At various times in the past it has been equally possible to carry out searches that were similarly off-beam with regard to other groups in society.

When you compile an algorithm to work with applications, you start off, perhaps, by looking at, “Who succeeds in my company now? What are their characteristics?”. Then you go through and you say, “You are not allowed to look at whether the person is a man or a woman, or black or white”, but perhaps you are measuring other things that vary with those characteristics and which you have not noticed, or some combinations. An AI algorithm can be entirely unmappable. It is just a learning algorithm; there is no mental process that a human can track. It just learns from what is there. It says, “Give me a lot of data about your employees and how successful they are and I will find you people like that”.

At the end of the day, you need to be able to test these algorithms. The Minister may remember that I posed that challenge in a previous amendment to a previous Bill. I was told then that a report was coming out from the Royal Society that would look at how we should set about testing algorithms. I have not seen that report, but has the Minister seen it? Does he know when it is coming out or what lines of thinking the Royal Society is developing? We absolutely need something practical so that when I apply for a job and I think I have been hard done by, I have some way to do something about it. Somebody has to be able to test the algorithm. As a private individual, how do you get that done? How do you test a recruitment algorithm? Are you allowed to invent 100 fictitious characters to put through the system, or should the state take an interest in this and audit it?

We have made so much effort in my lifetime, and we have got so much better at being equal—of course, we have a fair way to go—by continually doing our best to make things better with regard to discrimination. It is therefore important that we do not allow ourselves to go backwards because we do not understand what is going on inside a computer. So, absolutely, there has to be significant human involvement for a decision to be regarded as a human decision. Generally, where there is not, there has to be a way to get a human challenge—a proper human review—not just the response, “We are sure that the system worked right”. There has to be a non-discriminatory way of looking at whether the system is working and whether it has gone right. We should not allow automation into bits of the system that affect the way we interact with each other in society. Therefore, it is important that we pursue this and I very much hope that noble Lords opposite will give us another chance to look at this area when we come to Report.

Lord Clement-Jones

My Lords, I thank all noble Lords who spoke in the debate. It has been wide-ranging but extremely interesting, as evidenced by the fact that at one point three members of the Artificial Intelligence Select Committee were speaking. That demonstrates that currently we live, eat and breathe artificial intelligence, algorithms and all matters related to them. It is a highly engaged committee. Of course, whatever I put forward from these Benches is not—yet—part of the recommendations of that committee, which, no doubt, will report in due course in March.

19:15
I very much like the analogy the noble Lord, Lord Stevenson, drew between this debate and the human fertilisation and embryology debate, and I noticed that the Minister picked up on that. Providing the ethical framework for AI and the use of algorithms will be extremely important in the future, and in due course we will come on to debate what kind of body might be appropriate to set standards and ethical principles. I quoted the Minister, Matt Hancock, because that speech was all about creating public trust so that we can develop the beneficial uses of artificial intelligence while avoiding its perils—the noble Lord, Lord Lucas, put his finger on some of the issues. That will be important if we are to get acceptance of this new technology as it develops, particularly as we move from what might be called weak AI towards strong, general AI. We do not know what the timescale will be, but it will be particularly important to create that level of public trust. So it is extremely important in this context to kick around concepts of accountability, explanation, transparency, and so on.
Lord Ashton of Hyde

I highlight that we do not disagree with that. I will study carefully what my noble friend Lord Lucas said. We agree that it is important that privacy rights continue to be protected, and we do not expect data subjects to have their lives run by computer alone. That is exactly why the Bill creates safeguards: to make sure that individuals can request not to be the subject of decisions made automatically if those decisions might have a significant legal effect on them. They are also allowed to demand that a human being participate meaningfully in those decisions that affect them. I will look at what my noble friend said and include that in my write-round. However, as I said, we do not disagree with that. Getting to a stage where our lives are run unaccountably by computers is exactly what the Bill is trying to prevent.

Lord Clement-Jones

My Lords, I would not want to give that impression. None of us are gloom merchants in this respect. We want to be able to harness the new technology in a way that is appropriate and beneficial for us, and we do that by setting the right framework in data protection, ethical behaviour and so on.

I am grateful to the Minister for engaging in the way he has on the amendments. It is extremely important to probe each of those areas of Clauses 13, 47 and 48. For instance, there are lacunae. The Minister talked about the right to be informed and the right to challenge, and so on, and said that these provided adequate and proportionate safeguards, but the right to explanation is not absolutely enshrined, even though it is mentioned in the GDPR. So in some areas we will probe on that.

Lord Ashton of Hyde

My Lords, if it is mentioned in the GDPR, then it is there.

Lord Clement-Jones

Yes, my Lords, but it is in the recital, so I think we come back again to whether the recitals form part of the Bill. That is what I believe to be the case. I may have to write to the Minister. Who knows? Anything is possible.

One of the key points—raised by the noble Lord, Lord Lucas—is the question of human intervention being meaningful. To me, “solely”, in the ordinary meaning of the word, means that human intervention need not be there at all, and that is a real worry. The writ of the Article 29 working party may run until Brexit but, frankly, after Brexit we will not be part of the Article 29 working party, so what interpretation of the GDPR will we have when it is incorporated into UK domestic law? If those rights are not to be granted, the interpretation of “solely”, with the absolute requirement of human involvement, needs to be on the face of the Bill.

As far as recital 71 is concerned, I think that the Minister will write with his interpretation and about the impact of the Article 29 working party and whether we incorporate its views. If the Government are not prepared to accept that the rulings of the European Court of Justice will be effective in UK law after Brexit, I can only assume that the Article 29 working party will have no more impact. Therefore, there is a real issue there.

I take the Minister’s point about safeguards under the Equality Act. That is important and there are other aspects that we will no doubt wish to look at very carefully. I was not overly convinced by his answer to Amendment 75, spoken to by the noble Baroness, Lady Jones, and my noble friend Lady Hamwee, because he said, “Well, it’s all there anyway”. I do not think we would have had to incorporate those words unless we felt there was a gap in the way the clause operated.

I will not take the arguments any further but I am not quite as optimistic as the Minister about the impact of that part of the Bill, and we may well come back to various forms of this subject on Report. However, it would be helpful if the Minister indicated the guidance the ICO is adopting in respect of the issue raised in Amendment 153A. When he writes, perhaps he could direct us to those aspects of the guidance that will be applicable in order to help us decide whether to come back to Amendment 153A. In the meantime, I beg leave to withdraw.

Amendment 74 withdrawn.
Amendments 74A and 75 not moved.
Amendment 76
Moved by
76: Clause 13, page 7, line 15, at beginning insert “similarly”
Amendment 76 agreed.
Amendments 77 to 77B not moved.
Clause 13, as amended, agreed.
Amendment 78 not moved.
Amendment 78A
Moved by
78A: After Clause 13, insert the following new Clause—
“Personal Data Ethics Advisory Board
(1) The Secretary of State must appoint an independent Personal Data Ethics Advisory Board as soon as reasonably practicable after the passing of this Act.(2) The Personal Data Ethics Advisory Board’s functions, in relation to the processing of personal data to which the GDPR and this Act applies, are to—(a) monitor further technical advances in the use and management of personal data and their implications for the rights of data subjects;(b) protect the individual and collective rights and interests of data subjects in relation to their personal data;(c) ensure that trade-offs between the rights of data subjects and the use and management of personal data are made transparently, accountably and inclusively;(d) seek out good practices and learn from successes and failures in the use and management of personal data; and(e) enhance the skills of data subjects and controllers in the use and management of personal data.(3) The Personal Data Ethics Advisory Board must report annually to the Secretary of State.(4) The report in subsection (3) may contain recommendations to the Secretary of State and the Commissioner relating to how they can improve the processing of personal data and the protection of data subjects’ rights by improving methods of—(a) monitoring and evaluating the use and management of personal data;(b) sharing best practice and setting standards for data controllers; and(c) clarifying and enforcing data protection rules.(5) The Secretary of State must lay the report in subsection (3) before both Houses of Parliament.”
Lord Stevenson of Balmacara

My Lords, it always used to be said that reaching the end of your Lordships’ day was the graveyard slot. This is a bit of a vice slot. You are tempted by the growing number of people coming in to do a bit of grandstanding and to tell them what they are missing in this wonderful Bill that we are discussing. You are also conscious that the dinner hour approaches—and I blame the noble Baroness, Lady Hamwee, for that. All her talk of dining in L’Algorithme, where she almost certainly had a soup, a main course and a pudding, means that it is almost impossible to concentrate for the six minutes that we will be allowed—with perhaps a few minutes more if we can be indulged—to finish this very important group. It has only one amendment in it. If noble Lords did not know that, I bet that has cheered them up. I am happy to say that it is also a réchauffage, because we have already discussed most of the main issues, so I will be very brief in moving it.

It is quite clear from our discussion on the previous group that we need an ethics body to look at the issues that we were talking about either explicitly or implicitly in our debates on the previous three or four groups and to look also at moral and other issues relating to the work on data, data protection, automatics and robotics, and everything else that is going forward in this exciting field. The proposal in Amendment 78A comes with a terrific pedigree. It has been brought together by members of the Royal Society, the British Academy, the Royal Statistical Society and the Nuffield Trust. It is therefore untouchable in terms of its aspirations and its attempt to get to the heart of what should be in the contextual area around the new Bill.

I shall not go through the various points that we made in relation to people’s fears, but the key issue is trust. As I said on the previous group, if there is no trust in what is set up under the Bill, there will not be a buy-in by the general public. People will be concerned about it. The computer will be blamed for ills that are not down to it, in much the same way that earlier generations always blamed issues external to themselves for the way that their lives were being lived. Shakespeare’s Globe was built outside the city walls because it was felt that the terribly dangerous plays that were being put on there would upset the lieges. It is why penny dreadfuls were banned in the early part of the last century and why we had a fight about video nasties. It is that sort of approach and mentality that we want to get round.

There is good—substantial good—to be found in the work on automation and robotics that we are now seeing. We want to protect that but in the Bill we are missing a place and a space within which the big issues of the day can be looked at. Some of the issues that we have already talked about could easily fit with the idea of an independent data ethics advisory board to monitor further technical advances in the use and management of personal data and the implications of that. I recommend this proposal to the Committee and beg to move.

Lord Clement-Jones

My Lords, the noble Lord, Lord Stevenson, has been admirably brief in the pre-dinner minutes before us and I will be brief as well. This is a very important aspect of the debate and, despite the fact that we will be taking only a few minutes over it, I hope that we will return to it at a future date.

I note that the Conservative manifesto talked about a data ethics body, and this is not that far away from that concept. I think that the political world is coalescing around the idea of an ethics stewardship body of the kind recommended by the Royal Society and the British Academy. Whatever we call it—a rose by any other name—it will be of huge importance for the future, perhaps not as a regulator but certainly as a setter of principles and of an ethical context in which AI in particular moves forward.

The only sad thing about having to speed up the process today is that I am not able to take full advantage of the briefing put forward by the Royal Society. Crucially, it recommends two things. The first is:

“A set of high-level principles to help visibly shape all forms of data governance and ensure trustworthiness and trust in the management and use of data as a whole”.


The second is:

“A body to steward the evolution of the governance landscape as a whole. Such a stewardship body would be expected to conduct expert investigation into novel questions and issues, and enable new ways to anticipate the future consequences of today’s decisions”.


This is an idea whose time has come and I congratulate the noble Lords, Lord Stevenson and Lord Kennedy, on having tabled the amendment. I certainly think that this is the way forward.

19:30
Lord Puttnam (Lab)

My Lords, having restrained myself for four and a half hours and having done a huge amount of work in the Library, I will, despite the amendment having been given only a few minutes, detain your Lordships for a few more moments. This is a massive issue.

As a member of the AI committee chaired by the noble Lord, Lord Clement-Jones, I have been struggling to find analogies for just how serious the world we are moving into is becoming. What I have come up with, with the help of the Library, is road safety. I am going to talk about ethics. Probably the most well-known and successful ethicist in your Lordships’ Chamber is the noble Baroness, Lady O’Neill. Last week, when discussing what this Bill is really all about, she put her finger on it. She asked of the Minister:

“Is he suggesting that the aim should be to adapt children to the realities of the online world and the internet service providers, rather than to adapt the providers to the needs of children?”.—[Official Report, 6/11/17; col. 1606.]


This seems to be fundamental to the issue. Because I needed an analogy, I started looking into road safety, and found it very interesting and—if noble Lords will give me a couple of minutes—rather instructive.

In 1929, a royal commission met, required urgently to consider legislation on road safety because of the “slaughter” that was occurring on the roads. I will not take up your Lordships’ time reading out all the information that I got from the Library, but I have it all here. Parliament legislated in 1930, pretty ineffectively, and again in 1932, again ineffectively. In 1934, your Lordships’ House passed a Bill on road safety, which was rejected in another place because of the objections of lobbyists from the automobile industry, the oil industry and the insurance industry. Parliament tried again in 1938, and once again failed.

Here, I must read something extraordinary. Lord Cecil of Chelwood, a Conservative Peer, said at the end of the debate on the report regarding the legislation:

“I believe future ages will regard with consternation the complacency, the indifference with which this slaughter and mutilation on the roads is now regarded. I observe with great interest that in the final paragraph of the Report the members of the Committee themselves say that they are puzzled and shocked … by the complacency with which this matter is regarded”.—[Official Report, 3/5/1939; col. 903.]


Thousands of people were being killed. I put it to the House that if we get this Bill wrong, a lot of people will be hurt; if we get it right, we may save lives. That is how important it is.

I am standing here today because of a man named Ralph Nader. Through an extraordinary series of events in the 1960s, Ralph Nader was able to impose on the American automobile industry, against its wishes, seatbelts. Six years ago in Italy, my life was saved by the combination of a seatbelt and an airbag, so I take this issue pretty seriously. Look at what has happened since 1990 to the number of lives saved by the utilisation of technology that existed 20, 30 and 40 years prior to that—it is extraordinary. In 1930, almost 8,000 people were killed on the roads of Britain, with one million registered vehicles on the road. Last year, fewer than 2,000 people were killed, with 35 million registered vehicles on the road. That is because, at last, technology was brought to bear—against the wishes of the industry lobbyists.

We must understand that there are those who would like this Data Protection Bill to be weak. It is our duty to ourselves and to future generations to make it extremely tough and to not allow ourselves to be undermined by the views of the many sectors of industry that do not share our values.

Lord Patel (CB)

My Lords, it is a pity I have to be brief, but I will try. The amendment is interesting and worth debating in greater detail than the time today allows. Remarks have already been made about the report from the Royal Society and the British Academy, which suggested setting up a body but did not define whether it ought to be statutory. It is a pity it did not because, if it had, perhaps the Government would have taken greater notice of the suggestion and taken on board what pages 81 and 82 of their manifesto said that they would do—set up a commission.

To me, there are three important things for any body that is set up. First, it must articulate and provide guidance on the rules, standards and best practices for data use, ideally covering both personal and non-personal data. I see this amendment as restrictive in that area. Secondly, it must undertake horizon scanning to identify potential ethical, social and legal issues emerging from new and innovative uses of data, including data linkage, machine learning and other forms of artificial intelligence, and establish how these should be addressed. Thirdly, and importantly, it should be aligned with, and not duplicate, the roles of other bodies, including the ICO as the data protection regulator and ethics committees making decisions about particular research proposals using people’s data. This important amendment allows us to discuss such issues and I hope we will return to it and perhaps make it wider.

Is such a body necessary? The debates we have had suggest that it might be. The Nuffield Foundation was mentioned. It has suggested that it will set up an ethics commission, and we need to know what the purpose of that will be. What would its role be in the regulatory framework, because it would not be a statutory body? I look forward to that debate but, in the meantime, I support the amendment.

The Earl of Erroll (CB)

My Lords, I support the amendment and its very simple principle. We live in a complex world and this tries to lay rules on a complex system. The trouble is that rules can never work because they will never cover every situation. You have to go back to the basic principles and ethics behind what is being done. If we do not think about that from time to time, eventually the rules will get completely out of kilter with what we are trying to achieve. This is essential.

Lord Lucas

My Lords, clearly the Royal Society has been talking to other people. I hope that someone from there is listening and will be encouraged to talk to me too. I am delighted with this amendment and think it is an excellent idea, paired with Amendment 77A, which gives individuals some purchase and the ability to know what is going on. Here we have an organisation with the ability to do something about it, not by pulling any levers but by raising enough of a storm and finding out what is going on to effect change. Amendments 77A and 78A are a very good answer to the worries we have raised in this area.

It is important that we have the ability to feel comfortable and to trust—to know that what is going on is acceptable to us. We do not want to create divisions, tensions and unhappiness in society because things are going on that we do not know about or understand. As the noble Lord said, the organisations running these algorithms do not share our values—it is hard to see that they have any values at all other than the pleasures of the few who run them. We should not submit to that. We must, in all sorts of ways, stand up to that. There are many ways in which these organisations have an impact on our lives, and we must insist that they do that on our terms. We are waking up quite slowly. To have a body such as this, based on principles and ethics and with a real ability to find out what is going on, would be a great advance. It would give me a lot of comfort about what is happening in this Bill, which otherwise is just handing power to people who have a great deal of power already.

Lord Ashton of Hyde

My Lords, the noble Lord, Lord Stevenson, has raised the important issue of data ethics. I am grateful to everyone who has spoken on this issue tonight and has agreed that it is very important. I assure noble Lords that we agree with that. We had a debate the other day on this issue and I am sure we will have many more in the future. The noble Lord, Lord Puttnam, has been to see me to talk about this, and I tried to convince him then that we were taking it seriously. By the sound of it, I am not sure that I completely succeeded, but we are. We understand the points he makes, although I am possibly not as gloomy about things as he is.

We are fortunate in the UK to have the widely respected Information Commissioner to provide expert advice on data protection issues—I accept that that advice is just on data protection issues—but we recognise the need for further credible and expert advice on the broader issue of the ethical use of data. That is exactly why we committed to setting up an expert advisory data ethics body in the 2017 manifesto, which, I am glad to hear, the noble Lord, Lord Clement-Jones, read carefully.

Lord Clement-Jones

We like to hold the Government to their manifesto commitments occasionally.

Lord Ashton of Hyde

Tonight the noble Lord can, because the Secretary of State is leading on this important matter. She is as committed as I am to ensuring that such a body is set up shortly. She has been consulting widely with civil society groups, industry and academia, some of which have been mentioned tonight, to refine the scope and functions of the body. It will work closely with the Information Commissioner and other regulators. As the noble Lords, Lord Clement-Jones and Lord Patel, mentioned, it will identify gaps in the regulatory landscape and provide Ministers with advice on addressing those gaps.

It is important that the new advisory body has a clearly defined role and a strong relationship to other bodies in this space, including the Information Commissioner. The Government’s proposals are for an advisory body which may have a broader remit than that suggested in the amendment. It will provide recommendations on the ethics of data use in gaps in the regulatory landscape, as I have just said. For example, one fruitful area could be the ethics of exploiting aggregated anonymised datasets for social and commercial benefit, taking into account the importance of transparency and accountability. These aggregated datasets do not fall under the legal definition of personal data and would therefore be outside the scope of both the body proposed by the noble Lord and, I suspect, this Bill.

Technically, Amendment 78A needs to be more carefully drafted to avoid the risk of non-compliance with the GDPR and avoid conflict with the Information Commissioner. Article 51 of the GDPR requires each member state to appoint one or more independent public authorities to monitor and enforce the GDPR on its territory as a supervisory authority. Clause 113 makes the Information Commissioner the UK’s sole supervisory authority for data protection. The functions of any advisory data ethics body must not cut across the Information Commissioner’s performance of its functions under the GDPR.

The amendment proposes that the advisory board should,

“monitor further technical advances in the use and management of personal data”.

But one of the Information Commissioner’s key functions is to

“keep abreast of evolving technology”.

That is a potential conflict we must avoid. The noble Lord, Lord Patel, alluded to some of the conflicts.

Nevertheless, I agree with the importance that noble Lords place on the consideration of the ethics of data use, and I repeat that the Government are determined to make progress in this area. However, as I explained, I cannot agree to Amendment 78A tonight. Therefore, in the light of my explanation, I hope the noble Lord will feel able to withdraw it.

Baroness Hamwee

Before the noble Lord, Lord Stevenson, responds—he will probably make this point better than I can—have we just heard from the Minister an outline of an amendment the Government will bring forward in order to enshrine the body they are advocating? He will understand that, whichever side of the House you are on, you are always aware that a future Government may not have the same ways of going about things as the Government he is supporting at the moment, and whose proposals are entirely laudable. Things may change.

Lord Ashton of Hyde

I cannot agree with the noble Baroness’s point. However, I accept that that is a possibility and that things will not last for ever. However, in this case we expect to have the proposals shortly and this Government will definitely be around at that time.

Lord Stevenson of Balmacara

My Lords, I think that is a yes.

Lord Ashton of Hyde

The noble Baroness asked whether it would be enshrined in this Bill. As I tried to explain, it will have a far broader remit than this Bill.

Lord Stevenson of Balmacara

That is a no, then. Oh well, these things happen. You are up one minute and then down. We cannot live like this, can we? However, it is only the Committee stage and we have plenty of time. We can presumably inveigle the Minister into a meeting about this. Not with everyone concerned because that would be too much, but I would be happy to meet him about this on neutral turf if possible. I am fairly confident that we would not want to see the Government voting against a manifesto commitment, which I think I heard him say. We can be reasonably certain that progress can be made on this issue and I wish to signal here our considerable support for that. I look forward to the discussions and beg leave to withdraw the amendment.

Amendment 78A withdrawn.
House resumed. Committee to begin again not before 8.45 pm.