Public Bill Committees
I beg to move amendment 127, in schedule 17, page 206, line 15, leave out paragraph (a) and insert—
“(a) a relevant health record (see paragraph 1A),”.
This amendment, with Amendment 128, limits the types of health records (defined in Clause 198) which count as “relevant records” for the purposes of Clause 181 (prohibition of requirement to produce relevant records) to those obtained by a data subject in the exercise of a data subject access right (defined in paragraph 4 of Schedule 17).
A subject access request gives individuals the right to ask for all the personal information that an organisation holds about them. That is a powerful right, designed to ensure that individuals may access information held about them within a specified time and, as such, it needs to be protected. The Bill provides such protection by making it an offence to require someone to exercise the right as a condition of employment, a contract or the provision of a service or goods. That is set out in clause 181 and schedule 17 and is intended to substantively replicate and in places build on the comparable provision in section 56 of the Data Protection Act 1998.
Amendments 127 and 128 insert a definition of a “relevant health record” for the purposes of clause 181, to ensure that the scope is consistent with that of other types of “relevant record” set out in schedule 17. Amendment 181 is technical in nature and simply updates a reference to a piece of legislation in Northern Ireland to reflect the fact that the legislation has been replaced.
I thank the Minister for that explanation. She is absolutely right to say that subject access requests are extremely powerful in how they operate. It is therefore such a shame that they are not a right or a power that the Government will see fit to extend to newcomers to this country, who will seek to use and have in the past sought to use subject access requests to access important information about their immigration status and history, and the decision-making processes in the Home Office and UK Border Agency about their immigration status. I am sure that we will come back to this debate on Report, and I hope that it is something that the Minister will reflect on.
Amendment 127 agreed to.
Amendments made: 128, in schedule 17, page 206, line 21, at end insert—
“Relevant health records
1A ‘Relevant health record’ means a health record which has been or is to be obtained by a data subject in the exercise of a data subject access right.”.
See the explanatory statement for Amendment 127.
Amendment 181, in schedule 17, page 207, line 22, leave out sub-paragraph (iii) and insert—
“(iii) Article 45 of the Criminal Justice (Children) (Northern Ireland) Order 1998 (S.I. 1998/1504 (N.I. 9));”.—(Margot James.)
In a list of functions of the Secretary of State in relation to people sentenced to detention, this amendment removes a reference to section 73 of the Children and Young Persons Act 1968 (which has been repealed) and inserts a reference to Article 45 of the Criminal Justice (Children) (Northern Ireland) Order 1998 (which replaced it).
Schedule 17, as amended, agreed to.
Clause 182 ordered to stand part of the Bill.
Clause 183
Representation of data subjects
Amendments made: 63, in clause 183, page 105, line 42, leave out “80” and insert “80(1)”.
This amendment changes a reference to Article 80 of the GDPR into a reference to Article 80(1) and is consequential on NC2.
Amendment 64, in clause 183, page 105, line 44, leave out “certain rights” and insert “the data subject’s rights under Articles 77, 78 and 79 of the GDPR (rights to lodge complaints and to an effective judicial remedy)”.
In words summarising Article 80(1) of the GDPR, this amendment adds information about the rights of data subjects that may be exercised by representative bodies under that provision.
Amendment 65, in clause 183, page 106, line 7, leave out “under the following provisions” and insert “of a data subject”.
This amendment and Amendments 66, 67 and 68 tidy up Clause 183(2).
Amendment 66, in clause 183, page 106, line 9, at beginning insert “rights under”.
See the explanatory statement for Amendment 65.
Amendment 67, in clause 183, page 106, line 10, at beginning insert “rights under”.
See the explanatory statement for Amendment 65.
Amendment 68, in clause 183, page 106, line 11, at beginning insert “rights under”.—(Margot James.)
See the explanatory statement for Amendment 65.
Clause 183, as amended, ordered to stand part of the Bill.
Clause 184
Data subject’s rights and other prohibitions and restrictions
Amendment made: 69, in clause 184, page 106, line 41, leave out “(including as applied by Chapter 3 of that Part)”.—(Margot James.)
This amendment is consequential on Amendment 4.
Clause 184, as amended, ordered to stand part of the Bill.
Ordered,
That clause 184 be transferred to the end of line 39 on page 105.—(Margot James.)
Clause 185
Framework for Data Processing by Government
Question proposed, That the clause stand part of the Bill.
I seek a bit of reflection and clarification from the Minister on this point. Clause 185 touches on the way in which the data processing regime operates for Her Majesty’s Government. Within Her Majesty’s Government, there are three very significant Departments that employ tens of thousands of people and process millions of bits of data every year. The three big data-processing parts of Her Majesty’s Government are the Department for Work and Pensions, Her Majesty’s Revenue and Customs and the Ministry of Defence. Very often, the formal data controller is the person who sits at the top of the office. Sometimes it is someone who has a relationship with the accounting officer at the top of the Department. The challenge that that creates for people who seek to exercise their data rights under this Bill is that subject access requests or other requests go into the Department, and it takes for ever to get a response. That is not a reflection on the quality of the civil servants who run the Departments; it is simply that they are sitting on top of millions of records—potentially hundreds of millions of bits of data—and the records may be held or processed by thousands of people operating at the frontline of a particular business.
The way we get around that problem in the national health service, which is probably the biggest Government data processor in the country, is that the data processor is often nominated at the trust level. The data controller may be a clinical commissioning group or an NHS hospital trust. The big Departments—the DWP, the MOD and HMRC—do not operate that strategy. It would be useful to know whether the Government, in the codes of practice that they issue to Departments, will persist with the practice of nominating data controllers at the very top, so that there will be a single data controller in a very large Department with ultimate responsibility for enforcing the Bill right the way through some of the biggest and most complex organisations on earth.
The Minister will know, having long been in her role, that all kinds of problems arise, particularly in the DWP, when information is sought, for example, for tribunal cases. If someone is bringing a tribunal case or wants to contest something about benefits, sometimes the fastest way to do that is to file a subject access request just to get in one place how HMRC or the DWP did the calculations. Like the rest of us, the Minister will have had surgery cases along those lines. The first thing to do is to try to create a single picture of how the Department came to the decisions it made, which have a material impact on our benefits, health and wellbeing.
If the only way to assemble that full picture is to file a subject access request right the way up the chain to a civil servant at the top of the organisation, that is a very slow and fraught process. I invite the Minister to say a bit more about how she will reflect on a very different strategy for appointing and managing data controllers in the NHS, compared with the strategy that currently pertains in those three big administrative parts of Her Majesty’s Government.
The right hon. Gentleman makes a very good point. It might help if I say a little about the framework that the Secretary of State has to issue, as directed by clause 185, about the processing of personal data in connection with the exercise of functions within Government. Before the framework is issued, it has to be subject to parliamentary scrutiny. Some of these practical issues can be explored at that point. The framework will provide guidance to Departments on all aspects of their data processing. The content is being developed and we will definitely take into account the right hon. Gentleman’s concerns.
Question put and agreed to.
Clause 185 accordingly ordered to stand part of the Bill.
Clause 186
Approval of the Framework
Question proposed, That the clause stand part of the Bill.
I am grateful to the Minister for taking those points on board. I suppose it begs the question of when she thinks we might see this framework. The process set out in the clause is a wise and practical course of action. We all have constituency experience that could have a bearing on how this piece of guidance is drafted and presented. We have the luxury of serving our constituents week in, week out. That is not a privilege that the civil servants who are asked to draft these frameworks enjoy.
It is important that the Minister goes through a good process, which allows her not to present the House with a fait accompli or something for an up and down motion. That will not be in any of our interests. My concern is how we practically operationalise this in a way that allows us continually to strengthen and improve the service that we provide to our constituents. It is very hard for us to do that if we have a data management regime operationalised by Her Majesty’s Government that gets in the way.
When does the Minister expect to issue this framework? How will she ensure that there is a period of soft consultation with, perhaps, the Speaker’s Committee here in the House, so that we are not presented with a final draft of a document that we have 40 days to consider, moan about and make representations about, all of which will then basically be ignored because the approval process requires an up-down vote at the end?
I cannot be precise as to when, but it will be a priority to issue the framework for all the reasons that the right hon. Gentleman set out. We intend to engage fully with officials across Government, in particular the Departments that he has mentioned, and will consult other areas of expertise and the Information Commissioner herself. Indeed, clause 185(5) sets a requirement for consultation. Most importantly, the framework will then come to Parliament for proper scrutiny. At that point the right hon. Gentleman will have every chance to contribute further to the practicality of establishing this framework as speedily as possible.
Question put and agreed to.
Clause 186 accordingly ordered to stand part of the Bill.
Clause 187
Publication and review of the Framework
Question proposed, That the clause stand part of the Bill.
The only issue arising from this clause is the frequency with which the Minister expects the framework to be updated. I welcome the steer that she has given the Committee about how clause 186(5) will be operationalised, but that does not quite get round the problem that I am concerned about. Sometimes, and it has been known to happen, regulations get somewhat hard-wired before they are presented to the House. Although it is in the Bill, sometimes that 40-day consultation period does not provide an opportunity to revise and update a measure if we do not think that it is practical.
If, for example, a code of practice is brought forward that says, “For the DWP, the data controller is going to be the accounting officer of the Department or someone associated with the accounting officer of the Department,” that is not going to be a practical strategy for operationalising this Bill within a Department as big and complicated as the DWP. So it may not be possible. We have to accept that. We have to accept the way statutory instruments are put through this place, and the political reality of that. Let us be mature about that. However, we have a belt-and-braces approach set out in clause 187, in that we have the chance to review it. Perhaps the Minister could say a word about how frequently she expects to review and update the legislation, so that it continually improves in the light of experience?
Clause 187 requires the Secretary of State to publish the framework, and under clause 185 he must keep it under review and update it as appropriate. Furthermore, while the Information Commissioner has to take the framework into account, she is also free to disregard it if she considers it irrelevant or that it gets in the way. Were she investigating a data breach by a Government Department, for example, she might consider whether that Department had applied the principles set out in the framework.
It will be a moving thing, and the legislation provides for the Secretary of State to keep it under continual review. If the right hon. Gentleman wishes to have some input before it arrives in the House in the form of a Statutory Instrument, I would be very happy to engage with him.
Question put and agreed to.
Clause 187 accordingly ordered to stand part of the Bill.
Clause 188 ordered to stand part of the Bill.
Clause 189
Penalties for offences
Question proposed, That the clause stand part of the Bill.
We now come to offences, and crucially in clause 189, the question of penalties for offences. The real world has provided us with some tests for the legislation over the past few days. We have reviewed clauses 189 to 192 again in the light of this week’s news. Some quite serious questions have been provoked by the Cambridge Analytica scandal, and the revelations about the misuse of data that was collected through an app that sat on the Facebook platform.
For those who missed it, the story is fairly simple. A Cambridge-based academic created an app that allowed the collection not only of personal data but of data associated with one’s friends on Facebook. The data was then transferred to Cambridge Analytica, and that dataset became the soft code platform on which forensic targeting was deployed during the American presidential elections. We do not yet know, because the Mueller inquiry has not been completed, who was paying for the dark social ads targeted at individuals, as allowed by Cambridge Analytica’s methodology.
The reality is that under Facebook’s privacy policy, and under the law as it stood at the time, it is unlikely that the collection and repurposing of that data was illegal. I understand that the data was collected through an app that was about personality tests, and then re-deployed for election targeting. My understanding of the law is that that was not technically illegal, but I will come on to where I think the crime actually lies.
The right hon. Gentleman’s point makes it clear that the legislation is extremely timely. Does he not agree that that is why we are all here today—to try to improve the current situation?
Absolutely. That is why the European Commission has been working on it for so long. Today’s legislation incorporates a bit of European legislation into British law.
The crime that may have been committed is the international transfer of data. It is highly likely that data collected here in the UK was transferred to the United States and deployed—weaponised, in a way—in a political campaign in the United States. It is not clear that that is legal.
The scandal has knocked about $40 billion off the value of Facebook. I noted with interest that Mr Zuckerberg dumped a whole load of Facebook stock the weekend before the revelations on Monday and Tuesday, and no doubt his shareholders will want to hold him to account for that decision. I read his statement when it finally materialised on Facebook last night, and it concerned me that there was not one word of apology to Facebook users in it. There was an acknowledgement that there had been a massive data breach and a breach of trust, but there was not a single word of apology for what had happened or for Facebook basically facilitating and enabling it. That tells me that we simply will not be able to rely on Facebook self-policing adherence to data protection policies.
The hon. Member for Hornchurch and Upminster is absolutely right—that is why the Bill is absolutely necessary—but the question about the clause is whether the sanctions for misbehaviour are tough enough. Of the two or three things that concerned me most this week, one was how on earth it took the Information Commissioner so long to get the warrant she wanted to search the Cambridge Analytica offices. The Minister may want to say a word about whether that warrant has now been issued. That time lag begs the question whether there is a better way of giving the Information Commissioner the power to conduct such investigations. As we rehearsed in an earlier sitting, the proposed sanctions are financial, but the reality is that many of Cambridge Analytica’s clients are not short of cash—they are not short of loose change—so even the proposed new fines are not necessarily significant enough.
I say that because we know that the companies that contract with organisations such as Cambridge Analytica are often shell companies, so a fine that is cast as a percentage of turnover is not necessarily a sufficient disincentive for people to break the law. That is why I ask the Minister again to consider reviewing the clause and to ask herself, her officials and her Government colleagues whether we should consider a sanction of a custodial sentence where people get in the way of an investigation by the Information Commissioner’s Office.
I am afraid that such activities will continue. I very much hope that the Secretary of State for Digital, Culture, Media and Sport reflects on our exchange on the Floor of the House this morning and uses the information he has about public contracts to do a little more work to expose who is in the network of individuals associated with Cambridge Analytica and where other companies may be implicated in this scandal. We know, because it has said so, that Cambridge Analytica is in effect a shell company—it is in effect a wholly owned subsidiary of SCL Elections Ltd—but we also know that it has an intellectual property sharing agreement with other companies, such as AggregateIQ. Mr Alexander Nix, because he signed the non-disclosure agreement, was aware of that. There are relationships between companies around Cambridge Analytica that extend far and wide. I mentioned this morning that I am concerned that the Foreign and Commonwealth Office may be bringing some of them together for its computational propaganda conference somewhere in the countryside this weekend.
The point I really want the Minister to address is whether she is absolutely content that the sanctions proposed under the clause are sufficient to deter and prosecute the kind of misbehaviour, albeit still only alleged, that has been in the news this week, which raises real concerns.
I will be very brief, because I will largely echo what the right hon. Member for Birmingham, Hodge Hill said. It is absolutely fair to say that our understanding of the potential value of personal information, including that gained by people who break data protection laws, has increased exponentially in recent times, as has our understanding of the damage that can be done to victims of such breaches. I agree that it is not easy to see why the proposed offences stop where they do.
I have a specific question about why there is a two-tier system of penalties. There is a set of offences that are triable only in a summary court and for which there is a maximum fine. I think the maximum in Scotland and Northern Ireland is £5,000. There is a second set of offences that could conceivably be triable on indictment, and there is provision there for an unlimited fine, but not any custodial sentence.
For some companies, if they were in trouble, a £5,000 fine for essentially obstructing justice would be small beer, especially if it allowed them to avoid an unlimited fine. It would be interesting to hear an explanation for that. Many folk would see some of the offences that are triable on indictment as morally equivalent to embezzlement, serious theft or serious fraud, so it is legitimate to ask why there is no option for a custodial sentence in any circumstance.
I certainly share the concerns that hon. Members have expressed in the light of the dreadful Cambridge Analytica scandal. I will set out the penalties for summary-only offences, which lie in clause 119, “Inspection of personal data in accordance with international obligations”; clause 173, “Alteration etc of personal data to prevent disclosure”; and paragraph 15(1) of schedule 15, which contains the offence of obstructing the execution of a warrant. The maximum penalty on summary conviction for those offences is an unlimited fine in England and Wales or a level 5 fine in Scotland and Northern Ireland.
Clause 189(2) sets out the maximum penalties for offences that can be tried summarily or on indictment, which include offences in clause 132, “Confidentiality of information”; clause 145, “False statements made in response to an information notice”; clause 170, “Unlawful obtaining etc of personal data”; clause 171, “Re-identification of de-identified personal data”; and clause 181, “Prohibition of requirement to produce relevant records”. Again, the maximum penalty when tried summarily in England or Wales, or on indictment, is an unlimited fine. In Scotland and Northern Ireland, the maximum penalty on summary conviction is a fine
“not exceeding the statutory maximum”
and an unlimited fine when tried on indictment.
I was listening carefully to the Minister’s reply. She said that the sanction is an unlimited fine in England and Wales. Let us take the hypothetical case of Cambridge Analytica, which is a one-man shell company, in effect; in the UK, it is wholly owned by SCL Elections. I am concerned about what happens if that holding company—let us say it is SCL Elections—is registered outside England and Wales, in the United States or Uruguay, for example? Will the fine bite on the one-man shell company, Cambridge Analytica? If so, the shell company will just go out of business—the directors will be struck off and that will be the end of it. That is not much of a sanction.
The sanctions are as I outlined. The right hon. Gentleman talks about more complex corporate structures. Later in our proceedings, we will touch on the jurisdiction of the general data protection regulation when it comes to dealing with cross-border situations outside the European Union. Perhaps we can throw some light on what he is saying when we come to that point.
The GDPR strengthens the rights of data subjects over their data, including the important right of consent and what constitutes consent by the data subject to the use and processing of their data. That right must now be clear, robust and unambiguous. That is a key change that will provide some protection in the future.
The right hon. Gentleman should remember that, in addition to data protection laws, other sanctions are available, including prosecution for computer misuse, fraud and, potentially, in the case of the example we have been talking about, electoral laws, depending on the circumstances.
Question put and agreed to.
Clause 189 accordingly ordered to stand part of the Bill.
Clause 190 ordered to stand part of the Bill.
Clause 191
Liability of directors etc
Question proposed, That the clause stand part of the Bill.
The debate presents what is potentially a good opportunity to offer a flow of advice to the Minister, if I might pose my question like this: if a company based in the UK has committed an offence, but its holding company is based somewhere else, in what way will clause 191 bite not on the UK operations, but on the holding company elsewhere?
My reading of the extraterritoriality provisions is that the implementation of GDPR and the sanctions around it may well bite in Europe—we will get on to this issue in the debate on extraterritoriality, as the Minister has said—but where companies are registered in, heaven forbid, various tax havens around the world such as Panama or Belize, will the Information Commissioner be able to, in effect, bring prosecutions that will result in action biting on a director of a holding company domiciled somewhere abroad, such as Belize? That is a pretty plausible scenario. Again, this touches on whether the sanctions in the Bill are sufficient to deter the kind of misbehaviour that we now know is running loose around the wild west that the Secretary of State described.
The clause allows proceedings to be brought against a director, or a person acting in a similar position, as well as the body corporate, where it has been proven that breaches of the Act have occurred with the consent, connivance or negligence of that person. The clause will have the same effect as that of section 61 of the Data Protection Act 1998. I might have to come back to the right hon. Gentleman on some of the points he raised in that hypothetical circumstance, which I have no doubt could certainly exist in the future.
I would be grateful if the Minister wrote to me on that this afternoon, because if there are deficiencies we will have to get on with preparing amendments for consideration on Report.
Question put and agreed to.
Clause 191 accordingly ordered to stand part of the Bill.
Clauses 192 to 195 ordered to stand part of the Bill.
Clause 196
Tribunal Procedure Rules
Question proposed, That the clause stand part of the Bill.
Questions have arisen on the procedure rules associated with tribunals. The Opposition are concerned that the rights conferred in the Bill are rights in reality, not in theory. That is why we moved important amendments earlier, which were unwisely rejected by the Government, on collective forms of class action.
If we are to ensure that our constituents genuinely have access to the kind of justice mechanisms set out in the clause, we are obviously required to confront the reality that people will sometimes not have the resources for the financing of solicitors or representatives to help them to make their cases. Will the Minister say a word about whether our constituents will have access to resources such as legal aid to fight those cases in a tribunal?
The clause provides a power to make tribunal procedure rules to regulate how the rights of appeal before the tribunal and the right to apply for an order from the tribunal, conferred under the Bill, are exercised. It sets out the way a data subject’s right to authorise a representative body to apply for an order on his or her behalf under article 80 of the GDPR and clause 183 can be exercised. For somebody who does not have the means to pursue an individual claim, that is obviously a way forward in some circumstances. In addition, it provides a power to make provision about
“securing the production of material used for the processing of personal data,”
and
“the inspection, examination, operation and testing of equipment or material used in connection with the processing of personal data.”
The provisions are equivalent to paragraph 7 of schedule 6 of the 1998 Act.
That is a helpful explanation. It is obvious from the Minister’s response that those tribunal rules will be incredibly important in providing democratic access to justice where our constituents have been maligned and their data rights abused. The tribunal procedure rules, given what she has said, will be of great interest to right hon. and hon. Members.
Will the Minister clarify what oversight and scrutiny we may have in the House of those tribunal procedure rules, or whether they are purely rules that are the child of the tribunal authorities? Are they something the tribunal authorities can just issue, or is there some oversight, amendment or improvement that we in the House can provide?
I cannot be precise about the level of scrutiny that the tribunal procedure rules may or may not be subject to, but in further answer to the right hon. Gentleman’s earlier question, legal aid is also available, as set out in the Legal Aid, Sentencing and Punishment of Offenders Act 2012, where a failure to fund would breach the European convention on human rights. There is that protection over and above the right of people to join a group action. The rules will be made by the Tribunal Procedure Committee, I am told, applying its own consultation process, and the Lord Chancellor will then lay them before Parliament.
Question put and agreed to.
Clause 196 accordingly ordered to stand part of the Bill.
Clause 197 ordered to stand part of the Bill.
Clause 198
Other definitions
Amendments made: 70, in clause 198, page 114, line 25, at end insert—
“the following (except in the expression “United Kingdom government department”)”.
This amendment makes clear that the definition of “government department” does not operate on references to a “United Kingdom government department” (which can be found in Clause 185 and paragraph 1 of Schedule 7).
Amendment 71, in clause 198, page 115, line 8, at end insert—
“(2) References in this Act to a period expressed in hours, days, weeks, months or years are to be interpreted in accordance with Article 3 of Regulation (EEC, Euratom) No. 1182/71 of the Council of 3 June 1971 determining the rules applicable to periods, dates and time limits, except in—
(a) section 125(4), (7) and (8);
(b) section 160(3), (5) and (6);
(c) section 176(2);
(d) section 179(8) and (9);
(e) section 180(4);
(f) section 186(3), (5) and (6);
(g) section 190(3) and (4);
(h) paragraph 18(4) and (5) of Schedule 1;
(i) paragraphs 5(4) and 6(4) of Schedule 3;
(j) Schedule 5;
(k) paragraph 11(5) of Schedule 12;
(l) Schedule 15;
(and the references in section 5 to terms used in Chapter 2 or 3 of Part 2 do not include references to a period expressed in hours, days, weeks, months or years).”
This amendment provides that periods of time referred to in the bill are generally to be interpreted in accordance with Article 3 of EC Regulation 1182/71, which makes provision about the calculation of periods of hours, days, weeks, months and years.
Amendment 182, in clause 198, page 115, line 8, at end insert—
“( ) Section 3(14)(aa) (interpretation of references to Chapter 2 of Part 2 in Parts 5 to 7) and the amendments in Schedule 18 which make equivalent provision are not to be treated as implying a contrary intention for the purposes of section 20(2) of the Interpretation Act 1978, or any similar provision in another enactment, as it applies to other references to, or to a provision of, Chapter 2 of Part 2 of this Act.” —(Margot James.)
Clause 3(14)(aa) (inserted by amendment 4) and equivalent provision contained in amendments in Schedule 18 state expressly that references to Chapter 2 of Part 2 of the bill in Parts 5 to 7 of the bill, and in certain amendments in Schedule 18, include that Chapter as applied by Chapter 3 of Part 2. This amendment secures that they are not to be treated as implying a contrary intention for the purposes of section 20(2) of the Interpretation Act 1978. Section 20(2) provides that where an Act refers to an enactment that reference includes that enactment as applied, unless the contrary intention appears.
Clause 198, as amended, ordered to stand part of the Bill.
Clause 199 ordered to stand part of the Bill.
Clause 200
Territorial application of this Act
Amendments made: 183, in clause 200, page 117, line 15, leave out subsections (1) to (4) and insert—
‘(1) This Act applies only to processing of personal data described in subsections (2) and (3).
(2) It applies to the processing of personal data in the context of the activities of an establishment of a controller or processor in the United Kingdom, whether or not the processing takes place in the United Kingdom.
(3) It also applies to the processing of personal data to which Chapter 2 of Part 2 (the GDPR) applies where—
(a) the processing is carried out in the context of the activities of an establishment of a controller or processor in a country or territory that is not a member State, whether or not the processing takes place in such a country or territory,
(b) the personal data relates to a data subject who is in the United Kingdom when the processing takes place, and
(c) the processing activities are related to—
(i) the offering of goods or services to data subjects in the United Kingdom, whether or not for payment, or
(ii) the monitoring of data subjects’ behaviour in the United Kingdom.’
This amendment replaces the existing provision on territorial application in clause 200(1) to (4). In the amendment, subsection (2) provides that the bill applies to processing in the context of the activities of an establishment of a controller or processor in the UK. Subsection (3) provides that, in certain circumstances, the bill also applies to processing to which the GDPR applies and which is carried out in the context of activities of an establishment of a controller or processor in a country or territory that is not part of the EU.
Amendment 184, in clause 200, page 118, line 8, leave out “(4)” and insert “(3)”.
This amendment is consequential on amendment 183.
Amendment 185, in clause 200, page 118, leave out line 10 and insert “processing of personal data”.
This amendment is consequential on amendment 183.
Amendment 186, in clause 200, page 118, line 10, at end insert—
‘(5A) Section 3(14)(b) does not apply to the reference to the processing of personal data in subsection (2).
(5B) The reference in subsection (3) to Chapter 2 of Part 2 (the GDPR) does not include that Chapter as applied by Chapter 3 of Part 2 (the applied GDPR).’
New subsection (5A) secures that the reference to “processing” in the new subsection (2) inserted by amendment 183 includes all types of processing of personal data. It disapplies clause 3(14)(b), which provides that references to processing in Parts 5 to 7 of the bill are usually only to processing to which Chapter 2 or 3 of Part 2, Part 3 or Part 4 applies. New subsection (5B) secures that the reference in the new subsection (3) to Chapter 2 of Part 2 of the bill does not include that Chapter as applied by Chapter 3 of Part 2.
Amendment 187, in clause 200, page 118, line 11, leave out “established” and insert “who has an establishment”.
This amendment is consequential on amendment 183.
Amendment 188, in clause 200, page 118, line 21, after “to” insert “a person who has an”.
This amendment is consequential on amendment 183.
Amendment 189, in clause 200, page 118, line 23, leave out subsection (7).—(Margot James.)
This amendment is consequential on amendment 183.
Question proposed, That the clause, as amended, stand part of the Bill.
This is where we get into some of the whys and wherefores of the territorial application of the Bill. We can see in clause 200(1) that the Bill essentially bites on a data controller who is domiciled here in the United Kingdom. A question of public concern—it should also concern us in this Committee—is whether the bite and sanctions of the Bill will touch on people who are registered here, but not necessarily on directors of holding companies who are domiciled elsewhere.
I expect that the things we will learn about over the weekend and into next week will confirm for us all that very small companies—essentially corporate shells—that are perhaps registered as data controllers and might have committed offences under the 1998 Act or under the Bill, once it has received Royal Assent, might be controlled by directors who are domiciled elsewhere. If the Bill is to be worth anything and if it is to change anything in the real world in which we happen to live, there is a real question about how offences committed under it by people here will be limited by the corporate realities, which mean that shell companies are data controllers, but actually the wealth, assets and operating mind of a company are somewhere else. Perhaps the Minister will say a little about how she will tackle that particular problem, because we know it is going to arise.
First, a word on the clause, which sets out the territorial application with respect to the circumstances in which the Bill applies to the processing of personal data. Article 3 of the GDPR says that the GDPR applies where the processing of personal data occurs in the context of the activities of a controller or a processor established in the EU, and that it will also apply where a controller or processor is based outside the EU, but is processing the data of people within the EU in connection with the offering of goods and services to them, or for monitoring their behaviour.
We have revisited the clause to ensure that, as far as possible, the scope of the Bill aligns with the scope of the GDPR, albeit in a UK-only context. The Bill will allow the sanction to be given to an overseas entity where it is in the control of a UK-based company. Whether it can be enforced will depend on international arrangements for bringing people to justice, including those beyond the area of data protection.
One additional point, regarding the global nature of these crimes, is that under UK law we already have stronger data protection laws than many other countries—indeed, considerably stronger than in the United States. That means that American citizens with an interest in this Cambridge Analytica debacle are using the British courts and British legislation to enforce things such as data subject access requests, which have revealed a great deal of the evidence that is coming out of Cambridge Analytica. So we benefit as well from the strength of the data provisions that we have at the moment, which we are of course strengthening through the Bill.
Question put and agreed to.
Clause 200, as amended, accordingly ordered to stand part of the Bill.
Clause 201 ordered to stand part of the Bill.
Clause 202
Application to the Crown
Question proposed, That the clause stand part of the Bill.
I think we would all benefit from a little bit of explanation about how this clause will work in practice. For those who have not read clause 202 in detail, it basically explains how this Bill will operate when it comes to the Crown. That is obviously important, because within Her Majesty’s estates there are particular estates such as the Duchy of Lancaster and indeed the Duchy of Cornwall, which are often quite big businesses. I remember from my own time as Chancellor of the Duchy of Lancaster that there are some quite significant property holdings in that Duchy, and they make a not insignificant contribution to the funds that Her Majesty has to work with day to day. How will this clause be put into practice, and are there any relevant exemptions that we should know about?
Clause 202 does not contain any provision to exempt the Crown from the requirements of the GDPR. Likewise, section 63 of the 1998 Act binds the Crown. This clause makes similar and related provision. For example, where Crown bodies enter into controller-processor relationships with each other, subsection (3) provides that the arrangement may be governed by a memorandum of understanding, rather than a contract. This is to meet the requirements of article 28 of the GDPR.
Question put and agreed to.
Clause 202 accordingly ordered to stand part of the Bill.
Clause 203 ordered to stand part of the Bill.
Clause 204
Minor and consequential amendments
Amendment made: 190, in clause 204, page 120, line 12, leave out subsection (1) and insert—
‘(1) In Schedule 18—
(a) Part 1 contains minor and consequential amendments of primary legislation;
(b) Part 2 contains minor and consequential amendments of other legislation;
(c) Part 3 contains consequential modifications of legislation;
(d) Part 4 contains supplementary provision.’
This amendment sets out the contents of Schedule 18 and is consequential on the amendments being made to Schedule 18 including in particular the insertion of new Parts 3 and 4 into that Schedule by amendment 224.—(Margot James.)
Clause 204, as amended, ordered to stand part of the Bill.
Schedule 18
Minor and Consequential Amendments
Amendments made: 191, in schedule 18, page 208, line 25, at end insert—
“Registration Service Act 1953 (c. 37)
A1 (1) Section 19AC of the Registration Service Act 1953 (codes of practice) is amended as follows.
(2) In subsection (2), for “section 52B (data-sharing code) of the Data Protection Act 1998” substitute “section 122 of the Data Protection Act 2018 (data-sharing code)”.
(3) In subsection (11), for “section 51(3) of the Data Protection Act 1998” substitute “section 128 of the Data Protection Act 2018”.
Veterinary Surgeons Act 1966 (c. 36)
A2 (1) Section 1A of the Veterinary Surgeons Act 1966 (functions of the Royal College of Veterinary Surgeons as competent authority) is amended as follows.
(2) In subsection (8)—
(a) omit “personal data protection legislation in the United Kingdom that implements”,
(b) for paragraph (a) substitute—
“(a) the GDPR; and”, and
(c) in paragraph (b), at the beginning insert “legislation in the United Kingdom that implements”.
(3) In subsection (9), after “section” insert “—
“the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), read with Chapter 2 of Part 2 of the Data Protection Act 2018;”.”
This amendment makes consequential amendments to primary legislation.
Amendment 192, in schedule 18, page 210, line 4, at end insert—
“Pharmacy (Northern Ireland) Order 1976 (S.I. 1976/1213 (N.I. 22))
8A The Pharmacy (Northern Ireland) Order 1976 is amended as follows.
8B In article 2(2) (interpretation), omit the definition of “Directive 95/46/EC”.
8C In article 8D (European professional card), after paragraph (3) insert—
“(4) In Schedule 2C, “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), read with Chapter 2 of Part 2 of the Data Protection Act 2018.”
8D In article 22A(6) (Directive 2005/36/EC: functions of competent authority etc.), before sub-paragraph (a) insert—
“(za) “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), read with Chapter 2 of Part 2 of the Data Protection Act 2018;”.
8E (1) Schedule 2C (Directive 2005/36/EC: European professional card) is amended as follows.
(2) In paragraph 8(1) (access to data), for “Directive 95/46/EC” substitute “the GDPR”.
(3) In paragraph 9 (processing data), omit sub-paragraph (2) (deeming the Society to be the controller for the purposes of Directive 95/46/EC).
8F (1) The table in Schedule 2D (functions of the Society under Directive 2005/36/EC) is amended as follows.
(2) In the entry for Article 56(2), in the second column, for “Directive 95/46/EC” substitute “the GDPR”.
(3) In the entry for Article 56a(4), in the second column, for “Directive 95/46/EC” substitute “the GDPR”.
8G (1) Paragraph 2 of Schedule 3 (fitness to practice: disclosure of information) is amended as follows.
(2) In sub-paragraph (2)(a), after “provision” insert “or the GDPR”.
(3) For sub-paragraph (3) substitute—
“(3) In determining for the purposes of sub-paragraph (2)(a) whether a disclosure is prohibited, it is to be assumed for the purposes of paragraph 5(2) of Schedule 2 to the Data Protection Act 2018 and paragraph 3(2) of Schedule 11 to that Act (exemptions from certain provisions of the data protection legislation: disclosures required by law) that the disclosure is required by this paragraph.”
(4) After sub-paragraph (4) insert—
“(5) In this paragraph, “the GDPR” and references to Schedule 2 to the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act (see section 3(10), (11) and (14) of that Act).”
Representation of the People Act 1983 (c. 2)
8H (1) Schedule 2 to the Representation of the People Act 1983 (provisions which may be contained in regulations as to registration etc) is amended as follows.
(2) In paragraph 1A(5), for “the Data Protection Act 1998” substitute “Parts 5 to 7 of the Data Protection Act 2018 (see section 3(4) and (14) of that Act)”.
(3) In paragraph 8C(2), for “the Data Protection Act 1998” substitute “Parts 5 to 7 of the Data Protection Act 2018 (see section 3(4) and (14) of that Act)”.
(4) In paragraph 11A—
(a) in sub-paragraph (1) for “who are data users to supply data, or documents containing information extracted from data and” substitute “to supply information”, and
(b) omit sub-paragraph (2).”
This amendment makes consequential amendments to primary legislation.
Amendment 193, in schedule 18, page 210, leave out lines 5 to 39 and insert—
“Medical Act 1983 (c. 54)
9 The Medical Act 1983 is amended as follows.
10 (1) Section 29E (evidence) is amended as follows.
(2) In subsection (5), after “enactment” insert “or the GDPR”.
(3) For subsection (7) substitute—
“(7) In determining for the purposes of subsection (5) whether a disclosure is prohibited, it is to be assumed for the purposes of paragraph 5(2) of Schedule 2 to the Data Protection Act 2018 and paragraph 3(2) of Schedule 11 to that Act (exemptions from certain provisions of the data protection legislation: disclosures required by law) that the disclosure is required by this section.”
(4) In subsection (9), at the end insert—
““the GDPR” and references to Schedule 2 to the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act (see section 3(10), (11) and (14) of that Act).”
11 (1) Section 35A (General Medical Council’s power to require disclosure of information) is amended as follows.
(2) In subsection (4), after “enactment” insert “or the GDPR”.
(3) For subsection (5A) substitute—
“(5A) In determining for the purposes of subsection (4) whether a disclosure is prohibited, it is to be assumed for the purposes of paragraph 5(2) of Schedule 2 to the Data Protection Act 2018 and paragraph 3(2) of Schedule 11 to that Act (exemptions from certain provisions of the data protection legislation: disclosures required by law) that the disclosure is required by this section.”
(4) In subsection (7), at the end insert—
““the GDPR” and references to Schedule 2 to the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act (see section 3(10), (11) and (14) of that Act).”
12 In section 49B(7) (Directive 2005/36: designation of competent authority etc.), after “Schedule 4A” insert “—
“the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), read with Chapter 2 of Part 2 of the Data Protection Act 2018;”.
13 In section 55(1) (interpretation), omit the definition of “Directive 95/46/EC”.
13A (1) Paragraph 9B of Schedule 1 (incidental powers of the General Medical Council) is amended as follows.
(2) In sub-paragraph (2)(a), after “enactment” insert “or the GDPR”.
(3) After sub-paragraph (3) insert—
“(4) In this paragraph, “the GDPR” has the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(10), (11) and (14) of that Act).”
13B (1) Paragraph 5A of Schedule 4 (professional performance assessments and health assessments) is amended as follows.
(2) In sub-paragraph (8), after “enactment” insert “or the GDPR”.
(3) For sub-paragraph (8A) substitute—
“(8A) In determining for the purposes of sub-paragraph (8) whether a disclosure is prohibited, it is to be assumed for the purposes of paragraph 5(2) of Schedule 2 to the Data Protection Act 2018 and paragraph 3(2) of Schedule 11 to that Act (exemptions from certain provisions of the data protection legislation: disclosures required by law) that the disclosure is required by this paragraph.”
(4) After sub-paragraph (13) insert—
“(14) In this paragraph, “the GDPR” and references to Schedule 2 to the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act (see section 3(10), (11) and (14) of that Act).”
13C (1) The table in Schedule 4A (functions of the General Medical Council as competent authority under Directive 2005/36) is amended as follows.
(2) In the entry for Article 56(2), in the second column, for “Directive 95/46/EC” substitute “the GDPR”.
(3) In the entry for Article 56a(4), in the second column, for “Directive 95/46/EC” substitute “the GDPR”.”
This amendment replaces the existing consequential amendments of the Medical Act 1983.
Amendment 194, in schedule 18, page 211, line 18, leave out from “GDPR”” to “(see” in line 19 and insert “and references to Schedule 2 to the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act”
This amendment makes clear that in section 33B of the Dentists Act 1984 references to Schedule 2 to the bill include that Schedule as applied by Chapter 3 of Part 2 of the bill.
Amendment 195, in schedule 18, page 211, line 20, at end insert—
15A In section 36ZA(6) (Directive 2005/36: designation of competent authority etc), after “Schedule 4ZA—” insert—
““the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), read with Chapter 2 of Part 2 of the Data Protection Act 2018;”.”
This amendment makes further consequential amendments to the Dentists Act 1984.
Amendment 196, in schedule 18, page 211, line 39, leave out from “GDPR”” to “(see” in line 40 and insert “and references to Schedule 2 to the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act”
This amendment makes clear that in section 36Y of the Dentists Act 1984 references to Schedule 2 to the bill include that Schedule as applied by Chapter 3 of Part 2 of the bill.
Amendment 197, in schedule 18, page 211, line 41, at end insert—
16A In section 53(1) (interpretation), omit the definition of “Directive 95/46/EC”.
16B (1) The table in Schedule 4ZA (Directive 2005/36: functions of the General Dental Council under section 36ZA(3)) is amended as follows.
(2) In the entry for Article 56(2), in the second column, for “Directive 95/46/EC” substitute “the GDPR”.
(3) In the entry for Article 56a(4), in the second column, for “Directive 95/46/EC” substitute “the GDPR”.
Companies Act 1985 (c. 6)
16C In section 449(11) of the Companies Act 1985 (provision for security of information obtained), for “the Data Protection Act 1998” substitute “the data protection legislation”.”
This amendment makes consequential amendments to primary legislation, including further consequential amendments to the Dentists Act 1984.
Amendment 198, in schedule 18, page 212, line 16, leave out from “GDPR”” to “(see” in line 17 and insert “and references to Schedule 2 to the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act”
This amendment makes clear that in section 13B of the Opticians Act 1989 references to Schedule 2 to the bill include that Schedule as applied by Chapter 3 of Part 2 of the bill.
Amendment 199, in schedule 18, page 212, line 18, at end insert—
“Access to Health Records Act 1990 (c. 23)
18A The Access to Health Records Act 1990 is amended as follows.
18B For section 2 substitute—
“2 Health professionals
In this Act, “health professional” has the same meaning as in the Data Protection Act 2018 (see section 197 of that Act).”
18C (1) Section 3 (right of access to health records) is amended as follows.
(2) In subsection (2), omit “Subject to subsection (4) below,”.
(3) In subsection (4), omit from “other than the following” to the end.”
This amendment makes consequential amendments to the Access to Health Records Act 1990.
Amendment 200, in schedule 18, page 213, line 2, at end insert—
“Industrial Relations (Northern Ireland) Order 1992 (S.I. 1992/807 (N.I. 5))
21A (1) Article 90B of the Industrial Relations (Northern Ireland) Order 1992 (prohibition on disclosure of information held by the Labour Relations Agency) is amended as follows.
(2) In paragraph (3), for “the Data Protection Act 1998” substitute “the data protection legislation”.
(3) After paragraph (6) insert—
“(7) In this Article, “the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act).””
This amendment makes consequential amendments to the Industrial Relations (Northern Ireland) Order 1992.
Amendment 201, in schedule 18, page 216, line 10, leave out from “data”” to “(see” in line 11 and insert “, “processing” and references to a provision of Chapter 2 of Part 2 of the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act”
This amendment makes clear that in section 40 of the Freedom of Information Act 2000 references to a provision of Chapter 2 of Part 2 of the bill include that provision as applied by Chapter 3 of Part 2 of the bill.
Amendment 202, in schedule 18, page 219, line 15, leave out from “GDPR”” to “(see” in line 16 and insert “and references to Schedule 2 to the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act”
This amendment makes clear that in section 7A of the Health and Personal Social Services Act (Northern Ireland) 2001 references to Schedule 2 to the bill include that Schedule as applied by Chapter 3 of Part 2 of the bill.
Amendment 203, in schedule 18, page 220, line 7, at end insert—
“Enterprise Act 2002 (c. 40)
64A (1) Section 237 of the Enterprise Act 2002 (general restriction on disclosure) is amended as follows.
(2) In subsection (4), for “the Data Protection Act 1998 (c. 29)” substitute “the data protection legislation”.
(3) After subsection (6) insert—
“(7) In this section, “the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act).””
This amendment makes consequential amendments to the Enterprise Act 2002.
Amendment 204, in schedule 18, page 221, line 21, leave out from “data”” to “(see” in line 22 and insert “, “processing” and references to a provision of Chapter 2 of Part 2 of the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act”
This amendment makes clear that in section 38 of the Freedom of Information (Scotland) Act 2002 references to a provision of Chapter 2 of Part 2 of the bill include that provision as applied by Chapter 3 of Part 2 of the bill.
Amendment 205, in schedule 18, page 222, line 21, at end insert—
“Mental Health (Care and Treatment) (Scotland) Act 2003 (asp 13)
75A (1) Section 279 of the Mental Health (Care and Treatment) (Scotland) Act 2003 (information for research) is amended as follows.
(2) In subsection (2), for “research purposes within the meaning given by section 33 of the Data Protection Act 1998 (c. 29) (research, history and statistics)” substitute “purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics)”.
(3) After subsection (9) insert—
“(10) In this section, “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation).””
This amendment makes consequential amendments to the Mental Health (Care and Treatment) (Scotland) Act 2003.
Amendment 206, in schedule 18, page 222, line 29, at end insert—
“Companies (Audit, Investigations and Community Enterprise) Act 2004 (c. 27)
76A The Companies (Audit, Investigations and Community Enterprise) Act 2004 is amended as follows.
76B (1) Section 15A (disclosure of information by tax authorities) is amended as follows.
(2) In subsection (2)—
(a) omit “within the meaning of the Data Protection Act 1998”, and
(b) for “that Act” substitute “the data protection legislation”.
(3) After subsection (7) insert—
“(8) In this section—
“the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act);
“personal data” has the same meaning as in Parts 5 to 7 of that Act (see section 3(2) and (14) of that Act).”
76C (1) Section 15D (permitted disclosure of information obtained under compulsory powers) is amended as follows.
(2) In subsection (7), for “the Data Protection Act 1998” substitute “the data protection legislation”.
(3) After subsection (7) insert—
“(8) In this section, “the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act).””
This amendment makes consequential amendments to the Companies (Audit, Investigations and Community Enterprise) Act 2004.
Amendment 207, in schedule 18, page 225, line 10, at end insert—
“88A (1) Section 264C (provision and disclosure of information about health service products: supplementary) is amended as follows.
(2) In subsection (2), for “the Data Protection Act 1998” substitute “the data protection legislation”.
(3) After subsection (3) insert—
“(4) In this section, “the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act).””
This amendment makes further consequential amendments to the National Health Service Act 2006.
Amendment 208, in schedule 18, page 225, line 28, at end insert—
“Companies Act 2006 (c. 46)
92A The Companies Act 2006 is amended as follows.
92B In section 458(2) (disclosure of information by tax authorities)—
(a) for “within the meaning of the Data Protection Act 1998 (c. 29)” substitute “within the meaning of Parts 5 to 7 of the Data Protection Act 2018 (see section 3(2) and (14) of that Act)”, and
(b) for “that Act” substitute “the data protection legislation”.
92C In section 461(7) (permitted disclosure of information obtained under compulsory powers), for “the Data Protection Act 1998 (c. 29)” substitute “the data protection legislation”.
92D In section 948(9) (restrictions on disclosure) for “the Data Protection Act 1998 (c. 29)” substitute “the data protection legislation”.
92E In section 1173(1) (minor definitions: general), at the appropriate place insert—
““the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act);”.
92F In section 1224A(7) (restrictions on disclosure), for “the Data Protection Act 1998” substitute “the data protection legislation”.
92G In section 1253D(3) (restriction on transfer of audit working papers to third countries), for “the Data Protection Act 1998” substitute “the data protection legislation”.
92H In section 1261(1) (minor definitions: Part 42), at the appropriate place insert—
““the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act);”.
92I In section 1262 (index of defined expressions: Part 42), at the appropriate place insert—
92J In Schedule 8 (index of defined expressions: general), at the appropriate place insert—
This amendment makes consequential amendments to the Companies Act 2006.
Amendment 209, in schedule 18, page 225, line 38, at end insert—
“96A (1) Section 45 (information held by HMRC) is amended as follows.
(2) In subsection (4A), for “section 51(3) of the Data Protection Act 1998” substitute “section 128 of the Data Protection Act 2018”.
(3) In subsection (4B), for “the Data Protection Act 1998” substitute “the Data Protection Act 2018”.”
This amendment makes further consequential amendments to the Statistics and Registration Service Act 2007.
Amendment 210, in schedule 18, page 230, line 16, at end insert—
“Coroners and Justice Act 2009 (c. 25)
122A In Schedule 21 to the Coroners and Justice Act 2009 (minor and consequential amendments), omit paragraph 29(3).”
This amendment makes a consequential amendment to the Coroners and Justice Act 2009 and is consequential on the amendments being made to section 3 of the Access to Health Records Act 1990 by amendment 199.
Amendment 211, in schedule 18, page 232, line 39, after “after “” insert “this”
Paragraph 130(3) of Schedule 18 to the bill amends paragraph 8(8) of Schedule 2 to the Welsh Language (Wales) Measure 2011 by inserting new text. This amendment clarifies where that new text is to be inserted in the English language version of that Measure.
Amendment 212, in schedule 18, page 242, line 40, at end insert—
“Additional Learning Needs and Educational Tribunal (Wales) Act 2018 (anaw 2)
186A (1) Section 4 of the Additional Learning Needs and Educational Tribunal (Wales) Act 2018 (additional learning needs code) is amended as follows.
(2) In the English language text—
(a) in subsection (9), omit from “and in this subsection” to the end, and
(b) after subsection (9) insert—
“(9A) In subsection (9)—
“data subject” (“testun y data”) has the meaning given by section 3(5) of the Data Protection Act 2018;
“personal data” (“data personol”) has the same meaning as in Parts 5 to 7 of that Act (see section 3(2) and (14) of that Act).”
(3) In the Welsh language text—
(a) in subsection (9), omit from “ac yn yr is-adran hon” to the end, and
(b) after subsection (9) insert—
“(9A) Yn is-adran (9)—
mae i “data personol” yr un ystyr ag a roddir i “personal data” yn Rhannau 5 i 7 o Ddeddf Diogelu Data 2018 (gweler adran 3(2) a (14) o’r Ddeddf honno);
mae i “testun y data” yr ystyr a roddir i “data subject” gan adran 3(5) o’r Ddeddf honno.”
This amendment makes consequential amendments to the Additional Learning Needs and Educational Tribunal (Wales) Act 2018.
Amendment 213, in schedule 18, page 243, line 14, at end insert—
“Estate Agents (Specified Offences) (No. 2) Order 1991 (S.I. 1991/1091)
187A In the table in the Schedule to the Estate Agents (Specified Offences) (No. 2) Order 1991 (specified offences), at the end insert—
This amendment makes a consequential amendment to the Estate Agents (Specified Offences) (No. 2) Order 1991.
Amendment 214, in schedule 18, page 243, line 22, after “controller”,” insert—
“(ba) after “in the context of” insert “the activities of”,”
This amendment to the consequential amendment to the Channel Tunnel (International Agreements) Order 1993 is consequential on amendment 183.
Amendment 215, in schedule 18, page 243, line 27, after “controller”,” insert—
“(ba) after “in the context of” insert “the activities of”,”
This amendment to the consequential amendment to the Channel Tunnel (International Agreements) Order 1993 is consequential on amendment 183.
Amendment 216, in schedule 18, page 243, line 28, at end insert—
“Access to Health Records (Northern Ireland) Order 1993 (S.I. 1993/1250 (N.I. 4))
188A The Access to Health Records (Northern Ireland) Order 1993 is amended as follows.
188B In Article 4 (health professionals), for paragraph (1) substitute—
“(1) In this Order, “health professional” has the same meaning as in the Data Protection Act 2018 (see section 197 of that Act).”
188C In Article 5(4)(a) (fees for access to health records), for “under section 7 of the Data Protection Act 1998” substitute “made by the Department”.
Channel Tunnel (Miscellaneous Provisions) Order 1994 (S.I. 1994/1405)
188D In article 4 of the Channel Tunnel (Miscellaneous Provisions) Order 1994 (application of enactments), for paragraphs (2) and (3) substitute—
“(2) For the purposes of section 200 of the Data Protection Act 2018 (“the 2018 Act”), data which is processed in a control zone in Belgium, in connection with the carrying out of frontier controls, by an officer belonging to the United Kingdom is to be treated as processed by a controller established in the United Kingdom in the context of the activities of that establishment (and accordingly the 2018 Act applies in respect of such data).
(3) For the purposes of section 200 of the 2018 Act, data which is processed in a control zone in Belgium, in connection with the carrying out of frontier controls, by an officer belonging to the Kingdom of Belgium is to be treated as processed by a controller established in the Kingdom of Belgium in the context of the activities of that establishment (and accordingly the 2018 Act does not apply in respect of such data).”
European Primary and Specialist Dental Qualifications Regulations 1998 (S.I. 1998/811)
188E The European Primary and Specialist Dental Qualifications Regulations 1998 are amended as follows.
188F (1) Regulation 2(1) (interpretation) is amended as follows.
(2) Omit the definition of “Directive 95/46/EC”.
(3) At the appropriate place insert—
““the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), read with Chapter 2 of Part 2 of the Data Protection Act 2018;”.
188G (1) The table in Schedule A1 (functions of the GDC under Directive 2005/36) is amended as follows.
(2) In the entry for Article 56(2), in the second column, for “Directive 95/46/EC” substitute “the GDPR”.
(3) In the entry for Article 56a(4), in the second column, for “Directive 95/46/EC” substitute “the GDPR”.
Scottish Parliamentary Corporate Body (Crown Status) Order 1999 (S.I. 1999/677)
188H For article 7 of the Scottish Parliamentary Corporate Body (Crown Status) Order 1999 substitute—
“7 Data Protection Act 2018
(1) The Parliamentary corporation is to be treated as a Crown body for the purposes of the Data Protection Act 2018 to the extent specified in this article.
(2) The Parliamentary corporation is to be treated as a government department for the purposes of the following provisions—
(a) section 8(d) (lawfulness of processing under the GDPR: public interest etc),
(b) section 202 (application to the Crown),
(c) paragraph 6 of Schedule 1 (statutory etc and government purposes),
(d) paragraph 7 of Schedule 2 (exemptions from the GDPR: functions designed to protect the public etc), and
(e) paragraph 8(1)(o) of Schedule 3 (exemptions from the GDPR: health data).
(3) In the provisions mentioned in paragraph (4)—
(a) references to employment by or under the Crown are to be treated as including employment as a member of staff of the Parliamentary corporation, and
(b) references to a person in the service of the Crown are to be treated as including a person so employed.
(4) The provisions are—
(a) section 24(3) (exemption for certain data relating to employment under the Crown), and
(b) section 202(6) (application of certain provisions to a person in the service of the Crown).
(5) In this article, references to a provision of Chapter 2 of Part 2 of the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act (see section 3(14) of that Act).”
Northern Ireland Assembly Commission (Crown Status) Order 1999 (S.I. 1999/3145)
188I For article 9 of the Northern Ireland Assembly Commission (Crown Status) Order 1999 substitute—
“9 Data Protection Act 2018
(1) The Commission is to be treated as a Crown body for the purposes of the Data Protection Act 2018 to the extent specified in this article.
(2) The Commission is to be treated as a government department for the purposes of the following provisions—
(a) section 8(d) (lawfulness of processing under the GDPR: public interest etc),
(b) section 202 (application to the Crown),
(c) paragraph 6 of Schedule 1 (statutory etc and government purposes),
(d) paragraph 7 of Schedule 2 (exemptions from the GDPR: functions designed to protect the public etc), and
(e) paragraph 8(1)(o) of Schedule 3 (exemptions from the GDPR: health data).
(3) In the provisions mentioned in paragraph (4)—
(a) references to employment by or under the Crown are to be treated as including employment as a member of staff of the Commission, and
(b) references to a person in the service of the Crown are to be treated as including a person so employed.
(4) The provisions are—
(a) section 24(3) (exemption for certain data relating to employment under the Crown), and
(b) section 202(6) (application of certain provisions to a person in the service of the Crown).
(5) In this article, references to a provision of Chapter 2 of Part 2 of the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act (see section 3(14) of that Act).”
Representation of the People (England and Wales) Regulations 2001 (S.I. 2001/341)
188J The Representation of the People (England and Wales) Regulations 2001 are amended as follows.
188K In regulation 3(1) (interpretation), at the appropriate places insert—
““Article 89 GDPR purposes” means the purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics);”;
““the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act);”;
““the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation);”.
188L In regulation 26(3)(a) (applications for registration), for “the Data Protection Act 1998” substitute “the data protection legislation”.
188M In regulation 26A(2)(a) (application for alteration of register in respect of name under section 10ZD), for “the Data Protection Act 1998” substitute “the data protection legislation”.
188N In regulation 32ZA(3)(f) (annual canvass), for “the Data Protection Act 1998” substitute “the data protection legislation”.
188O In regulation 61A (conditions on the use, supply and inspection of absent voter records or lists), for paragraph (a) (but not the final “or”) substitute—
“(a) Article 89 GDPR purposes;”.
188P (1) Regulation 92(2) (interpretation and application of Part VI etc) is amended as follows.
(2) After sub-paragraph (b) insert—
“(ba) “relevant requirement” means the requirement under Article 89 of the GDPR, read with section 19 of the Data Protection Act 2018, that personal data processed for Article 89 GDPR purposes must be subject to appropriate safeguards.”
(3) Omit sub-paragraphs (c) and (d).
188Q In regulation 96(2A)(b)(i) (restriction on use of the full register), for “section 11(3) of the Data Protection Act 1998” substitute “section 123(5) of the Data Protection Act 2018”.
188R In regulation 97(5) and (6) (supply of free copy of full register to the British Library and restrictions on use), for “research purposes in compliance with the relevant conditions” substitute “Article 89 GDPR purposes in accordance with the relevant requirement”.
188S In regulation 97A(7) and (8) (supply of free copy of full register to the National Library of Wales and restrictions on use), for “research purposes in compliance with the relevant conditions” substitute “Article 89 GDPR purposes in accordance with the relevant requirement”.
188T In regulation 99(6) and (7) (supply of free copy of full register etc to Statistics Board and restrictions on use), for “research purposes in compliance with the relevant conditions” substitute “Article 89 GDPR purposes in accordance with the relevant requirement”.
188U In regulation 109A(9) and (10) (supply of free copy of full register to public libraries and local authority archives services and restrictions on use), for “research purposes in compliance with the relevant conditions” substitute “Article 89 GDPR purposes in accordance with the relevant requirement”.
188V In regulation 119(2) (conditions on the use, supply and disclosure of documents open to public inspection), for sub-paragraph (i) (but not the final “or”) substitute—
“(i) Article 89 GDPR purposes;”.
Representation of the People (Scotland) Regulations 2001 (S.I. 2001/497)
188W The Representation of the People (Scotland) Regulations 2001 are amended as follows.
188X In regulation 3(1) (interpretation), at the appropriate places, insert—
““Article 89 GDPR purposes” means the purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics);”;
““the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act);”;
““the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation);”.
188Y In regulation 26(3)(a) (applications for registration), for “the Data Protection Act 1998” substitute “the data protection legislation”.
188Z In regulation 26A(2)(a) (application for alteration of register in respect of name under section 10ZD), for “the Data Protection Act 1998” substitute “the data protection legislation”.
188AA In regulation 32ZA(3)(f) (annual canvass), for “the Data Protection Act 1998” substitute “the data protection legislation”.
188AB In regulation 61(3) (records and lists kept under Schedule 4), for paragraph (a) (but not the final “or”) substitute—
“(a) Article 89 GDPR purposes;”.
188AC In regulation 61A (conditions on the use, supply and inspection of absent voter records or lists), for paragraph (a) (but not the final “or”) substitute—
“(a) Article 89 GDPR purposes;”.
188AD (1) Regulation 92(2) (interpretation of Part VI etc) is amended as follows.
(2) After sub-paragraph (b) insert—
“(ba) “relevant requirement” means the requirement under Article 89 of the GDPR, read with section 19 of the Data Protection Act 2018, that personal data processed for Article 89 GDPR purposes must be subject to appropriate safeguards.”
(3) Omit sub-paragraphs (c) and (d).
188AE In regulation 95(3)(b)(i) (restriction on use of the full register), for “section 11(3) of the Data Protection Act 1998” substitute “section 123(5) of the Data Protection Act 2018”.
188AF In regulation 96(5) and (6) (supply of free copy of full register to the National Library of Scotland and the British Library and restrictions on use), for “research purposes in compliance with the relevant conditions” substitute “Article 89 GDPR purposes in accordance with the relevant requirement”.
188AG In regulation 98(6) and (7) (supply of free copy of full register etc to Statistics Board and restrictions on use), for “research purposes in compliance with the relevant conditions” substitute “Article 89 GDPR purposes in accordance with the relevant requirement”.
188AH In regulation 108A(9) and (10) (supply of full register to statutory library authorities and local authority archives services and restrictions on use), for “research purposes in compliance with the relevant conditions” substitute “Article 89 GDPR purposes in accordance with the relevant requirement”.
188AI In regulation 119(2) (conditions on the use, supply and disclosure of documents open to public inspection), for sub-paragraph (i) (but not the final “or”) substitute—
“(i) Article 89 GDPR purposes;”.
Financial Services and Markets Act 2000 (Disclosure of Confidential Information) Regulations 2001 (S.I. 2001/2188)
188AJ (1) Article 9 of the Financial Services and Markets Act 2000 (Disclosure of Confidential Information) Regulations 2001 (disclosure by regulators or regulator workers to certain other persons) is amended as follows.
(2) In paragraph (2B), for sub-paragraph (a) substitute—
“(a) the disclosure is made in accordance with Chapter V of the GDPR;”.
(3) After paragraph (5) insert—
“(6) In this article, “the GDPR” has the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(10), (11) and (14) of that Act).”
Nursing and Midwifery Order 2001 (S.I. 2002/253)
188AK The Nursing and Midwifery Order 2001 is amended as follows.
188AL (1) Article 3 (the Nursing and Midwifery Council and its Committees) is amended as follows.
(2) In paragraph (18), after “enactment” insert “or the GDPR”.
(3) After paragraph (18) insert—
“(19) In this paragraph, “the GDPR” has the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(10), (11) and (14) of that Act).”
188AM (1) Article 25 (the Council’s power to require disclosure of information) is amended as follows.
(2) In paragraph (3), after “enactment” insert “or the GDPR”.
(3) In paragraph (6)—
(a) for “paragraph (5),” substitute “paragraph (3)—”, and
(b) at the appropriate place insert—
““the GDPR” has the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(10), (11) and (14) of that Act).”
188AN In article 39B (European professional card), after paragraph (2) insert—
“(3) For the purposes of Schedule 2B, “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), read with Chapter 2 of Part 2 of the Data Protection Act 2018.”
188AO In article 40(6) (Directive 2005/36/EC: designation of competent authority etc), at the appropriate place insert—
““the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), read with Chapter 2 of Part 2 of the Data Protection Act 2018;”.
188AP (1) Schedule 2B (Directive 2005/36/EC: European professional card) is amended as follows.
(2) In paragraph 8(1) (access to data) for “Directive 95/46/EC” substitute “the GDPR”.
(3) In paragraph 9 (processing data), omit sub-paragraph (2) (deeming the Society to be the controller for the purposes of Directive 95/46/EC).
188AQ (1) The table in Schedule 3 (functions of the Council under Directive 2005/36) is amended as follows.
(2) In the entry for Article 56(2), in the second column, for “Directive 95/46/EC” substitute “the GDPR”.
(3) In the entry for Article 56a(4), in the second column, for “Directive 95/46/EC” substitute “the GDPR”.
188AR In Schedule 4 (interpretation), omit the definition of “Directive 95/46/EC”.
Electronic Commerce (EC Directive) Regulations 2002 (S.I. 2002/2013)
188AS Regulation 3 of the Electronic Commerce (EC Directive) Regulations 2002 (exclusions) is amended as follows.
188AT In paragraph (1)(b), for “the Data Protection Directive and the Telecommunications Data Protection Directive” substitute “the GDPR”.
188AU In paragraph (3)—
(a) omit the definitions of “Data Protection Directive” and “Telecommunications Data Protection Directive”, and
(b) at the appropriate place insert—
““the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation);”.”
This amendment makes consequential amendments to secondary legislation, including to the Scottish Parliamentary Corporate Body (Crown Status) Order 1999 and the Northern Ireland Assembly Commission (Crown Status) Order 1999.
Amendment 217, in schedule 18, page 244, line 1, at end insert—
“(d) for “data controller” substitute “controller”, and
(e) after “in the context of” insert “the activities of”.
Pupils’ Educational Records (Scotland) Regulations 2003 (S.S.I. 2003/581)
191A The Pupils’ Educational Records (Scotland) Regulations 2003 are amended as follows.
191B (1) Regulation 2 (interpretation) is amended as follows.
(2) Omit the definition of “the 1998 Act”.
(3) At the appropriate place insert—
““the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), read with Chapter 2 of Part 2 of the Data Protection Act 2018;”.
191C (1) Regulation 6 (circumstances where information should not be disclosed) is amended as follows.
(2) After “any information” insert “to the extent that any of the following conditions are satisfied”.
(3) For paragraphs (a) to (c) substitute—
“(aa) the pupil to whom the information relates would have no right of access to the information under the GDPR;
(ab) the information is personal data described in Article 9(1) or 10 of the GDPR (special categories of personal data and personal data relating to criminal convictions and offences);”.
(4) In paragraph (d), for “to the extent that its disclosure” substitute “the disclosure of the information”.
(5) In paragraph (e), for “that” substitute “the information”.
191D In regulation 9 (fees), for paragraph (1) substitute—
“(1A) In complying with a request made under regulation 5(2), the responsible body may only charge a fee where Article 12(5) or Article 15(3) of the GDPR would permit the charging of a fee if the request had been made by the pupil to whom the information relates under Article 15 of the GDPR.
(1B) Where paragraph (1A) permits the charging of a fee, the responsible body may not charge a fee that—
(a) exceeds the cost of supply, or
(b) exceeds any limit in regulations made under section 12 of the Data Protection Act 2018 that would apply if the request had been made by the pupil to whom the information relates under Article 15 of the GDPR.”
European Parliamentary Elections (Northern Ireland) Regulations 2004 (S.I. 2004/1267)
191E Schedule 1 to the European Parliamentary Elections (Northern Ireland) Regulations 2004 (European Parliamentary elections rules) is amended as follows.
191F (1) Paragraph 74(1) (interpretation) is amended as follows.
(2) Omit the definitions of “relevant conditions” and “research purposes”.
(3) At the appropriate places insert—
““Article 89 GDPR purposes” means the purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics);”;
““the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation);”.
191G In paragraph 77(2)(b) (conditions on the use, supply and disclosure of documents open to public inspection), for “research purposes” substitute “Article 89 GDPR purposes”.”
This amendment makes consequential amendments to secondary legislation, including to the Nationality, Immigration and Asylum Act 2002 (Juxtaposed Controls) Order 2003. The amendment to that Order is consequential on amendment 183, and also changes the reference in article 11(4) of that Order to a “data controller” to a “controller”.
Amendment 218, in schedule 18, page 244, line 13, leave out from “GDPR”” to “(see” in line 14 and insert “and references to a provision of Chapter 2 of Part 2 of the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act”
This amendment makes clear that in the Environmental Information Regulations 2004 references to a provision of Chapter 2 of Part 2 of the bill include that provision as applied by Chapter 3 of Part 2 of the bill.
Amendment 219, in schedule 18, page 246, line 31, leave out from “GDPR”” to “(see” in line 32 and insert “and references to a provision of Chapter 2 of Part 2 of the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act”
This amendment makes clear that in the Environmental Information (Scotland) Regulations 2004 references to a provision of Chapter 2 of Part 2 of the bill include that provision as applied by Chapter 3 of Part 2 of the bill.
Amendment 220, in schedule 18, page 247, line 40, at end insert—
“Licensing Act 2003 (Personal Licences) Regulations 2005 (S.I. 2005/41)
199A (1) Regulation 7 of the Licensing Act 2003 (Personal Licences) Regulations 2005 (application for grant of a personal licence) is amended as follows.
(2) In paragraph (1)(b)—
(a) for paragraph (iii) (but not the final “, and”) substitute—
“(iii) the results of a request made under Article 15 of the GDPR or section 45 of the Data Protection Act 2018 (rights of access by the data subject) to the National Identification Service for information contained in the Police National Computer”, and
(b) in the words following paragraph (iii), omit “search”.
(3) After paragraph (2) insert—
“(3) In this regulation, “the GDPR” has the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(10), (11) and (14) of that Act).”
Education (Pupil Information) (England) Regulations 2005 (S.I. 2005/1437)
199B The Education (Pupil Information) (England) Regulations 2005 are amended as follows.
199C In regulation 3(5) (meaning of educational record), for “section 1(1) of the Data Protection Act 1998” substitute “section 3(4) of the Data Protection Act 2018”.
199D (1) Regulation 5 (disclosure of curricular and educational records) is amended as follows.
(2) In paragraph (4)—
(a) in sub-paragraph (a), for “the Data Protection Act 1998” substitute “the GDPR”, and
(b) in sub-paragraph (b), for “that Act or by virtue of any order made under section 30(2) or section 38(1) of the Act” substitute “the GDPR”.
(3) After paragraph (6) insert—
“(7) In this regulation, “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), read with Chapter 2 of Part 2 of the Data Protection Act 2018.””
This amendment makes consequential amendments to secondary legislation.
Amendment 221, in schedule 18, page 248, line 37, leave out from “GDPR”” to “(see” in line 38 and insert “and references to a provision of Chapter 2 of Part 2 of the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act”
This amendment makes clear that in regulation 45 of the Civil Contingencies Act 2004 (Contingency Planning) Regulations 2005 references to a provision of Chapter 2 of Part 2 of the bill include that provision as applied by Chapter 3 of Part 2 of the bill.
Amendment 222, in schedule 18, page 249, line 1, at end insert—
“Register of Judgments, Orders and Fines Regulations 2005 (S.I. 2005/3595)
200A In regulation 3 of the Register of Judgments, Orders and Fines Regulations 2005 (interpretation)—
(a) for the definition of “data protection principles” substitute—
““data protection principles” means the principles set out in Article 5(1) of the GDPR;”, and
(b) at the appropriate place insert—
““the GDPR” has the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(10), (11) and (14) of that Act);”.
Civil Contingencies Act 2004 (Contingency Planning) (Scotland) Regulations 2005 (S.S.I. 2005/494)
200B The Civil Contingencies Act 2004 (Contingency Planning) (Scotland) Regulations 2005 are amended as follows.
200C (1) Regulation 39 (sensitive information) is amended as follows.
(2) In paragraph (1)(d)—
(a) omit “, within the meaning of section 1(1) of the Data Protection Act 1998”, and
(b) for “(2) or (3)” substitute “(1A), (1B) or (1C)”.
(3) After paragraph (1) insert—
“(1A) The condition in this paragraph is that the disclosure of the information to a member of the public—
(a) would contravene any of the data protection principles, or
(b) would do so if the exemptions in section 24(1) of the Data Protection Act 2018 (manual unstructured data held by public authorities) were disregarded.
(1B) The condition in this paragraph is that the disclosure of the information to a member of the public would contravene—
(a) Article 21 of the GDPR (general processing: right to object to processing), or
(b) section 99 of the Data Protection Act 2018 (intelligence services processing: right to object to processing).
(1C) The condition in this paragraph is that—
(a) on a request under Article 15(1) of the GDPR (general processing: right of access by the data subject) for access to personal data, the information would be withheld in reliance on provision made by or under section 15, 16 or 26 of, or Schedule 2, 3 or 4 to, the Data Protection Act 2018,
(b) on a request under section 45(1)(b) of that Act (law enforcement processing: right of access by the data subject), the information would be withheld in reliance on subsection (4) of that section, or
(c) on a request under section 94(1)(b) of that Act (intelligence services processing: rights of access by the data subject), the information would be withheld in reliance on a provision of Chapter 6 of Part 4 of that Act.
(1D) In this regulation—
“the data protection principles” means the principles set out in—
(a) Article 5(1) of the GDPR,
(b) section 34(1) of the Data Protection Act 2018, and
(c) section 85(1) of that Act;
“data subject” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act);
“the GDPR” and references to a provision of Chapter 2 of Part 2 of the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act (see section 3(10), (11) and (14) of that Act);
“personal data” has the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(2) and (14) of that Act).
(1E) In determining for the purposes of this regulation whether the lawfulness principle in Article 5(1)(a) of the GDPR would be contravened by the disclosure of information, Article 6(1) of the GDPR (lawfulness) is to be read as if the second sub-paragraph (disapplying the legitimate interests gateway in relation to public authorities) were omitted.”
(4) Omit paragraphs (2) to (4).
National Assembly for Wales (Representation of the People) Order 2007 (S.I. 2007/236)
200D (1) Paragraph 14 of Schedule 1 to the National Assembly for Wales (Representation of the People) Order 2007 (absent voting at Assembly elections: conditions on the use, supply and inspection of absent vote records or lists) is amended as follows.
(2) The existing text becomes sub-paragraph (1).
(3) For paragraph (a) of that sub-paragraph (but not the final “or”) substitute—
“(a) purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics);”.
(4) After that sub-paragraph insert—
“(2) In this paragraph, “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation).”
Mental Capacity Act 2005 (Loss of Capacity during Research Project) (England) Regulations 2007 (S.I. 2007/679)
200E In regulation 3 of the Mental Capacity Act 2005 (Loss of Capacity during Research Project) (England) Regulations 2007 (research which may be carried out despite a participant’s loss of capacity), for paragraph (b) substitute—
“(b) any material used consists of or includes human cells or human DNA,”.
National Assembly for Wales Commission (Crown Status) Order 2007 (S.I. 2007/1118)
200F For article 5 of the National Assembly for Wales Commission (Crown Status) Order 2007 substitute—
“5 Data Protection Act 2018
(1) The Assembly Commission is to be treated as a Crown body for the purposes of the Data Protection Act 2018 to the extent specified in this article.
(2) The Assembly Commission is to be treated as a government department for the purposes of the following provisions—
(a) section 8(d) (lawfulness of processing under the GDPR: public interest etc),
(b) section 202 (application to the Crown),
(c) paragraph 6 of Schedule 1 (statutory etc and government purposes),
(d) paragraph 7 of Schedule 2 (exemptions from the GDPR: functions designed to protect the public etc), and
(e) paragraph 8(1)(o) of Schedule 3 (exemptions from the GDPR: health data).
(3) In the provisions mentioned in paragraph (4)—
(a) references to employment by or under the Crown are to be treated as including employment as a member of staff of the Assembly Commission, and
(b) references to a person in the service of the Crown are to be treated as including a person so employed.
(4) The provisions are—
(a) section 24(3) (exemption for certain data relating to employment under the Crown), and
(b) section 202(6) (application of certain provisions to a person in the service of the Crown).
(5) In this article, references to a provision of Chapter 2 of Part 2 of the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act (see section 3(14) of that Act).”
Mental Capacity Act 2005 (Loss of Capacity during Research Project) (Wales) Regulations 2007 (S.I. 2007/837 (W.72))
200G In regulation 3 of the Mental Capacity Act 2005 (Loss of Capacity during Research Project) (Wales) Regulations 2007 (research which may be carried out despite a participant’s loss of capacity)—
(a) in the English language text, for paragraph (c) substitute—
“(c) any material used consists of or includes human cells or human DNA; and”, and
(b) in the Welsh language text, for paragraph (c) substitute—
“(c) os yw unrhyw ddeunydd a ddefnyddir yn gelloedd dynol neu’n DNA dynol neu yn eu cynnwys; ac”.
Representation of the People (Absent Voting at Local Elections) (Scotland) Regulations 2007 (S.S.I. 2007/170)
200H (1) Regulation 18 of the Representation of the People (Absent Voting at Local Elections) (Scotland) Regulations 2007 (conditions on the supply and inspection of absent voter records or lists) is amended as follows.
(2) In paragraph (1), for sub-paragraph (a) (but not the final “or”) substitute—
“(a) purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics);”.
(3) After paragraph (1) insert—
“(2) In this regulation, “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation).”
Representation of the People (Post-Local Government Elections Supply and Inspection of Documents) (Scotland) Regulations 2007 (S.S.I. 2007/264)
200I In regulation 5 of the Representation of the People (Post-Local Government Elections Supply and Inspection of Documents) (Scotland) Regulations 2007 (conditions on the use, supply and disclosure of documents open to public inspection)—
(a) in paragraph (2), for sub-paragraph (i) (but not the final “or”) substitute—
“(i) purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics);”, and
(b) after paragraph (3) insert—
“(4) In this regulation, “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation).”
Education (Pupil Records and Reporting) (Transitional) Regulations (Northern Ireland) 2007 (S.R. (N.I.) 2007 No. 43)
200J The Education (Pupil Records and Reporting) (Transitional) Regulations (Northern Ireland) 2007 are amended as follows.
200K In regulation 2 (interpretation), at the appropriate place insert—
““the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), read with Chapter 2 of Part 2 of the Data Protection Act 2018;”.
200L In regulation 10(2) (duties of Boards of Governors), for “documents which are the subject of an order under section 30(2) of the Data Protection Act 1998” substitute “information to which the pupil to whom the information relates would have no right of access under the GDPR”.
Representation of the People (Northern Ireland) Regulations 2008 (S.I. 2008/1741)
200M In regulation 118 of the Representation of the People (Northern Ireland) Regulations 2008 (conditions on the use, supply and disclosure of documents open to public inspection)—
(a) in paragraph (2), for “research purposes within the meaning of that term in section 33 of the Data Protection Act 1998” substitute “purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics)”, and
(b) after paragraph (3) insert—
“(4) In this regulation, “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation).”
Companies Act 2006 (Extension of Takeover Panel Provisions) (Isle of Man) Order 2008 (S.I. 2008/3122)
200N In paragraph 1(c) of the Schedule to the Companies Act 2006 (Extension of Takeover Panel Provisions) (Isle of Man) Order 2008 (modifications with which Chapter 1 of Part 28 of the Companies Act 2006 extends to the Isle of Man), for “the Data Protection Act 1998 (c. 29)” substitute “the data protection legislation”.
Controlled Drugs (Supervision of Management and Use) (Wales) Regulations 2008 (S.I. 2008/3239 (W.286))
200O The Controlled Drugs (Supervision of Management and Use) (Wales) Regulations 2008 are amended as follows.
200P In regulation 2(1) (interpretation)—
(a) at the appropriate place in the English language text insert—
““the GDPR” (“y GDPR”) and references to Schedule 2 to the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act (see section 3(10), (11) and (14) of that Act);”, and
(b) at the appropriate place in the Welsh language text insert—
“mae i “y GDPR” a chyfeiriadau at Atodlen 2 i Ddeddf Diogelu Data 2018 yr un ystyr ag a roddir i “the GDPR” a chyfeiriadau at yr Atodlen honno yn Rhannau 5 i 7 o’r Ddeddf honno (gweler adran 3(10), (11) a (14) o’r Ddeddf honno);”.
200Q (1) Regulation 25 (duty to co-operate by disclosing information as regards relevant persons) is amended as follows.
(2) In paragraph (7)—
(a) in the English language text, at the end insert “or the GDPR”, and
(b) in the Welsh language text, at the end insert “neu’r GDPR”.
(3) For paragraph (8)—
(a) in the English language text substitute—
“(8) In determining for the purposes of paragraph (7) whether disclosure is prohibited, it is to be assumed for the purposes of paragraph 5(2) of Schedule 2 to the Data Protection Act 2018 and paragraph 3(2) of Schedule 11 to that Act (exemptions from certain provisions of the data protection legislation: disclosures required by law) that the disclosure is required by this regulation.”, and
(b) in the Welsh language text substitute—
“(8) Wrth benderfynu at ddibenion paragraff (7) a yw datgeliad wedi’i wahardd, mae i’w dybied at ddibenion paragraff 5(2) o Atodlen 2 i Ddeddf Diogelu Data 2018 a pharagraff 3(2) o Atodlen 11 i’r Ddeddf honno (esemptiadau rhag darpariaethau penodol o’r ddeddfwriaeth diogelu data: datgeliadau sy’n ofynnol gan y gyfraith) bod y datgeliad yn ofynnol gan y rheoliad hwn.”
200R (1) Regulation 26 (responsible bodies requesting additional information be disclosed about relevant persons) is amended as follows.
(2) In paragraph (6)—
(a) in the English language text, at the end insert “or the GDPR”, and
(b) in the Welsh language text, at the end insert “neu’r GDPR”.
(3) For paragraph (7)—
(a) in the English language text substitute—
“(7) In determining for the purposes of paragraph (6) whether disclosure is prohibited, it is to be assumed for the purposes of paragraph 5(2) of Schedule 2 to the Data Protection Act 2018 and paragraph 3(2) of Schedule 11 to that Act (exemptions from certain provisions of the data protection legislation: disclosures required by law) that the disclosure is required by this regulation.”, and
(b) in the Welsh language text substitute—
“(7) Wrth benderfynu at ddibenion paragraff (6) a yw datgeliad wedi’i wahardd, mae i’w dybied at ddibenion paragraff 5(2) o Atodlen 2 i Ddeddf Diogelu Data 2018 a pharagraff 3(2) o Atodlen 11 i’r Ddeddf honno (esemptiadau rhag darpariaethau penodol o’r ddeddfwriaeth diogelu data: datgeliadau sy’n ofynnol gan y gyfraith) bod y datgeliad yn ofynnol gan y rheoliad hwn.”
200S (1) Regulation 29 (occurrence reports) is amended as follows.
(2) In paragraph (3)—
(a) in the English language text, at the end insert “or the GDPR”, and
(b) in the Welsh language text, at the end insert “neu’r GDPR”.
(3) For paragraph (4)—
(a) in the English language text substitute—
“(4) In determining for the purposes of paragraph (3) whether disclosure is prohibited, it is to be assumed for the purposes of paragraph 5(2) of Schedule 2 to the Data Protection Act 2018 and paragraph 3(2) of Schedule 11 to that Act (exemptions from certain provisions of the data protection legislation: disclosures required by law) that the disclosure is required by this regulation.”, and
(b) in the Welsh language text substitute—
“(4) Wrth benderfynu at ddibenion paragraff (3) a yw datgeliad wedi’i wahardd, mae i’w dybied at ddibenion paragraff 5(2) o Atodlen 2 i Ddeddf Diogelu Data 2018 a pharagraff 3(2) o Atodlen 11 i’r Ddeddf honno (esemptiadau rhag darpariaethau penodol o’r ddeddfwriaeth diogelu data: datgeliadau sy’n ofynnol gan y gyfraith) bod y datgeliad yn ofynnol gan y rheoliad hwn.”
Energy Order 2003 (Supply of Information) Regulations (Northern Ireland) 2008 (S.R. (N.I.) 2008 No. 3)
200T (1) Regulation 5 of the Energy Order 2003 (Supply of Information) Regulations (Northern Ireland) 2008 (information whose disclosure would be affected by the application of other legislation) is amended as follows.
(2) In paragraph (3)—
(a) omit “within the meaning of section 1(1) of the Data Protection Act 1998”, and
(b) for the words from “where” to the end substitute “if the condition in paragraph (3A) or (3B) is satisfied”.
(3) After paragraph (3) insert—
“(3A) The condition in this paragraph is that the disclosure of the information to a member of the public—
(a) would contravene any of the data protection principles, or
(b) would do so if the exemptions in section 24(1) of the Data Protection Act 2018 (manual unstructured data held by public authorities) were disregarded.
(3B) The condition in this paragraph is that the disclosure of the information to a member of the public would contravene—
(a) Article 21 of the GDPR (general processing: right to object to processing), or
(b) section 99 of the Data Protection Act 2018 (intelligence services processing: right to object to processing).”
(4) After paragraph (4) insert—
“(5) In this regulation—
“the data protection principles” means the principles set out in—
(a) Article 5(1) of the GDPR,
(b) section 34(1) of the Data Protection Act 2018, and
(c) section 85(1) of that Act;
“the GDPR” has the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(10), (11) and (14) of that Act);
“personal data” has the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(2) and (14) of that Act).”
Companies (Disclosure of Address) Regulations 2009 (S.I. 2009/214)
200U (1) Paragraph 6 of Schedule 2 to the Companies (Disclosure of Address) Regulations 2009 (conditions for permitted disclosure to a credit reference agency) is amended as follows.
(2) The existing text becomes sub-paragraph (1).
(3) In paragraph (b) of that sub-paragraph, for sub-paragraph (ii) substitute—
“(ii) for the purposes of ensuring that it complies with its data protection obligations;”.
(4) In paragraph (c) of that sub-paragraph—
(a) omit “or” at the end of sub-paragraph (i), and
(b) at the end insert “; or
(iii) section 145 of the Data Protection Act 2018 (false statements made in response to an information notice);”.
(5) After paragraph (c) of that sub-paragraph insert—
“(d) has not been given a penalty notice under section 154 of the Data Protection Act 2018 in circumstances described in paragraph (c)(ii), other than a penalty notice that has been cancelled.”
(6) After sub-paragraph (1) insert—
“(2) In this paragraph, “data protection obligations”, in relation to a credit reference agency, means—
(a) where the agency carries on business in the United Kingdom, obligations under the data protection legislation (as defined in section 3 of the Data Protection Act 2018);
(b) where the agency carries on business in an EEA State other than the United Kingdom, obligations under—
(i) the GDPR (as defined in section 3(10) of the Data Protection Act 2018),
(ii) legislation made in exercise of powers conferred on member States under the GDPR (as so defined), and
(iii) legislation implementing the Law Enforcement Directive (as defined in section 3(12) of the Data Protection Act 2018).”
Overseas Companies Regulations 2009 (S.I. 2009/1801)
200V (1) Paragraph 6 of Schedule 2 to the Overseas Companies Regulations 2009 (conditions for permitted disclosure to a credit reference agency) is amended as follows.
(2) The existing text becomes sub-paragraph (1).
(3) In paragraph (b) of that sub-paragraph, for sub-paragraph (ii) substitute—
“(ii) for the purposes of ensuring that it complies with its data protection obligations;”.
(4) In paragraph (c) of that sub-paragraph—
(a) omit “or” at the end of sub-paragraph (i), and
(b) at the end insert “; or
(iii) section 145 of the Data Protection Act 2018 (false statements made in response to an information notice);”.
(5) After paragraph (c) of that sub-paragraph insert—
“(d) has not been given a penalty notice under section 154 of the Data Protection Act 2018 in circumstances described in paragraph (c)(ii), other than a penalty notice that has been cancelled.”
(6) After sub-paragraph (1) insert—
“(2) In this paragraph, “data protection obligations”, in relation to a credit reference agency, means—
(a) where the agency carries on business in the United Kingdom, obligations under the data protection legislation (as defined in section 3 of the Data Protection Act 2018);
(b) where the agency carries on business in an EEA State other than the United Kingdom, obligations under—
(i) the GDPR (as defined in section 3(10) of the Data Protection Act 2018),
(ii) legislation made in exercise of powers conferred on member States under the GDPR (as so defined), and
(iii) legislation implementing the Law Enforcement Directive (as defined in section 3(12) of the Data Protection Act 2018).”
Provision of Services Regulations 2009 (S.I. 2009/2999)
200W In regulation 25 of the Provision of Services Regulations 2009 (derogations from the freedom to provide services), for paragraph (d) substitute—
“(d) matters covered by Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation);”.”
This amendment makes consequential amendments to secondary legislation including to the National Assembly for Wales Commission (Crown Status) Order 2007.
Amendment 223, in schedule 18, page 249, line 32, at end insert—
“INSPIRE (Scotland) Regulations 2009 (S.S.I. 2009/440)
201A (1) Regulation 10 of the INSPIRE (Scotland) Regulations 2009 (public access to spatial data sets and spatial data services) is amended as follows.
(2) In paragraph (2)—
(a) omit “or” at the end of sub-paragraph (a),
(b) for sub-paragraph (b) substitute—
“(b) Article 21 of the GDPR (general processing: right to object to processing), or
(c) section 99 of the Data Protection Act 2018 (intelligence services processing: right to object to processing).”, and
(c) omit the words following sub-paragraph (b).
(3) After paragraph (6) insert—
“(7) In this regulation—
“the data protection principles” means the principles set out in—
(a) Article 5(1) of the GDPR,
(b) section 34(1) of the Data Protection Act 2018, and
(c) section 85(1) of that Act;
“the GDPR” has the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(10), (11) and (14) of that Act);
“personal data” has the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(2) and (14) of that Act).
(8) In determining for the purposes of this regulation whether the lawfulness principle in Article 5(1)(a) of the GDPR would be contravened by the disclosure of information, Article 6(1) of the GDPR (lawfulness) is to be read as if the second sub-paragraph (disapplying the legitimate interests gateway in relation to public authorities) were omitted.”
Controlled Drugs (Supervision of Management and Use) Regulations (Northern Ireland) 2009 (S.R. (N.I.) 2009 No. 225)
201B The Controlled Drugs (Supervision of Management and Use) Regulations (Northern Ireland) 2009 are amended as follows.
201C In regulation 2(2) (interpretation), at the appropriate place insert—
““the GDPR” and references to Schedule 2 to the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act (see section 3(10), (11) and (14) of that Act);”.
201D (1) Regulation 25 (duty to co-operate by disclosing information as regards relevant persons) is amended as follows.
(2) In paragraph (7), at the end insert “or the GDPR”.
(3) For paragraph (8) substitute—
“(8) In determining for the purposes of paragraph (7) whether disclosure is prohibited, it is to be assumed for the purposes of paragraph 5(2) of Schedule 2 to the Data Protection Act 2018 and paragraph 3(2) of Schedule 11 to that Act (exemptions from certain provisions of the data protection legislation: disclosures required by law) that the disclosure is required by this regulation.”
201E (1) Regulation 26 (responsible bodies requesting additional information be disclosed about relevant persons) is amended as follows.
(2) In paragraph (6), at the end insert “or the GDPR”.
(3) For paragraph (7) substitute—
“(7) In determining for the purposes of paragraph (6) whether disclosure is prohibited, it is to be assumed for the purposes of paragraph 5(2) of Schedule 2 to the Data Protection Act 2018 and paragraph 3(2) of Schedule 11 to that Act (exemptions from certain provisions of the data protection legislation: disclosures required by law) that the disclosure is required by this regulation.”
201F (1) Regulation 29 (occurrence reports) is amended as follows.
(2) In paragraph (3), at the end insert “or the GDPR”.
(3) For paragraph (4) substitute—
“(4) In determining for the purposes of paragraph (3) whether disclosure is prohibited, it is to be assumed for the purposes of paragraph 5(2) of Schedule 2 to the Data Protection Act 2018 and paragraph 3(2) of Schedule 11 to that Act (exemptions from certain provisions of the data protection legislation: disclosures required by law) that the disclosure is required by this regulation.”
Pharmacy Order 2010 (S.I. 2010/231)
201G The Pharmacy Order 2010 is amended as follows.
201H In article 3(1) (interpretation), omit the definition of “Directive 95/46/EC”.
201I (1) Article 9 (inspection and enforcement) is amended as follows.
(2) For paragraph (4) substitute—
“(4) If a report that the Council proposes to publish pursuant to paragraph (3) includes personal data, it is to be assumed for the purposes of paragraph 5(2) of Schedule 2 to the Data Protection Act 2018 and paragraph 3(2) of Schedule 11 to that Act (exemptions from certain provisions of the data protection legislation: disclosures required by law) that the disclosure of the personal data is required by paragraph (3) of this article.”
(3) After paragraph (4) insert—
“(5) In this article, “personal data” and references to Schedule 2 to the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act (see section 3(2) and (14) of that Act).”
201J In article 33A (European professional card), after paragraph (2) insert—
“(3) In Schedule 2A, “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), read with Chapter 2 of Part 2 of the Data Protection Act 2018.”
201K (1) Article 49 (disclosure of information: general) is amended as follows.
(2) In paragraph (2)(a), after “enactment” insert “or the GDPR”.
(3) For paragraph (3) substitute—
“(3) In determining for the purposes of paragraph (2)(a) whether a disclosure is prohibited, it is to be assumed for the purposes of paragraph 5(2) of Schedule 2 to the Data Protection Act 2018 and paragraph 3(2) of Schedule 11 to that Act (exemptions from certain provisions of the data protection legislation: disclosures required by law) that the disclosure is required by paragraph (1) of this article.”
(4) After paragraph (5) insert—
“(6) In this article, “the GDPR” and references to Schedule 2 to the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act (see section 3(10), (11) and (14) of that Act).”
201L (1) Article 55 (professional performance assessments) is amended as follows.
(2) In paragraph (5)(a), after “enactment” insert “or the GDPR”.
(3) For paragraph (6) substitute—
“(6) In determining for the purposes of paragraph (5)(a) whether a disclosure is prohibited, it is to be assumed for the purposes of paragraph 5(2) of Schedule 2 to the Data Protection Act 2018 and paragraph 3(2) of Schedule 11 (exemptions from certain provisions of the data protection legislation: disclosures required by law) that the disclosure is required by paragraph (4) of this article.”
(4) After paragraph (8) insert—
“(9) In this article, “the GDPR” and references to Schedule 2 to the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act (see section 3(10), (11) and (14) of that Act).”
201M In article 67(6) (Directive 2005/36/EC: designation of competent authority etc.), after sub-paragraph (a) insert—
“(aa) “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), read with Chapter 2 of Part 2 of the Data Protection Act 2018;”.
201N (1) Schedule 2A (Directive 2005/36/EC: European professional card) is amended as follows.
(2) In paragraph 8(1) (access to data), for “Directive 95/46/EC)” substitute “the GDPR”.
(3) In paragraph 9 (processing data)—
(a) omit sub-paragraph (2) (deeming the Council to be the controller for the purposes of Directive 95/46/EC), and
(b) after sub-paragraph (2) insert—
“(3) In this paragraph, “personal data” has the same meaning as in the Data Protection Act 2018 (see section 3(2) of that Act).”
201O (1) The table in Schedule 3 (Directive 2005/36/EC: designation of competent authority etc.) is amended as follows.
(2) In the entry for Article 56(2), in the second column, for “Directive 95/46/EC” substitute “the GDPR”.
(3) In the entry for Article 56a(4), in the second column, for “Directive 95/46/EC” substitute “the GDPR”.
National Employment Savings Trust Order 2010 (S.I. 2010/917)
201P The National Employment Savings Trust Order 2010 is amended as follows.
201Q In article 2 (interpretation)—
(a) omit the definition of “data” and “personal data”, and
(b) at the appropriate place insert—
““personal data” has the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(2) and (14) of that Act).”
201R (1) Article 10 (disclosure of requested data to the Secretary of State) is amended as follows.
(2) In paragraph (1)—
(a) for “disclosure of data” substitute “disclosure of information”, and
(b) for “requested data” substitute “requested information”.
(3) In paragraph (2)—
(a) for “requested data” substitute “requested information”,
(b) for “those data are” substitute “the information is”, and
(c) for “receive those data” substitute “receive that information”.
(4) In paragraph (3), for “requested data” substitute “requested information”.
(5) In paragraph (4), for “requested data” substitute “requested information”.
Local Elections (Northern Ireland) Order 2010 (S.I. 2010/2977)
201S (1) Schedule 3 to the Local Elections (Northern Ireland) Order 2010 (access to marked registers and other documents open to public inspection after an election) is amended as follows.
(2) In paragraph 1(1) (interpretation and general)—
(a) omit the definition of “research purposes”, and
(b) at the appropriate places insert—
““Article 89 GDPR purposes” means the purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics);”;
““the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation);”.
(3) In paragraph 5(3) (restrictions on the use, supply and disclosure of documents open to public inspection), for “research purposes” substitute “Article 89 GDPR purposes”.
Pupil Information (Wales) Regulations 2011 (S.I. 2011/1942 (W.209))
201T (1) Regulation 5 of the Pupil Information (Wales) Regulations 2011 (duties of head teacher - educational records) is amended as follows.
(2) In paragraph (5)—
(a) in the English language text, for “documents which are subject to any order under section 30(2) of the Data Protection Act 1998” substitute “information—
(a) which the head teacher could not lawfully disclose to the pupil under the GDPR, or
(b) to which the pupil would have no right of access under the GDPR.”, and
(b) in the Welsh language text, for “ddogfennau sy’n ddarostyngedig i unrhyw orchymyn o dan adran 30(2) o Ddeddf Diogelu Data 1998” substitute “wybodaeth—
(a) na allai’r pennaeth ei datgelu’n gyfreithlon i’r disgybl o dan y GDPR, neu
(b) na fyddai gan y disgybl hawl mynediad ati o dan y GDPR.”
(3) After paragraph (5)—
(a) in the English language text insert—
“(6) In this regulation, “the GDPR” (“y GDPR”) means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), read with Chapter 2 of Part 2 of the Data Protection Act 2018.”, and
(b) in the Welsh language text insert—
“(6) Yn y rheoliad hwn, ystyr “y GDPR” (“the GDPR”) yw Rheoliad (EU) 2016/679 Senedd Ewrop a’r Cyngor dyddiedig 27 Ebrill 2016 ar ddiogelu personau naturiol o ran prosesu data personol a rhyddid symud data o’r fath (y Rheoliad Diogelu Data Cyffredinol), fel y’i darllenir ynghyd â Phennod 2 o Ran 2 o Ddeddf Diogelu Data 2018.”
Debt Arrangement Scheme (Scotland) Regulations 2011 (S.S.I. 2011/141)
201U In Schedule 4 to the Debt Arrangement Scheme (Scotland) Regulations 2011 (payments distributors), omit paragraph 2.
Police and Crime Commissioner Elections Order 2012 (S.I. 2012/1917)
201V The Police and Crime Commissioner Elections Order 2012 is amended as follows.
201W (1) Schedule 2 (absent voting in Police and Crime Commissioner elections) is amended as follows.
(2) In paragraph 20 (absent voter lists: supply of copies etc)—
(a) in sub-paragraph (8), for paragraph (a) (but not the final “or”) substitute—
“(a) purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics);”, and
(b) after sub-paragraph (10) insert—
“(11) In this paragraph, “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation).”
(3) In paragraph 24 (restriction on use of absent voter records or lists or the information contained in them)—
(a) in sub-paragraph (3), for paragraph (a) (but not the final “or”) substitute—
“(a) purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics),”, and
(b) after that sub-paragraph insert—
“(4) In this paragraph, “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation).”
201X (1) Schedule 10 (access to marked registers and other documents open to public inspection after an election) is amended as follows.
(2) In paragraph 1(2) (interpretation), omit paragraphs (c) and (d) (but not the final “and”).
(3) In paragraph 5 (restriction on use of documents or of information contained in them)—
(a) in sub-paragraph (3), for paragraph (a) (but not the final “or”) substitute—
“(a) purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics),”, and
(b) after sub-paragraph (4) insert—
“(5) In this paragraph, “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation).”
Neighbourhood Planning (Referendums) Regulations 2012 (S.I. 2012/2031)
201Y Schedule 6 to the Neighbourhood Planning (Referendums) Regulations 2012 (registering to vote in a business referendum) is amended as follows.
201Z (1) Paragraph 29(1) (interpretation of Part 8) is amended as follows.
(2) At the appropriate places insert—
““Article 89 GDPR purposes” means the purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics);”;
““the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation);”.
(3) For the definition of “relevant conditions” substitute—
““relevant requirement” means the requirement under Article 89 of the GDPR, read with section 19 of the Data Protection Act 2018, that personal data processed for Article 89 GDPR purposes must be subject to appropriate safeguards;”.
(4) Omit the definition of “research purposes”.
201AA In paragraph 32(3)(b)(i), for “section 11(3) of the Data Protection Act 1998” substitute “section 123(5) of the Data Protection Act 2018”.
201AB In paragraph 33(6) and (7) (supply of copy of business voting register to the British Library and restrictions on use), for “research purposes in compliance with the relevant conditions” substitute “Article 89 GDPR purposes in accordance with the relevant requirement”.
201AC In paragraph 34(6) and (7) (supply of copy of business voting register to the Office of National Statistics and restrictions on use), for “research purposes in compliance with the relevant conditions” substitute “Article 89 GDPR purposes in accordance with the relevant requirement”.
201AD In paragraph 39(8) and (9) (supply of copy of business voting register to public libraries and local authority archives services and restrictions on use), for “research purposes in compliance with the relevant conditions” substitute “Article 89 GDPR purposes in accordance with the relevant requirement”.
201AE In paragraph 45(2) (conditions on the use, supply and disclosure of documents open to public inspection), for paragraph (a) (but not the final “or”) substitute—
“(a) Article 89 GDPR purposes (as defined in paragraph 29),”.
Controlled Drugs (Supervision of Management and Use) Regulations 2013 (S.I. 2013/373)
201AF (1) Regulation 20 of the Controlled Drugs (Supervision of Management and Use) Regulations 2013 (information management) is amended as follows.
(2) For paragraph (4) substitute—
“(4) Where a CDAO, a responsible body or someone acting on their behalf is permitted to share information which includes personal data by virtue of a function under these Regulations, it is to be assumed for the purposes of paragraph 5(2) of Schedule 2 to the Data Protection Act 2018 and paragraph 3(2) of Schedule 11 to that Act (exemptions from certain provisions of the data protection legislation: disclosures required by law) that the disclosure is required by this regulation.”
(3) In paragraph (5), after “enactment” insert “or the GDPR”.
(4) After paragraph (6) insert—
“(7) In this regulation, “the GDPR”, “personal data” and references to Schedule 2 to the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act (see section 3(2), (10), (11) and (14) of that Act).”
Communications Act 2003 (Disclosure of Information) Order 2014 (S.I. 2014/1825)
201AG (1) Article 3 of the Communications Act 2003 (Disclosure of Information) Order 2014 (specification of relevant functions) is amended as follows.
(2) The existing text becomes paragraph (1).
(3) In that paragraph, in sub-paragraph (a), for “the Data Protection Act 1998” substitute “the data protection legislation”.
(4) After that paragraph insert—
“(2) In this article, “the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act).””
This amendment makes consequential amendments to secondary legislation.
Amendment 224, in schedule 18, page 250, line 7, at end insert—
“Companies (Disclosure of Date of Birth Information) Regulations 2015 (S.I. 2015/1694)
204A (1) Paragraph 6 of Schedule 2 to the Companies (Disclosure of Date of Birth Information) Regulations 2015 (conditions for permitted disclosure to a credit reference agency) is amended as follows.
(2) The existing text becomes sub-paragraph (1).
(3) In paragraph (b) of that sub-paragraph, for sub-paragraph (ii) substitute—
“(ii) for the purposes of ensuring that it complies with its data protection obligations;”.
(4) In paragraph (c) of that sub-paragraph—
(a) omit “or” at the end of sub-paragraph (i), and
(b) at the end insert “; or
(iii) section 145 of the Data Protection Act 2018 (false statements made in response to an information notice);”.
(5) After paragraph (c) of that sub-paragraph insert—
“(d) has not been given a penalty notice under section 154 of the Data Protection Act 2018 in circumstances described in paragraph (c)(ii), other than a penalty notice that has been cancelled.”
(6) After sub-paragraph (1) insert—
“(2) In this paragraph, “data protection obligations”, in relation to a credit reference agency, means—
(a) where the agency carries on business in the United Kingdom, obligations under the data protection legislation (as defined in section 3 of the Data Protection Act 2018);
(b) where the agency carries on business in an EEA State other than the United Kingdom, obligations under—
(i) the GDPR (as defined in section 3(10) of the Data Protection Act 2018),
(ii) legislation made in exercise of powers conferred on member States under the GDPR (as so defined), and
(iii) legislation implementing the Law Enforcement Directive (as defined in section 3(12) of the Data Protection Act 2018).”
Small and Medium Sized Business (Credit Information) Regulations 2015 (S.I. 2015/1945)
204B The Small and Medium Sized Business (Credit Information) Regulations 2015 are amended as follows.
204C (1) Regulation 12 (criteria for the designation of a credit reference agency) is amended as follows.
(2) In paragraph (1)(b), for “the Data Protection Act 1998” substitute “the data protection legislation”.
(3) After paragraph (2) insert—
“(3) In this regulation, “the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act).”
204D (1) Regulation 15 (access to and correction of information for individuals and small firms) is amended as follows.
(2) For paragraph (1) substitute—
“(1) Section 13 of the Data Protection Act 2018 (rights of the data subject under the GDPR: obligations of credit reference agencies) applies in respect of a designated credit reference agency which is not a credit reference agency within the meaning of section 145(8) of the Consumer Credit Act 1974 as if it were such an agency.”
(3) After paragraph (3) insert—
“(4) In this regulation, the reference to section 13 of the Data Protection Act 2018 has the same meaning as in Parts 5 to 7 of that Act (see section 3(14) of that Act).”
European Union (Recognition of Professional Qualifications) Regulations 2015 (S.I. 2015/2059)
204E The European Union (Recognition of Professional Qualifications) Regulations 2015 are amended as follows.
204F (1) Regulation 2(1) (interpretation) is amended as follows.
(2) Omit the definition of “Directive 95/46/EC”.
(3) At the appropriate place insert—
““the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), read with Chapter 2 of Part 2 of the Data Protection Act 2018;”.
204G In regulation 5(5) (functions of competent authorities in the United Kingdom) for “Directives 95/46/EC” substitute “the GDPR and Directive”.
204H In regulation 45(3) (processing and access to data regarding the European Professional Card), for “Directive 95/46/EC” substitute “the GDPR”.
204I In regulation 46(1) (processing and access to data regarding the European Professional Card), for “Directive 95/46/EC” substitute “the GDPR”.
204J In regulation 48 (processing and access to data regarding the European Professional Card), omit paragraph (2) (deeming the relevant designated competent authorities to be controllers for the purposes of Directive 95/46/EC).
204K In regulation 66(3) (exchange of information), for “Directives 95/46/EC” substitute “the GDPR and Directive”.
Scottish Parliament (Elections etc) Order 2015 (S.S.I. 2015/425)
204L The Scottish Parliament (Elections etc) Order 2015 is amended as follows.
204M (1) Schedule 3 (absent voting) is amended as follows.
(2) In paragraph 16 (absent voting lists: supply of copies etc)—
(a) in sub-paragraph (4), for paragraph (a) (but not the final “or”) substitute—
“(a) purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics);”, and
(b) after sub-paragraph (10) insert—
“(11) In this paragraph, “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation).”
(3) In paragraph 20 (restriction on use of absent voting lists)—
(a) in sub-paragraph (3), for paragraph (a) (but not the final “or”) substitute—
“(a) purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics);”, and
(b) after that sub-paragraph insert—
“(4) In this paragraph, “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation).”
204N (1) Schedule 8 (access to marked registers and other documents open to public inspection after an election) is amended as follows.
(2) In paragraph 1(2) (interpretation), omit paragraphs (c) and (d) (but not the final “and”).
(3) In paragraph 5 (restriction on use of documents or of information contained in them)—
(a) in sub-paragraph (3), for paragraph (a) (but not the final “or”) substitute—
“(a) purposes mentioned in Article 89(1) of the GDPR (archiving in the public interest, scientific or historical research and statistics);”, and
(b) after sub-paragraph (4) insert—
“(5) In this paragraph, “the GDPR” means Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation).”
Recall of MPs Act 2015 (Recall Petition) Regulations 2016 (S.I. 2016/295)
204O In paragraph 1(3) of Schedule 3 to the Recall of MPs Act 2015 (Recall Petition) Regulations 2016 (access to marked registers after a petition), omit the definition of “relevant conditions”.
Register of People with Significant Control Regulations 2016 (S.I. 2016/339)
204P Schedule 4 to the Register of People with Significant Control Regulations 2016 (conditions for permitted disclosure) is amended as follows.
204Q (1) Paragraph 6 (disclosure to a credit reference agency) is amended as follows.
(2) In sub-paragraph (b), for paragraph (ii) (together with the final “; and”) substitute—
“(ii) for the purposes of ensuring that it complies with its data protection obligations;”.
(3) In sub-paragraph (c)—
(a) omit “or” at the end of paragraph (ii), and
(b) at the end insert “; or
(iv) section 145 of the Data Protection Act 2018 (false statements made in response to an information notice); and”.
(4) After sub-paragraph (c) insert—
“(d) has not been given a penalty notice under section 154 of the Data Protection Act 2018 in circumstances described in sub-paragraph (c)(iii), other than a penalty notice that has been cancelled.”
204R In paragraph 12A (disclosure to a credit institution or a financial institution), for sub-paragraph (b) substitute—
“(b) for the purposes of ensuring that it complies with its data protection obligations.”
204S (1) In Part 3 (interpretation), after paragraph 13 insert—
“14 In this Schedule, “data protection obligations”, in relation to a credit reference agency, a credit institution or a financial institution, means—
(a) where the agency or institution carries on business in the United Kingdom, obligations under the data protection legislation (as defined in section 3 of the Data Protection Act 2018);
(b) where the agency or institution carries on business in an EEA State other than the United Kingdom, obligations under—
(i) the GDPR (as defined in section 3(10) of the Data Protection Act 2018),
(ii) legislation made in exercise of powers conferred on member States under the GDPR (as so defined), and
(iii) legislation implementing the Law Enforcement Directive (as defined in section 3(12) of the Data Protection Act 2018).”
Electronic Identification and Trust Services for Electronic Transactions Regulations 2016 (S.I. 2016/696)
204T The Electronic Identification and Trust Services for Electronic Transactions Regulations 2016 are amended as follows.
204U In regulation 2(1) (interpretation), omit the definition of “the 1998 Act”.
204V In regulation 3(3) (supervision), omit “under the 1998 Act”.
204W For Schedule 2 substitute—
“SCHEDULE 2
Information Commissioner’s enforcement powers
Provisions applied for enforcement purposes
1 For the purposes of enforcing these Regulations and the eIDAS Regulation, the following provisions of Parts 5 to 7 of the Data Protection Act 2018 apply with the modifications set out in paragraphs 2 to 24—
(a) section 140 (publication by the Commissioner);
(b) section 141 (notices from the Commissioner);
(c) section 143 (information notices);
(d) section 144 (information notices: restrictions);
(e) section 145 (false statements made in response to an information notice);
(f) section 146 (assessment notices);
(g) section 147 (assessment notices: restrictions);
(h) section 148 (enforcement notices);
(i) section 149 (enforcement notices: supplementary);
(j) section 151 (enforcement notices: restrictions);
(k) section 152 (enforcement notices: cancellation and variation);
(l) section 153 and Schedule 15 (powers of entry and inspection);
(m) section 154 and Schedule 16 (penalty notices);
(n) section 155(4)(a) (penalty notices: restrictions);
(o) section 156 (maximum amount of penalty);
(p) section 158 (amount of penalties: supplementary);
(q) section 159 (guidance about regulatory action);
(r) section 160 (approval of first guidance about regulatory action);
(s) section 161 (rights of appeal);
(t) section 162 (determination of appeals);
(u) section 179(1), (2), (5), (7) and (12) (regulations and consultation);
(v) section 189 (penalties for offences);
(w) section 190 (prosecution);
(x) section 195 (proceedings in the First-tier Tribunal: contempt);
(y) section 196 (Tribunal Procedure Rules).
General modification of references to the Data Protection Act 2018
2 The provisions listed in paragraph 1 have effect as if—
(a) references to the Data Protection Act 2018 were references to the provisions of that Act as applied by these Regulations;
(b) references to a particular provision of that Act were references to that provision as applied by these Regulations.
Modification of section 143 (information notices)
3 (1) Section 143 has effect as if subsections (9) and (10) were omitted.
(2) In that section, subsection (1) has effect as if—
(a) in paragraph (a)—
(i) for “controller or processor” there were substituted “trust service provider”;
(ii) for “the data protection legislation” there were substituted “the eIDAS Regulation and the EITSET Regulations”;
(b) paragraph (b) were omitted.
Modification of section 144 (information notices: restrictions)
4 (1) Section 144 has effect as if subsections (1) and (9) were omitted.
(2) In that section—
(a) subsections (3)(b) and (4)(b) have effect as if for “the data protection legislation” there were substituted “the eIDAS Regulation or the EITSET Regulations”;
(b) subsection (7)(a) has effect as if for “this Act” there were substituted “section 145 or paragraph 15 of Schedule 15”;
(c) subsection (8) has effect as if for “this Act (other than an offence under section 145)” there were substituted “paragraph 15 of Schedule 15”.
Modification of section 146 (assessment notices)
5 (1) Section 146 has effect as if subsection (10) were omitted.
(2) In that section—
(a) subsection (1) has effect as if—
(i) for “controller or processor” (in both places) there were substituted “trust service provider”;
(ii) for “the data protection legislation” there were substituted “the eIDAS requirements”;
(b) subsection (2) has effect as if paragraphs (g) and (h) were omitted;
(c) subsections (7), (8) and (9) have effect as if for “controller or processor” (in each place) there were substituted “trust service provider”.
Modification of section 147 (assessment notices: restrictions)
6 (1) Section 147 has effect as if subsections (5) and (6) were omitted.
(2) In that section, subsections (2)(b) and (3)(b) have effect as if for “the data protection legislation” there were substituted “the eIDAS Regulation or the EITSET Regulations”.
Modification of section 148 (enforcement notices)
7 (1) Section 148 has effect as if subsections (2) to (5) and (7) to (9) were omitted.
(2) In that section—
(a) subsection (1) has effect as if—
(i) for “as described in subsection (2), (3), (4) or (5)” there were substituted “to comply with the eIDAS requirements”;
(ii) for “sections 149 and 150” there were substituted “section 149”;
(b) subsection (6) has effect as if the words “given in reliance on subsection (2), (3) or (5)” were omitted.
Modification of section 149 (enforcement notices: supplementary)
8 (1) Section 149 has effect as if subsection (3) were omitted.
(2) In that section, subsection (2) has effect as if the words “in reliance on section 148(2)” and “or distress” were omitted.
Modification of section 151 (enforcement notices: restrictions)
9 Section 151 has effect as if subsections (1), (2) and (4) were omitted.
Withdrawal notices
10 The provisions listed in paragraph 1 have effect as if after section 152 there were inserted—
“Withdrawal notices
152A Withdrawal notices
(1) The Commissioner may, by written notice (a “withdrawal notice”), withdraw the qualified status from a trust service provider, or the qualified status of a service provided by a trust service provider, if—
(a) the Commissioner is satisfied that the trust service provider has failed to comply with an information notice or an enforcement notice, and
(b) the condition in subsection (2) or (3) is met.
(2) The condition in this subsection is met if the period for the trust service provider to appeal against the information notice or enforcement notice has ended without an appeal having been brought.
(3) The condition in this subsection is met if an appeal against the information notice or enforcement notice has been brought and—
(a) the appeal and any further appeal in relation to the notice has been decided or has otherwise ended, and
(b) the time for appealing against the result of the appeal or further appeal has ended without another appeal having been brought.
(4) A withdrawal notice must—
(a) state when the withdrawal takes effect, and
(b) provide information about the rights of appeal under section 161.”
Modification of Schedule 15 (powers of entry and inspection)
11 (1) Schedule 15 has effect as if paragraph 3 were omitted.
(2) Paragraph 1(1) of that Schedule (issue of warrants in connection with non-compliance and offences) has effect as if for paragraph (a) (but not the final “and”) there were substituted—
“(a) there are reasonable grounds for suspecting that—
(i) a trust service provider has failed or is failing to comply with the eIDAS requirements, or
(ii) an offence under section 145 or paragraph 15 of Schedule 15 has been or is being committed,”.
(3) Paragraph 2 of that Schedule (issue of warrants in connection with assessment notices) has effect as if—
(a) in sub-paragraphs (1) and (2), for “controller or processor” there were substituted “trust service provider”;
(b) in sub-paragraph (2), for “the data protection legislation” there were substituted “the eIDAS requirements”.
(4) Paragraph 5 of that Schedule (content of warrants) has effect as if—
(a) in sub-paragraph (1)(c), for “the processing of personal data” there were substituted “the provision of trust services”;
(b) in sub-paragraph (2)(c)—
(i) for “controller or processor” there were substituted “trust service provider”;
(ii) for “as described in section 148(2)” there were substituted “to comply with the eIDAS requirements”;
(c) in sub-paragraph (3)(a) and (c)—
(i) for “controller or processor” there were substituted “trust service provider”;
(ii) for “the data protection legislation” there were substituted “the eIDAS requirements”.
(5) Paragraph 11 of that Schedule (privileged communications) has effect as if, in sub-paragraphs (1)(b) and (2)(b), for “the data protection legislation” there were substituted “the eIDAS Regulation or the EITSET Regulations”.
Modification of section 154 (penalty notices)
12 (1) Section 154 has effect as if subsections (1)(a), (2)(a), (3)(g), (3A) and (5) to (7) were omitted.
(2) Subsection (2) of that section has effect as if—
(a) the words “Subject to subsection (3A),” were omitted;
(b) in paragraph (b), the words “to the extent that the notice concerns another matter,” were omitted.
(3) Subsection (3) of that section has effect as if—
(a) for “controller or processor”, in each place, there were substituted “trust service provider”;
(b) in paragraph (c), the words “or distress” were omitted;
(c) in paragraph (c), for “data subjects” there were substituted “relying parties”;
(d) in paragraph (d), for “section 57, 66, 103 or 107” there were substituted “Article 19(1) of the eIDAS Regulation”.
Modification of Schedule 16 (penalties)
13 Schedule 16 has effect as if paragraphs 3(2)(b) and 5(2)(b) were omitted.
Modification of section 156 (maximum amount of penalty)
14 Section 156 has effect as if subsections (1) to (3) and (6) were omitted.
Modification of section 158 (amount of penalties: supplementary)
15 Section 158 has effect as if—
(a) in subsection (1), the words “Article 83 of the GDPR and” were omitted;
(b) in subsection (2), the words “Article 83 of the GDPR” and “and section 157” were omitted.
Modification of section 159 (guidance about regulatory action)
16 (1) Section 159 has effect as if subsections (4) and (10) were omitted.
(2) In that section, subsection (3)(e) has effect as if for “controllers and processors” there were substituted “trust service providers”.
Modification of section 161 (rights of appeal)
17 (1) Section 161 has effect as if subsection (5) were omitted.
(2) In that section, subsection (1) has effect as if, after paragraph (c), there were inserted—
“(ca) a withdrawal notice;”.
Modification of section 162 (determination of appeals)
18 Section 162 has effect as if subsection (7) were omitted.
Modification of section 179 (regulations and consultation)
19 Section 179 has effect as if subsections (3), (4), (6), (8) to (11) and (13) were omitted.
Modification of section 189 (penalties for offences)
20 (1) Section 189 has effect as if subsections (3) to (5) were omitted.
(2) In that section—
(a) subsection (1) has effect as if the words “section 119 or 173 or” were omitted;
(b) subsection (2) has effect as if for “section 132, 145, 170, 171 or 181” there were substituted “section 145”.
Modification of section 190 (prosecution)
21 Section 190 has effect as if subsections (3) to (6) were omitted.
Modification of section 195 (proceedings in the First-tier Tribunal: contempt)
22 Section 195 has effect as if in subsection (1)(a), for sub-paragraphs (i) and (ii) there were substituted “on an appeal under section 161”.
Modification of section 196 (Tribunal Procedure Rules)
23 Section 196 has effect as if—
(a) in subsection (1), for paragraphs (a) and (b) there were substituted “the exercise of the rights of appeal conferred by section 161”;
(b) in subsection (2)(a) and (b), for “the processing of personal data” there were substituted “the provision of trust services”.
Approval of first guidance about regulatory action
24 (1) This paragraph applies if the first guidance produced under section 159(1) of the Data Protection Act 2018 and the first guidance produced under that provision as applied by this Schedule are laid before Parliament as a single document (“the combined guidance”).
(2) Section 160 of that Act (including that section as applied by this Schedule) has effect as if the references to “the guidance” were references to the combined guidance, except in subsections (2)(b) and (4).
(3) Nothing in subsection (2)(a) of that section (including as applied by this Schedule) prevents another version of the combined guidance being laid before Parliament.
(4) Any duty under subsection (2)(b) of that section (including as applied by this Schedule) may be satisfied by producing another version of the combined guidance.
Interpretation
25 In this Schedule—
“the eIDAS requirements” means the requirements of Chapter III of the eIDAS Regulation;
“the EITSET Regulations” means these Regulations;
“withdrawal notice” has the meaning given in section 152A of the Data Protection Act 2018 (as inserted in that Act by this Schedule).”
Court Files Privileged Access Rules (Northern Ireland) 2016 (S.R. (N.I.) 2016 No. 123)
204X The Court Files Privileged Access Rules (Northern Ireland) 2016 are amended as follows.
204Y In rule 5 (information that may be released) for “Schedule 1 of the Data Protection Act 1998” substitute “—
(a) Article 5(1) of the GDPR, and
(b) section 34(1) of the Data Protection Act 2018.”
204Z In rule 7(2) (provision of information) for “Schedule 1 of the Data Protection Act 1998” substitute “—
(a) Article 5(1) of the GDPR, and
(b) section 34(1) of the Data Protection Act 2018.”
Money Laundering, Terrorist Financing and Transfer of Funds (Information on the Payer) Regulations 2017 (S.I. 2017/692)
204AA The Money Laundering, Terrorist Financing and Transfer of Funds (Information on the Payer) Regulations 2017 are amended as follows.
204AB In regulation 3(1) (interpretation), at the appropriate places insert—
““the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act);”;
““the GDPR” and references to provisions of Chapter 2 of Part 2 of the Data Protection Act 2018 have the same meaning as in Parts 5 to 7 of that Act (see section 3(10), (11) and (14) of that Act);”.
204AC In regulation 16(8) (risk assessment by the Treasury and Home Office), for “the Data Protection Act 1998 or any other enactment” substitute “—
(a) the Data Protection Act 2018 or any other enactment, or
(b) the GDPR.”
204AD In regulation 17(9) (risk assessment by supervisory authorities), for “the Data Protection Act 1998 or any other enactment” substitute “—
(a) the Data Protection Act 2018 or any other enactment, or
(b) the GDPR.”
204AE For regulation 40(9)(c) (record keeping) substitute—
“(c) “data subject” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act);
(d) “personal data” has the same meaning as in Parts 5 to 7 of that Act (see section 3(2) and (14) of that Act).”
204AF (1) Regulation 41 (data protection) is amended as follows.
(2) Omit paragraph (2).
(3) In paragraph (3)(a), after “Regulations” insert “or the GDPR”.
(4) Omit paragraphs (4) and (5).
(5) After those paragraphs insert—
“(6) Before establishing a business relationship or entering into an occasional transaction with a new customer, as well as providing the customer with the information required under Article 13 of the GDPR (information to be provided where personal data are collected from the data subject), relevant persons must provide the customer with a statement that any personal data received from the customer will be processed only—
(a) for the purposes of preventing money laundering or terrorist financing, or
(b) as permitted under paragraph (3).
(7) In Article 6(1) of the GDPR (lawfulness of processing), the reference in point (e) to processing of personal data that is necessary for the performance of a task carried out in the public interest includes processing of personal data in accordance with these Regulations that is necessary for the prevention of money laundering or terrorist financing.
(8) In the case of sensitive processing of personal data for the purposes of the prevention of money laundering or terrorist financing, section 10 of, and Schedule 1 to, the Data Protection Act 2018 make provision about when the processing meets a requirement in Article 9(2) or 10 of the GDPR for authorisation under the law of the United Kingdom (see, for example, paragraphs 9, 10 and 10A of that Schedule).
(9) In this regulation—
“data subject” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act);
“personal data” and “processing” have the same meaning as in Parts 5 to 7 of that Act (see section 3(2), (4) and (14) of that Act);
“sensitive processing” means the processing of personal data described in Article 9(1) or 10 of the GDPR (special categories of personal data and personal data relating to criminal convictions and offences etc).”
204AG (1) Regulation 84 (publication: the Financial Conduct Authority) is amended as follows.
(2) In paragraph (10), for “the Data Protection Act 1998” substitute “the data protection legislation”.
(3) For paragraph (11) substitute—
“(11) For the purposes of this regulation, “personal data” has the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(2) and (14) of that Act).”
204AH (1) Regulation 85 (publication: the Commissioners) is amended as follows.
(2) In paragraph (9), for “the Data Protection Act 1998” substitute “the data protection legislation”.
(3) For paragraph (10) substitute—
“(10) For the purposes of this regulation, “personal data” has the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(2) and (14) of that Act).”
204AI For regulation 106(a) (general restrictions) substitute—
“(a) a disclosure in contravention of the data protection legislation; or”.
204AJ After paragraph 27 of Schedule 3 (relevant offences) insert—
“27A An offence under the Data Protection Act 2018, apart from an offence under section 173 of that Act.”
Scottish Partnerships (Register of People with Significant Control) Regulations 2017 (S.I. 2017/694)
204AK (1) Paragraph 6 of Schedule 5 to the Scottish Partnerships (Register of People with Significant Control) Regulations 2017 (conditions for permitted disclosure to a credit institution or a financial institution) is amended as follows.
(2) The existing text becomes sub-paragraph (1).
(3) For paragraph (b) of that sub-paragraph substitute—
“(b) for the purposes of ensuring that it complies with its data protection obligations.”
(4) After sub-paragraph (1) insert—
“(2) In this paragraph, “data protection obligations”, in relation to a relevant institution, means—
(a) where the institution carries on business in the United Kingdom, obligations under the data protection legislation (as defined in section 3 of the Data Protection Act 2018);
(b) where the institution carries on business in an EEA State other than the United Kingdom, obligations under—
(i) the GDPR (as defined in section 3(10) of the Data Protection Act 2018),
(ii) legislation made in exercise of powers conferred on member States under the GDPR (as so defined), and
(iii) legislation implementing the Law Enforcement Directive (as defined in section 3(12) of the Data Protection Act 2018).”
National Health Service (General Medical Services Contracts) (Scotland) Regulations 2018 (S.S.I. 2018/66)
204AL The National Health Service (General Medical Services Contracts) (Scotland) Regulations 2018 are amended as follows.
204AM (1) Regulation 1 (citation and commencement) is amended as follows.
(2) In paragraph (2), omit “Subject to paragraph (3),”.
(3) Omit paragraph (3).
204AN In regulation 3(1) (interpretation)—
(a) omit the definition of “the 1998 Act”,
(b) at the appropriate place insert—
““the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act);”, and
(c) omit the definition of “GDPR”.
204AO (1) Schedule 6 (other contractual terms) is amended as follows.
(2) In paragraph 63(2) (interpretation: general), for “the 1998 Act or any directly applicable EU instrument relating to data protection” substitute “—
(a) the data protection legislation, or
(b) any directly applicable EU legislation which is not part of the data protection legislation but which relates to data protection.”
(3) For paragraph 64 (meaning of data controller etc.) substitute—
“Meaning of controller etc.
64A For the purposes of this Part—
“controller” has the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(6) and (14) of that Act);
“data protection officer” means a person designated as a data protection officer under the data protection legislation;
“personal data” and “processing” have the same meaning as in Parts 5 to 7 of the Data Protection Act 2018 (see section 3(2), (4) and (14) of that Act).”
(4) In paragraph 65(2)(b) (roles, responsibilities and obligations: general), for “data controllers” substitute “controllers”.
(5) In paragraph 69(2)(a) (processing and access of data), for “the 1998 Act, and any directly applicable EU instrument relating to data protection;” substitute “—
(i) the data protection legislation, and
(ii) any directly applicable EU legislation which is not part of the data protection legislation but which relates to data protection;”.
(6) In paragraph 94(4) (variation of a contract: general)—
(a) omit paragraph (b), and
(b) after paragraph (d) (but before the final “and”) insert—
“(da) the data protection legislation;
(db) any directly applicable EU legislation which is not part of the data protection legislation but which relates to data protection;”.
National Health Service (Primary Medical Services Section 17C Agreements) (Scotland) Regulations 2018 (S.S.I. 2018/67)
204AP The National Health Service (Primary Medical Services Section 17C Agreements) (Scotland) Regulations 2018 are amended as follows.
204AQ (1) Regulation 1 (citation and commencement) is amended as follows.
(2) In paragraph (2), omit “Subject to paragraph (3),”.
(3) Omit paragraph (3).
204AR In regulation 3(1) (interpretation)—
(a) omit the definition of “the 1998 Act”,
(b) at the appropriate place insert—
““the data protection legislation” has the same meaning as in the Data Protection Act 2018 (see section 3 of that Act);”, and
(c) omit the definition of “GDPR”.
204AS (1) Schedule 1 (content of agreements) is amended as follows.
(2) In paragraph 34 (interpretation)—
(a) in sub-paragraph (1)—
(i) omit “Subject to sub-paragraph (3),”,
(ii) before paragraph (a) insert—
(iii) for paragraph (d) substitute—
(b) omit sub-paragraphs (2) and (3),
(c) in sub-paragraph (4), for “the 1998 Act and any directly applicable EU instrument relating to data protection” substitute “—
(a) the data protection legislation, or
(b) any directly applicable EU legislation which is not part of the data protection legislation but which relates to data protection.”, and
(d) in sub-paragraph (6)(b), for “data controllers” substitute “controllers”.
(3) In paragraph 37(2)(a) (processing and access of data), for “the 1998 Act, and any directly applicable EU instrument relating to data protection;” substitute “—
(i) the data protection legislation, and
(ii) any directly applicable EU legislation which is not part of the data protection legislation but which relates to data protection;”.
(4) In paragraph 61(3) (variation of agreement: general)—
(a) omit paragraph (b), and
(b) after paragraph (d) (but before the final “and”) insert—
“(da) the data protection legislation;
(db) any directly applicable EU legislation which is not part of the data protection legislation but which relates to data protection;”.
Part 3
Modifications
Introduction
204AT (1) Unless the context otherwise requires, legislation described in sub-paragraph (2) has effect on and after the day on which this Part of this Schedule comes into force as if it were modified in accordance with this Part of this Schedule.
(2) That legislation is—
(a) subordinate legislation made before the day on which this Part of this Schedule comes into force;
(b) primary legislation that is passed or made before the end of the Session in which this Act is passed.
(3) In this Part of this Schedule—
“primary legislation” has the meaning given in section 204(7);
“references” includes any references, however expressed.
General modifications
204AU (1) References to a particular provision of, or made under, the Data Protection Act 1998 have effect as references to the equivalent provision or provisions of, or made under, the data protection legislation.
(2) Other references to the Data Protection Act 1998 have effect as references to the data protection legislation.
(3) References to disclosure, use or other processing of information that is prohibited or restricted by an enactment which include disclosure, use or other processing of information that is prohibited or restricted by the Data Protection Act 1998 have effect as if they included disclosure, use or other processing of information that is prohibited or restricted by the GDPR or the applied GDPR.
Specific modification of references to terms used in the Data Protection Act 1998
204AV (1) References to personal data, and to the processing of such data, as defined in the Data Protection Act 1998, have effect as references to personal data, and to the processing of such data, as defined for the purposes of Parts 5 to 7 of this Act (see section 3(2), (4) and (14)).
(2) References to processing as defined in the Data Protection Act 1998, in relation to information, have effect as references to processing as defined in section 3(4).
(3) References to a data subject as defined in the Data Protection Act 1998 have effect as references to a data subject as defined in section 3(5).
(4) References to a data controller as defined in the Data Protection Act 1998 have effect as references to a controller as defined for the purposes of Parts 5 to 7 of this Act (see section 3(6) and (14)).
(5) References to the data protection principles set out in the Data Protection Act 1998 have effect as references to the principles set out in—
(a) Article 5(1) of the GDPR and the applied GDPR, and
(b) sections 34(1) and 85(1) of this Act.
(6) References to direct marketing as defined in section 11 of the Data Protection Act 1998 have effect as references to direct marketing as defined in section 123 of this Act.
(7) References to a health professional within the meaning of section 69(1) of the Data Protection Act 1998 have effect as references to a health professional within the meaning of section 197 of this Act.
(8) References to a health record within the meaning of section 68(2) of the Data Protection Act 1998 have effect as references to a health record within the meaning of section 198 of this Act.
Part 4
Supplementary
Definitions
204AW Section 3(14) does not apply to this Schedule.”
This amendment makes consequential amendments to secondary legislation including to the Electronic Identification and Trust Services for Electronic Transactions Regulations 2016 (the EITSET Regulations) and to the Money Laundering, Terrorist Financing and Transfer of Funds (Information on the Payer) Regulations 2017. It also inserts two new Parts into Schedule 18. New Part 3 contains consequential modifications of provisions in certain legislation not amended by Parts 1 and 2 of Schedule 18. New Part 4 contains supplementary provision.—(Margot James.)
Schedule 18, as amended, ordered to stand part of the Bill.
Clause 205
Commencement
Amendments made: 72, in clause 205, page 120, line 37, leave out paragraph (b)
This amendment is consequential on the omission of Clauses 168 and 169 (see Amendments 60 and 61).
Amendment 225, in clause 205, page 121, line 4, at end insert—
‘( ) Regulations under this section may make different provision for different areas.”
This amendment enables regulations under clause 205 bringing provisions of the Bill into force to make different provision for different areas.—(Margot James.)
Clause 205, as amended, ordered to stand part of the Bill.
Clause 206 ordered to stand part of the Bill.
Clause 207
Extent
Amendments made: 73, in clause 207, page 121, line 12, after “(2)” insert “, (2A)”
See the explanatory statement for Amendment 74.
Amendment 226, in clause 207, page 121, line 12, leave out “and (3)” and insert “, (3) and (3A)”
See the explanatory statement for amendment 227.
Amendment 74, in clause 207, page 121, line 14, at end insert—
‘(2A) Sections (Representation of data subjects with their authority: collective proceedings) and (Duty to review provision for representation of data subjects) extend to England and Wales and Northern Ireland only.”
This amendment and Amendment 73 provide that NC1 and NC2 extend only to England and Wales and Northern Ireland.
Amendment 227, in clause 207, page 121, line 15, after “extent” insert “in the United Kingdom”
This amendment and amendments 226, 228 and 229 clarify that amendments of enactments made by the Bill have the same extent in the United Kingdom as the enactment amended and that certain amendments also extend to the Isle of Man.
Amendment 228, in clause 207, page 121, line 16, leave out “(ignoring extent by virtue of an Order in Council)”
See the explanatory statement for amendment 227.
Amendment 229, in clause 207, page 121, line 17, at end insert—
‘(3A) This subsection and the following provisions also extend to the Isle of Man—
(a) paragraphs 200N and 205 of Schedule 18;
(b) sections 204(1), 205(1) and 206, so far as relating to those paragraphs.”
See the explanatory statement for amendment 227. Paragraph 200N in amendment 222 amends the Companies Act 2006 (Extension of Takeover Panel Provisions) (Isle of Man) Order 2008.—(Margot James.)
Clause 207, as amended, ordered to stand part of the Bill.
Clause 208
Short title
Amendment made: 75, in clause 208, page 121, line 24, leave out subsection (2)
This amendment removes the privilege amendment inserted by the Lords.—(Margot James.)
Clause 208, as amended, ordered to stand part of the Bill.
New Clause 1
Representation of data subjects with their authority: collective proceedings
‘(1) The Secretary of State may by regulations make provision for representative bodies to bring proceedings before a court or tribunal in England and Wales or Northern Ireland combining two or more relevant claims.
(2) In this section, “relevant claim”, in relation to a representative body, means a claim in respect of a right of a data subject which the representative body is authorised to exercise on the data subject’s behalf under Article 80(1) of the GDPR or section 183.
(3) The power under subsection (1) includes power—
(a) to make provision about the proceedings;
(b) to confer functions on a person, including functions involving the exercise of a discretion;
(c) to make different provision in relation to England and Wales and in relation to Northern Ireland.
(4) The provision mentioned in subsection (3)(a) includes provision about—
(a) the effect of judgments and orders;
(b) agreements to settle claims;
(c) the assessment of the amount of compensation;
(d) the persons to whom compensation may or must be paid, including compensation not claimed by the data subject;
(e) costs.
(5) Regulations under this section are subject to the negative resolution procedure.”
This new clause confers power on the Secretary of State to make regulations enabling representative bodies (defined in Clause 183) to bring collective proceedings in England and Wales or Northern Ireland combining two or more claims in respect of data subjects’ rights.—(Margot James.)
Brought up, read the First and Second time, and added to the Bill.
New Clause 2
Duty to review provision for representation of data subjects
‘(1) Before the end of the review period, the Secretary of State must—
(a) review the matters listed in subsection (2) in relation to England and Wales and Northern Ireland,
(b) prepare a report of the review, and
(c) lay a copy of the report before Parliament.
(2) Those matters are—
(a) the operation of Article 80(1) of the GDPR,
(b) the operation of section 183,
(c) the merits of exercising the power under Article 80(2) of the GDPR (power to enable a body or other organisation which meets the conditions in Article 80(1) of the GDPR to exercise some or all of a data subject’s rights under Articles 77, 78 and 79 of the GDPR without being authorised to do so by the data subject), and
(d) the merits of making equivalent provision in relation to data subjects’ rights under Article 82 of the GDPR (right to compensation).
(3) “The review period” is the period of 30 months beginning when section 183 comes into force.
(4) After the report under subsection (1) is laid before Parliament, the Secretary of State may by regulations—
(a) exercise the powers under Article 80(2) of the GDPR in relation to England and Wales and Northern Ireland, and
(b) make provision enabling a body or other organisation which meets the conditions in Article 80(1) of the GDPR to exercise a data subject’s rights under Article 82 of the GDPR in England and Wales and Northern Ireland without being authorised to do so by the data subject.
(5) The powers under subsection (4) include power—
(a) to make provision enabling a data subject to prevent a body or other organisation from exercising, or continuing to exercise, the data subject’s rights;
(b) to make provision about proceedings before a court or tribunal where a body or organisation exercises a data subject’s rights;
(c) to make provision for bodies or other organisations to bring proceedings before a court or tribunal combining two or more claims in respect of a right of a data subject;
(d) to confer functions on a person, including functions involving the exercise of a discretion;
(e) to amend sections 164 to 166, 177, 183, 196, 198 and 199;
(f) to insert new sections and Schedules into Part 6 or 7;
(g) to make different provision in relation to England and Wales and in relation to Northern Ireland.
(6) The provision mentioned in subsection (5)(b) and (c) includes provision about—
(a) the effect of judgments and orders;
(b) agreements to settle claims;
(c) the assessment of the amount of compensation;
(d) the persons to whom compensation may or must be paid, including compensation not claimed by the data subject;
(e) costs.
(7) Regulations under this section are subject to the affirmative resolution procedure.”
This new clause imposes a duty on the Secretary of State to review the operation of provisions enabling a representative body to exercise data subjects’ rights with their authority in England and Wales and Northern Ireland and to consider exercising powers under the GDPR to enable a representative body to exercise such rights there without being authorised to do so by the data subjects.—(Margot James.)
Brought up, read the First and Second time, and added to the Bill.
New Clause 5
Bill of Data Rights in the Digital Environment
Schedule [Bill of Data Rights in the Digital Environment] shall have effect.
This new clause would introduce a Bill of Data Rights in the Digital Environment.—(Liam Byrne.)
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
With this it will be convenient to discuss the following:
New clause 6—
“Bill of Data Rights in the Digital Environment (No. 2)
‘(1) The Secretary of State shall, by regulations, establish a Bill of Data Rights in the Digital Environment.
(2) Before making regulations under this section, the Secretary of State shall—
(a) consult—
(i) the Commissioner,
(ii) trade associations,
(iii) data subjects, and
(iv) persons who appear to the Commissioner or the Secretary of State to represent the interests of data subjects; and
(b) publish a draft of the Bill of Rights.
(3) The Bill of Data Rights in the Digital Environment shall enshrine—
(a) a right for a data subject to have privacy from commercial or personal intrusion,
(b) a right for a data subject to own, curate, move, revise or review their identity as founded upon personal data (whether directly or as a result of processing of that data),
(c) a right for a data subject to have their access to their data profiles or personal data protected, and
(d) a right for a data subject to object to any decision based solely on automated decision-making, including a decision relating to education and employment of the data subject.
(4) Regulations under this section are subject to the affirmative resolution procedure.”
This new clause would empower the Secretary of State to introduce a Bill of Data Rights in the Digital Environment.
New schedule 1—Bill of Data Rights in the Digital Environment—
1 The UK recognises the following Data Rights:
Article 1 —Equality of Treatment
1 Every data subject has the right to fair and equal treatment in the processing of his or her personal data.
Article 2 — Security
1 Every data subject has the right to security and protection of their personal data and information systems.
Access requests by government must be for the purpose of combating serious crime and subject to independent authorisation.
Article 3 — Free Expression
1 Every data subject has the right to deploy his or her personal data in pursuit of their fundamental rights to freedom of expression, thought and conscience.
Article 4 — Equality of Access
1 Every data subject has the right to access and participate in the digital environment on equal terms.
Internet access should be open.
Article 5 — Privacy
1 Every data subject has the right to respect for their personal data and information systems as part of his or her fundamental right to private and family life, home and communications.
Article 6 — Ownership and Control
1 Every data subject is entitled to know the purpose for which personal data is being processed in order to exercise his or her right to ownership. Government, corporations and data controllers must obtain meaningful consent for use of people’s personal data.
Every data subject has the right to own and control his or her personal data.
Every data subject is entitled to a proportionate share of income or other benefit derived from his or her personal data as part of the right to own.
Article 7 — Algorithms
1 Every data subject has the right to transparent and equal treatment in the processing of his or her personal data by an algorithm or automated system.
Every data subject is entitled to meaningful human control in making significant decisions – algorithms and automated systems must not be deployed to make significant decisions.
Article 8 — Participation
1 Every data subject has the right to deploy his or her personal data and information systems to communicate in pursuit of the fundamental right to freedom of association.
Article 9 — Protection
1 Every data subject has the right to safety and protection from harassment and other targeting through use of personal data whether sexual, social or commercial.
Article 10 — Removal
1 Every data subject is entitled to revise and remove their personal data.
Compensation
Breach of any right in this Bill will entitle the data subject to fair and equitable compensation under existing enforcement provisions. If none apply, the Centre for Data Ethics will establish and administer a compensation scheme to ensure just remedy for any breaches.
Application to Children
1 The application of these rights to a person less than 18 years of age must be read in conjunction with the rights set out in the United Nations Convention on the Rights of the Child.
2 Where an information society service processes data of persons less than 18 years of age it must do so under the age appropriate design code.”
We now come to the good stuff. Members of the Committee can look forward to the debates ahead; there is an awful lot of ground to cover, and we will try to speed through it as quickly as we can. New clauses 5 and 6 and new schedule 1, tabled in my name and that of my hon. Friends, are an attempt to provoke the Government into being more ambitious in their strategy for the digital world. Every so often, as a great nation, we make important declarations of rights.
Rights are important because they ensure that progress is democratised, but they also provide important new protections against new imbalances of power that arise. We really began to turn our minds to this about 803 years ago when we came up with Magna Carta. We then made a much more sweeping and important statement that received Royal Assent on 16 December 1689. We had a couple of centuries off and in more recent years we went rights crazy and started signing universal declarations in the years after the second world war with much greater speed. We had the universal declaration of human rights, in which British civil servants took a leading role; the UN convention on the rights of the child; the charter of fundamental rights, which we helped shape; and the incorporation of those regimes of rights, which we wrote for our neighbours, into British law through the Human Rights Act 1998 and the Equality Act 2010.
Over the years, the regime of rights that we have pioneered in this country has been absolutely fundamental to the progress that we have made as a nation. If we go back to the debates here in the 1630s and 1640s, we see that the rights of new entrepreneurs to defend the wealth that they had created through trading, particularly in the Atlantic colonies—examples include the Virginia Company and, later, the East India Company—and the rights that we sought to enshrine and protect against arbitrary taxation, were absolutely fundamental in laying the foundation for the industrial revolution that really began to take off in the years after the Bill of Rights was enshrined by William III in 1689.
The argument that I want to make this morning is that the sweeping changes of the digital age mean that it would be wise of us to consider a similarly ambitious set of rights for the digital age. Anyone who has an interest in economic history will know that, ultimately, we can never contract for anything. Ultimately, a handshake will always be as important as a contract, and a handshake relies on an environment of trust. When countries do not have environments of trust, they lack economic institutions that allow their economies to flourish.
The challenge in this country today is that we are not making quite as much progress with the digital economy as perhaps we could be. Indeed, in most international indexes, where we should be at the top, we are normally batting at about fifth or sixth. That is not terrible, but most of us would like it to be better. We are the home of the scientific revolution and the industrial revolution. We should be at the top of the table, not fifth or sixth.
That provokes us to ask what is the state of online trust and digital trust in this country. The figures that I have dug out are for the time before the scandals that we have learned about over the last couple of weeks, which will not have put trust levels up. Online fraud is now growing very quickly. In fact, Action Fraud says that 70% of all fraud is now cyber-enabled. That is not simply a commercial problem; it is also a public sector problem. Public services such as the NHS hold vast quantities of public data. The NHS has been hit very badly by malware in a way that has provoked real questions about the UK’s digital resilience. The National Audit Office said that the NHS and the then Department of Health must “get their act together” or suffer far worse than the chaos of 2017. Edelman recently produced a survey that said that one quarter of the UK population trusts social media and 61% trust traditional media, so there are huge imbalances in what people trust today.
I have been interested in this question for a while, and in what we can learn from some of the world’s digital leaders. On a recent visit to Estonia, which is by general agreement the world’s leading digital society, the thing that really struck me was the fact that digital trust is supremely high. The Government of Estonia took the big decision, when they left that north-west corner of the USSR, that they would have to take a big gamble on the future. As we leave the north-west corner of Europe, we need to be taking a similar big bet on the future. We need to be betting on digital in the way we bet on steam a couple of centuries ago.
Two things are absolutely key to the digital environment in Estonia. One is a platform called X-Road, which allows Government data from distributed databases to come together to answer particular kinds of problems, but absolutely fundamental is the public option of an e-ID scheme. That involves two-factor authentication and it comes with important features such as the ability for people to look online at who has been using their data, who has been accessing it, and what they have been using it for. In fact, doctors and police officers have gone to jail because they have misused their ability to access online records—medical records, for instance.
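The architecture described, with separate registries joined through a common exchange layer and every access logged for the citizen to inspect, can be illustrated with a short sketch. The following toy Python is a hypothetical illustration only: the registry names, identifiers and functions are invented and bear no relation to the real X-Road interfaces.

```python
# A toy sketch, not the real X-Road API: a common exchange layer answers a
# question by pulling only the needed fields from separate, independently
# held registries, while logging every access so the citizen can later see
# who looked at their records.

POPULATION = {"EE1001": {"name": "A. Tamm", "resident": True}}   # invented data
HEALTH = {"EE1001": {"gp": "Dr. Kask"}}                           # invented data
ACCESS_LOG: list[tuple[str, str, str]] = []  # (requester, registry, person_id)

def query(requester: str, person_id: str) -> dict:
    """Join records across registries without copying them into one database."""
    result: dict = {}
    for registry_name, registry in (("population", POPULATION), ("health", HEALTH)):
        if person_id in registry:
            result.update(registry[person_id])
            ACCESS_LOG.append((requester, registry_name, person_id))
    return result

print(query("clinic-42", "EE1001"))  # {'name': 'A. Tamm', 'resident': True, 'gp': 'Dr. Kask'}
print(ACCESS_LOG)                    # the citizen can inspect who accessed what
```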
Anyone in this country who has tried to file their taxes online, as I did early in January, will know that the Government gateway here is nowhere near that level. Once I had been issued with my fifth online ID, I frankly gave up and rang the MPs’ hotline, and the person there said, “Yeah, we’ve had lots of problems like this. You can just file your tax return on paper like everybody else.” We are sadly lacking the kind of digital infrastructure that many other countries enjoy.
The point about the public option for electronic ID is that there is a country that has decided that the right to a secure ID is a fundamental right, and on that fundamental right has flourished a digital economy that has helped to create the world’s leading digital society. There are now 3,000 Government e-services and 5,000 private sector e-services that sit on top of that platform. When I met the former Prime Minister of Estonia, he said that the key to winning the argument was that financial institutions such as banks were so confident in the public infrastructure that had been created that they were prepared to go out to the public in Estonia and say, “The public option for an electronic ID is the right option.”
I am enjoying the right hon. Gentleman’s history lesson about Estonia.
I had that sense. The key thing about Estonia, aside from the fact that it is a far, far smaller country, is that the register for the digital ID that the right hon. Gentleman is talking about is held centrally by the Government. There is a fundamental difference between this country and Estonia. If he were seriously to propose to citizens in the UK that the Government should hold that central register, I think they would give him pretty short shrift. In his long lecture, will he either make the case for a Government-held central register or acknowledge that it would still be a pretty tough thing to get past the British public?
I am very happy to. I am lucky enough to be able to draw on my extensive experience as the Minister for ID cards in the Labour Government. I will take the hon. Gentleman, in detail, through the architecture I proposed. Well, he asked for it.
The challenge we confronted in about 2006 was that we originally proposed one big database for all the data, including biometric data. That was an error. The architecture I proposed in its stead was a way of connecting three different databases—one that would have basically held Driver and Vehicle Licensing Agency data, a second that would have held the passport services data, and then a couple of identifiers that would have allowed those two records to be indexed and joined together. That brought the cost of the ID card system down by about two thirds.
Although the hon. Member for Boston and Skegness says that the British public would not like Government databases to hold all that information, that happens to be the country they live in. The Passport Office and DVLA hold comprehensive data on most people, and people find that extremely useful.
I was very careful about what I said. What I said was not that we should have compulsory e-ID, but that we should have a public option so people can choose to use it. That is obviously a different regime from Estonia’s, where ID cards have been compulsory since the country was invented about a century ago.
Giving people a public option would be quite attractive. There are, however, important safeguards that we need to learn from. It would be a mistake to have biometric information connected to that kind of service. We do not need biometric information connected to that kind of service. The ID card system in India has gone down that route, and it has suffered pretty significant leaks of biometric data over the past year and a half. If people get their hands on that data, that will be far more dangerous. The Estonian system, in which people have an electronic ID and a password that sits in their head—a two-factor authentication—has proven much more successful.
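The two-factor model described, a secret held on a physical token plus a password that sits in the user’s head, with no biometric data stored anywhere, can be sketched as follows. This is a minimal illustration under those assumptions; every name in it is invented, and it is not any real e-ID scheme’s implementation.

```python
import hashlib
import hmac
import secrets

# Minimal sketch of two-factor authentication: something you have (a secret
# held on an ID card or token) plus something you know (a password). No
# biometric data is stored anywhere in this model.

def enrol(password: str) -> dict:
    """Issue a token secret and store only a salted hash of the password."""
    salt = secrets.token_bytes(16)
    return {
        "token_secret": secrets.token_bytes(32),  # lives on the ID card/token
        "salt": salt,
        "password_hash": hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000),
    }

def authenticate(record: dict, token_secret: bytes, password: str, challenge: bytes) -> bool:
    """Both factors must pass: the token answers a fresh server challenge,
    and the password hashes to the stored value."""
    expected = hmac.new(record["token_secret"], challenge, "sha256").digest()
    answer = hmac.new(token_secret, challenge, "sha256").digest()
    has_token = hmac.compare_digest(expected, answer)
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), record["salt"], 100_000)
    knows_password = hmac.compare_digest(candidate, record["password_hash"])
    return has_token and knows_password

record = enrol("correct horse battery staple")
challenge = secrets.token_bytes(16)
print(authenticate(record, record["token_secret"], "correct horse battery staple", challenge))  # True
```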
My broader point is that we should have a debate about the data rights that we, as citizens of this country, should have. Partly, that is about having rights to things that would make our lives better and would allow us to pursue new freedoms, such as the freedom not to have a million and one passwords, which we lose track of. It is also about having certain protections. We have had a useful debate, and will have an even longer one shortly, about the right to be treated fairly by algorithms. That is obviously incredibly important. The Government have given a nod in that direction, so the Minister will probably say a little about their digital charter.
On the different sides of the House, there are different philosophies on rights. The Conservative party traditionally defends rights to do with negative freedoms, and my side often talks the language of positive freedoms—the power to do things, which we think is necessary for social justice. However, I hope that in the months ahead we can have a sensible conversation about what negative and positive freedoms we can crystallise and enshrine in a bill of digital rights. At some point in this century, we shall write that. It is inevitable, because the world will change in a way that requires it, and the citizens of this country will begin to demand it. What we are starting to debate today is what will come to pass at some point. I hope to be the Minister who drives it through in the next Labour Government, which is imminent.
I hope, too, that we can debate that idea and help to perfect it. Where regimes of rights have been most effective, they have stood the test of time. For something to stand the test of time, it always helps if there is a little—not too much—cross-party consensus.
The new schedule has a couple of ideas at its core, and we are lucky in having been able to draw on not only the rights literature, but the incredible work of Baroness Kidron. As well as being a talented member of the creative industries, she has been one of the leading champions of the creation of strong digital rights for our children. As we have rehearsed in Committee previously, the issue is fundamental, not marginal. About a third of online users are children. The Government will have, in a way, to step in that direction. They will have to step towards new clauses 5 and 6, and new schedule 1, because they have committed to issuing an age-appropriate design code that will operationalise clause 124. I want to encourage the Government to think creatively about the way they will write the code of practice on age-appropriate design codes, with at least one eye on the broader bill of data and digital rights, which we want to propose.
The 5Rights movement has a couple of important ideas. One is the right to remove: children should be able to remove content that they have uploaded. There are probably members of the Committee who have posted all kinds of unfortunate content in their lives, which they might not want to have there in the future. That is certainly true of many children I know. The right to remove is, I think, widely accepted, and is reflected as one of the ambitions of the Bill.
The second right is the right to know. Children should be able to learn easily the who, what and why—and know for what purposes their data is being exchanged. That is important. The Minister herself has talked about the need to educate online users—to educate us all, so that we become better critical consumers of the content that we find online. That is doubly important for children.
The third right is the right to safety and support. Much of what upsets young people online is not illegal. It is legal. Support is often quite sparse and fragmented. It is often pretty invisible to children and young people when they need it most.
It will be challenging for the Government to turn the right to informed and conscious use into part of the code of practice, but that is incredibly important. It is simply unfortunate that social media firms spend quite so much money, effort and engineering talent on features that create a kind of addiction because of the rush of endorphins that they trigger in young people’s minds.
Those technologies, techniques and tricks of the trade are based on exactly the same principles as casino slot machines, and it is quite telling that a number of social media leaders have, over the last six months, gone on the record to say that they will not let their children use the apps that millions of children around the world use. The right to informed and conscious use will be difficult for the Government to interpret, but it is none the less important.
The right to digital literacy is perhaps the most important of all. It is something that our schools already do a terrific job of putting into practice, but what struck me in Estonia is the way that people see the right to internet access as basically a social right. That is surely something that we should debate and put in practice, too.
We have had quite a collection of evidence over the last year from people such as the Children’s Commissioner, who have ridden in behind and supported Baroness Kidron’s 5Rights movement. The Children’s Commissioner recently said:
“The social media giants have simply not done enough to make children aware of what they are signing up to when they install an app or open an account.”
The idea that children can look at these pages and pages of terms and conditions and just click and agree to them is obviously nonsensical. Indeed, the Children’s Commissioner, when reflecting on that, said:
“Children have absolutely no idea that they are giving away the right to privacy or the ownership of their data or the material they post online.”
The Government have obviously sought to exercise their derogation under the GDPR and set the age of consent at 13, rather than 16, so the code of practice that the Minister has agreed to is really important.
We would like this bill of data rights to go alongside more effective mechanisms to ensure that those rights are enforceable. That is why we tabled our amendments relating to Article 80(2) of the GDPR. We think it is impossible in today’s economic environment for ordinary citizens to take effective action against the biggest firms on earth. These five firms have a market capitalisation, although it is slightly less than it was, of about $2.5 trillion, so the idea that a humble citizen can take on some of these giants is nonsensical. We would therefore like this bill of data rights to sit alongside a much more effective, open and democratic form of class action.
I am really interested in the Minister’s observations on the rights we have set out. Article 1 of our proposed new schedule covers equality of treatment, which is enshrined in the GDPR. The GDPR is long—we have made incredible progress through it, article by article—and it is a miracle that we have arrived at page 123 of the Bill by Thursday afternoon, but that is a real testament to the skilful chairing of Mr Hanson and you, Mr Streeter. The principle of equality of treatment is written throughout every clause of the Bill. The point is that it is written through 200 clauses, so we think a basic statement of equality of treatment is a good place to start.
Article 2 covers the right to security, which is the subject of the Bill. Again, let us set that out in terms. Article 3 covers the right to free expression, which is something we have signed up to in articles of the European convention on human rights. It is something that we should set within the context of a bill of data rights. Article 4 covers the right of equality of access. Giving equal access to the digital environment is extremely important. The digital environment creates a network, and network effects mean that the more people joined to it, the greater the value of the network. It is important to specify, set out and declare that we see equality of access to the digital environment as important.
Article 5 sets out the right to privacy, which, again, is scattered throughout the Bill, although we would like to consolidate and crystallise it and bring it together. Article 6 covers ownership and control, which will only grow in importance. This is not the place to get into the vexed debate about who owns the copyright to the data that someone might have and the new data that might be created by joining that data with someone else’s. However, the question of who owns the copyright, and therefore who owns the value of data that is personal in origin, is only going to grow. That debate is almost the 21st-century equivalent of the debate on the enclosure of the commons, frankly. Who owns the copyright of data will become more important as the value of data grows exponentially.
Article 7 talks about the right to fairness when it comes to automated decision making, which we will come to in the debate on algorithmic fairness. Algorithms are making more and more decisions in our lives. People have a right not to be treated unfairly as a result of those decisions. In the phrase used by my hon. Friend the Member for Cambridge, we cannot have a world in which yesterday’s injustice is hard-coded into tomorrow’s injustice. We think that ensuring a right to algorithmic fairness in our bill of data rights is important. The rights to participation, protection and removal are important too.
We have a long tradition of rights in this country; we are the world’s pioneers of them. It is because we have been that pioneer down the centuries that we are today the world’s fifth-biggest economy, but we are not the world’s leading digital society. It is an ambition of the Opposition that we should be, and we think that a bill of digital rights would help us to get there.
I welcome new schedule 1, in the name of my right hon. Friend the Member for Birmingham, Hodge Hill and my hon. Friends the Members for Ogmore and for Sheffield, Heeley. I should declare that I was first on Facebook as a 19-year-old. Now, as a 31-year-old, I can declare that I do not think there is anything on there that I am embarrassed about.
I reserve the right for other hon. Friends to remove content from their social media.
I wanted to refer to the issue of data ownership. When we think of the world in terms of things that we own, there are legal bases for that ownership. We have a legal right to the houses that we buy, once the mortgage has been paid off, and we have a legal right to the clothes that we buy. However, we have no legal right to the ownership of the data about us or the data that we generate. In the context of people making money off the back of it, that feels fundamentally incorrect.
Even the language that we use suggests that the relationship is not balanced. The idea that Facebook is my data controller, and that I am merely its data subject, suggests that the tone of the conversation is incorrect. I support the fundamental principle of ownership, because I think that we need to have a much more fundamental debate about who owns this stuff. Why are people making money off the back of it? If they do things with our property that are against the law, or that cause us loss, we should have the right to enforce that principle.
We have seen that not just in the context of the personal data that we might create about the things we like to buy or the TV programmes we like to watch. Sir John Bell, in the report “Life sciences: industrial strategy”, talked about the value of NHS data. We are in a unique position in the world, because of our socialised healthcare system, where we have data for individuals in a large population across many years. That is extremely valuable to organisations and others. We on the Science and Technology Committee are doing reports at the moment on genomics data in the health service and on the regulation of algorithms. I recommend those reports, when they are published, to Members of the Bill Committee.
We need to try to avoid allowing, for example, health companies—I will not name any particular ones—to come into this country, access the data of NHS patients, build and train algorithms, and then take those algorithms to other parts of the world and make enormous profits off the back of them. But for the data that belongs to the British people, those businesses would not be able to make those profits.
I am trying to follow the hon. Gentleman’s train of thought. As I understand it, we have the largest digital economy in the G20—it is 12.4% of our GDP. He and the right hon. Member for Birmingham, Hodge Hill have experience of the industry. You do want to promote technology, as opposed to putting a thumb on it, don’t you?
Then you agree with hon. Members on both sides of the Committee, Mr Streeter. Of course we do, but as we have seen this week with the Cambridge Analytica scandal, rules must be set, and there must be a balance between allowing innovation to flourish and people’s rights not to be harmed in the process.
I agree—that is why I welcome the Bill. I am saying that we ought to go further, which is why I support the new schedule, and having conversations about ownership.
Returning to the issue of health data, I have personal views about how we might tax revenues from platforms in a better way. I welcome the comments made by the Chancellor of the Exchequer, in line with his counterparts in Europe, about looking at how we tax revenues where they are made, not where the company is headquartered. That is a positive move, but surely if all this NHS data is creating profits for other companies and organisations, we can create a situation in which patients also benefit from that, by sharing in the profits that are made and by seeing value redirected into the health service.
All that becomes anchored in the question of ownership. There is still this legal space that says that data subjects do not own their own data. We need a much broader debate on that. [Interruption.] Members are shaking their heads. I am happy to take interventions, if Members would like.
Will my hon. Friend reflect on the idea that if someone is genuinely a popular capitalist and believes in the distribution of wealth as the basis of economic growth, then recognising and crystallising the value of personal data is actually pro-growth?
I agree entirely. I confess I never got all the way through my copy of Piketty, but the idea of value through assets, as opposed to through the stagnating wages in our economy today, plays into this conversation around data. People from poorer backgrounds may not inherit houses or land, but they create their own data every day. It is an asset that should belong to them. They should be able to share in its value when companies around the world are making enormous profits off the back of it. In this digital age, there is a huge call for equality of opportunity and equality of access. We need to try to get those right in these fundamental understandings of the digital market and the rights that exist around it.
Lastly, I want to reinforce my right hon. Friend’s arguments on the application of these principles to children. The Committee has already debated how parental consent is not needed after the age of 13. One of my early jobs as legal counsel at BT was the dubious task of consolidating terms and conditions. Hon. Members who are no doubt happy customers of BT, with perhaps broadband, TV and sport, would originally have had to read five or six different documents that were very long and complicated. I had to consolidate those. That was not good enough, so I commissioned a YouTube star to do a video, which can be seen on the terms and conditions page, to try to explain some of these things. Even for adults, this was a really hard and laborious task.
I am not saying that it is for Government to tell businesses how to communicate to children. Second Reading and some of the Committee’s debates show—dare I say it—that we are probably not best placed to have those conversations. However, it is really important that there is an expectation on businesses that they take steps to ensure that children are properly engaged and really understand what they are signing up to, especially as the Government have opted for the minimum age of consent, 13.
I just wanted to re-emphasise the debate on ownership and on children. I support my right hon. Friend’s new schedule and new clauses, and I hope the Government will support them.
My response will encompass our digital charter, as the right hon. Member for Birmingham, Hodge Hill mentioned, and I will also answer some of the points he made in his interesting exposition of his rights-based approach. I agree with him: the internet is a powerful force for good, serving humanity and spreading ideas, freedom and opportunity across the world. Yet, as he rightly states, there are considerable trust issues, which can have only worsened in recent days.
I would like to emphasise the point made by my hon. Friend the Member for Gordon that the UK has a strong digital economy accounting for over 12.5% of GDP, which makes us the leading digital economy in the G20.
The right hon. Gentleman was critical of Government sites and services, but we have developed a system that is being taken up by several other countries, including New Zealand, which are adopting our approach to providing Government services online. I am sorry that his experience on the tax side was not great, and there are always exceptions, but on the whole we are leaders in the provision of Government services online.
Citizens rightly want to know that they will be safe and secure online. Tackling these challenges in an effective and responsible way is absolutely critical. The digital charter is our response. It is a rolling programme of work to agree norms and rules for the online world and to put them into practice. In some cases, that will be through shifting expectations of behaviour and resetting a settlement with internet companies. In some cases, we will need to agree completely new standards; in others, we will want to update our laws and regulations. Our starting point is that we expect the same rights and behaviour online as we do offline, with the same ease of enforcement.
The charter’s core purpose is to make the internet work for everyone—for citizens, businesses and society as a whole—and it is based on liberal values. Every country is grappling with these challenges. The right hon. Gentleman suggested last week that the Government are not averse to making declaratory statements of rights and interpreting them into law, but his key example related to human rights. The Human Rights Act provides a detailed and well-considered legislative framework for those rights and ensures that they are meaningful.
When the right hon. Member for Surrey Heath (Michael Gove), who is now the Secretary of State for Environment, Food and Rural Affairs, was Secretary of State at the Ministry of Justice, he launched a consultation about an English Bill of Rights, which was about not simply human rights but a much broader set of rights. I do not think there is a big difference in our approaches to rights. Actually, I think there is a shared approach, as has been recognised down the years.
Yes, much of our approach is shared. The Government decided not to proceed with that Bill of Rights, but the right hon. Gentleman rightly points out that both our parties have a keen interest in this area. However, to set out his proposed bill of data rights in primary legislation would cut across the GDPR. It would impose its own rights of rectification and erasure, its own notion of control and its own obligations on controllers to keep data secure, but, of course, the GDPR already does that, and comparable rights are provided for in the Bill. I am concerned about how the Commission would react to such an attempt to redefine data protection standards. That is one of our main concerns with his new clauses and new schedule, no matter how much we might agree with the sentiments behind them. Given that, and the fact that we are proceeding with our digital charter, I feel that the Bill, in essence, covers this issue, and I need say no more about it.
Our proposed bill of data rights seeks not to redefine but to enshrine: where rights are already reflected in the GDPR, it does no more than enshrine them. The point is that it would also go over and above the rights and obligations set out in this Bill. The right of equal access to the internet, the crystallisation of the right to expression and the advancement of the debate about the right to data ownership are important provisions whose time will come. At some point, due to the way the world is changing, our citizens and constituents will begin to demand both a democratisation of the privileges of this new age and of progress, and the right to effective defences and new protections.
I am glad that the Minister agrees with the sentiment behind the new clause, and I recognise that she perhaps does not see this Bill as the place to consolidate our brilliant ideas into the law of the land. I listened with interest to what she said about a rolling programme of ideas in the digital charter. There is a challenge with that approach: it will end up following the cones hotline model of public service reform. It will not live or sing; it will be bedevilled by voluntary codes, bureaucracy and operational procedures, and it will end up not really making a difference to the world. Our bill of data rights is clear.
If rights are to be a reality, they need not to be a mystery but to be understood. They need to be something that people can talk about in a pub. They need to be something that can be set out not in 250 pages of primary legislation but on the back of a fag packet. In our bill of data rights, we set out a clear agenda that would make a difference and be easily understood and enforced. It would be an improvement and would take forward the rights and liberties of the citizens of this country.
No. I beg to ask leave to withdraw the clause.
Clause, by leave, withdrawn.
Ordered, That further consideration be now adjourned. —(Nigel Adams.)
(6 years, 9 months ago)
Public Bill Committees
I beg to move, That the clause be read a Second time.
With this it will be convenient to discuss the following:
New clause 8—Application of the Equality Act (Employment)—
“(1) Part 5 (Employment) of the Equality Act 2010 (‘the Equality Act’) shall apply to the processing of personal data by an algorithm or automated system in making or supporting a decision under this section.
(2) A ‘decision’ in this section means a decision that engages a data subject (D)’s rights, freedoms or legitimate interests concerning—
(a) recruitment,
(b) the terms and conditions of employment,
(c) access to opportunities for promotion, transfer or training, and
(d) dismissal.
(3) Nothing in this section detracts from other rights, freedoms or legitimate interests in this Act, the Equality Act or in any other primary or secondary legislation relating to D’s personal data, employment, social security or social protection.”
This new clause would apply Part 5 of the Equality Act 2010 to the processing of personal data by an algorithm or automated system in making or supporting a decision under this new clause.
New clause 9—Right to algorithmic fairness at work—
“(1) A person (“P”) has the right to fair treatment in the processing of personal data by an algorithm or automated system in making a decision under this section.
(2) A “decision” in this section means a decision in which an algorithm or automated system is deployed to support or make a decision or any part of that decision that engages P’s rights, freedoms or legitimate interests concerning—
(a) recruitment,
(b) the terms and conditions of employment,
(c) access to opportunities for promotion, transfer or training, and
(d) dismissal.
(3) “Fair treatment” in this section means equal treatment between P and other data subjects relevant to the decision made under subsection (2) insofar as that is reasonably practicable with regard to the purpose for which the algorithm or automated system was designed or applied.
(4) In determining whether treatment of P is “fair” under this section the following factors shall be taken into account—
(a) the application of rights and duties under equality and other legislation in relation to any protected characteristics or trade union membership and activities,
(b) whether the algorithm or automated system has been designed and trained with due regard to equality of outcome,
(c) the extent to which the decision is automated,
(d) the factors and weighting of factors taken into account in determining the decision,
(e) whether consent has been sought for the obtaining, recording, using or disclosing of any personal data including data gathered through the use of social media, and
(f) any guidance issued by the Centre for Data Ethics and Innovation.
(5) “Protected characteristics” in this section shall be the protected characteristics defined in section 4 of the Equality Act 2010.”
This new clause would create a right to fair treatment in the processing of personal data by an algorithm or automated system in making a decision regarding recruitment, terms and conditions of employment, access to opportunities for promotion etc. and dismissal.
New clause 10—Employer’s duty to undertake an Algorithmic Impact Assessment—
‘(1) An employer, prospective employer or agent must undertake an assessment to review the impact of deploying the algorithm or automated system in making a decision to which subsection (1) of section [Application of Equality Act (Employment)] applies [an ‘Algorithmic Impact Assessment’].
(2) The assessment undertaken under subsection (1) must—
(a) identify the purpose for which the algorithm or automated system was designed or applied,
(b) test for potential discrimination or other bias by the algorithm or automated system,
(c) consider measures to advance fair treatment of data subjects relevant to the decision, and
(d) take into account any tools for Algorithmic Impact Assessment published by the Centre for Data Ethics and Innovation.”
This new clause would impose a duty upon employers to undertake an Algorithmic Impact Assessment.
New clause 11—Right to an explanation—
“(1) A person (“P”) may request and is entitled to be provided with a written statement from an employer, prospective employer or agent giving the following particulars of a decision to which subsection (1) of section [Right to algorithmic fairness at work] applies—
(a) any procedure for determining the decision,
(b) the purpose and remit of the algorithm or automated system deployed in making the decision,
(c) the criteria or other meaningful information about the logic involved in determining the decision, and
(d) the factors and weighting of factors taken into account in determining the decision.
(2) P is entitled to a written statement within 14 days of a request made under subsection (1).
(3) A complaint may be presented to an employment tribunal on the grounds that—
(a) a person or body has unreasonably failed to provide a written statement under subsection (1),
(b) the particulars given in purported compliance with subsection (1) are inadequate,
(c) an employer or agent has failed to comply with its duties under section [Employer’s duty to undertake an Algorithmic Impact Assessment],
(d) P has not been treated fairly under section [Right to algorithmic fairness at work].
(4) Where an employment tribunal finds a complaint under this section well-founded the tribunal may—
(a) make a declaration giving particulars of unfair treatment,
(b) make a declaration giving particulars of any failure to comply with duties under section [Employer’s duty to undertake an Algorithmic Impact Assessment] or section [Right to algorithmic fairness at work],
(c) make a declaration as to the measures that ought to have been undertaken or considered so as to comply with the requirements of subsection (1) or section [Employer’s duty to undertake an Algorithmic Impact Assessment] or section [Right to algorithmic fairness at work],
(d) make an award of compensation as may be just and equitable.
(5) An employment tribunal shall not consider a complaint presented under subsection (3) in a case where the decision to which the reference relates was made—
(a) before the end of the period of 3 months, or
(b) within such further period as the employment tribunal considers reasonable in a case where it is satisfied that it was not reasonably practicable for the application to be made before the end of that period of 3 months.
(6) Nothing in this section detracts from other rights, freedoms or legitimate interests in this Bill or any other primary or secondary legislation relating to P’s personal data, employment, social security or social protection.”
This new clause would create a right to an explanation in writing from an employer, prospective employer or agent giving the particulars of a decision to which the Right to algorithmic fairness at work applies.
New clause 7 and new clauses 8 to 11 touch on the question of how we ensure a degree of justice when it comes to decisions that are taken about us automatically. The growth in automated decision making has been exponential, and it carries risks. We need to ensure that the law is modernised to provide new protections and safeguards for our constituents in this new world.
I should say at the outset that this group of new clauses is rooted in the excellent work of the Future of Work commission, which produced a long, thought-provoking report. The Committee will be frustrated to hear that I am not going to read through that this afternoon, but, none the less, I want to tease out a couple of points.
The basket of new clauses that we have proposed is well thought through and has been carefully crafted. I put on record my thanks to Helen Mountfield QC, an expert in equality law, and to Mike Osborne, professor of machine learning. Along with Ben Jaffey QC, a specialist in data law, they have been looking at some of the implications of automated decision making, which were discussed at length by the Future of Work commission.
Central to the new clauses is a concern that unaccountable and highly sophisticated automated or semi-automated systems are now making decisions that bear on fundamental elements of people’s work, including recruitment, pay and discipline. Just today, I was hearing about the work practices at the large Amazon warehouse up in Dundee, I think, where there is in effect digital casualisation. Employees are not put on zero-hours contracts, but they are put on four-hour contracts. They are guided around this gigantic warehouse by some kind of satnav technology on a mobile phone, and the same device that guides them also tracks how long it takes them to put together a basket.
That information is then arranged in a nice league table of which employees are fastest and which are slowest, and decisions are then taken about who gets an extension to their contracted hours each week and who does not. That is a pretty automated kind of decision. My hon. Friend the Member for Eltham (Clive Efford) was describing to me the phenomenon of the butty man—the individual who decided who on a particular day got to work on the docks or on the construction site. In the pub at the end of the week, he divvied up the earnings and decided who got what, and who got work the following week. That kind of casualisation is now being reinvented in a digital era and is something that all of us ought to be incredibly concerned about.
What happens with these algorithms is called, in the jargon, socio-technical—what results is a mixture of conventional software, human judgment and statistical models. The issue is that very often the decisions that are made are not transparent, and are certainly not open to challenge. Such systems are now quite commonly used by employers and prospective employers, and their agents, who are able to analyse very large datasets and can then deploy artificial intelligence and machine learning to make inferences about a person. Quite apart from the ongoing debates about how we define a worker and how we define employment—the subject of an excellent report by my old friend Matthew Taylor, now at the RSA—there are real questions about how we introduce new safeguards for workers in this country.
I want to highlight the challenge with a couple of examples. Recent evidence has revealed how many recruiters use—surprise, surprise—Facebook to seek candidates in ways that routinely discriminate against older workers by targeting advertisements for jobs in a particular way. Slater and Gordon, which is a firm of excellent employment lawyers, showed that about one in five company executives admit to unlawful discrimination when advertising jobs online. The challenge is that when jobs are advertised in a targeted way, by definition they are not open to applicants from all walks of life, because lots of people just will not see the ads.
Women and those over the age of 50 are now most likely to be prevented from seeing an advert. Some 32% of company executives say that they have discriminated against those who are over 50, and a quarter have discriminated in that way against women. Nearly two thirds of executives with access to a profiling tool have said that they use it to actively seek out people based on criteria as diverse as age, gender and race. If we are to deliver a truly meritocratic labour market, where the rights of us all to shoot for jobs and to develop our skills and capabilities are protected, some of those practices have to stop. If we are to stop them, the law needs to change, and it needs to change now.
This battery of new clauses sets out to do five basic things. First, they set out some enhancements and refinements to the Equality Act 2010, in a way that ensures that protection from discrimination is applied to new forms of decision making, especially when those decisions engage core rights, such as rights on recruitment, terms of work, or dismissal. Secondly, there is a new right to algorithmic fairness at work, to ensure equal treatment. Thirdly, there is the right to an explanation when a decision is taken in a way that affects core elements of work life, such as a decision to hire, fire or suspend someone. Fourthly, there is a new duty for employers to undertake an algorithmic impact assessment, and fifthly, there are new, realistic ways for individuals to enforce those rights in an employment tribunal. It is quite a broad-ranging set of reforms to a number of different parts of legislation.
My right hon. Friend is making a powerful case. Does he agree that this is exactly the kind of thing we ought to have been discussing at the outset of the Bill? The elephant in the room is that the Bill seems to me, overall, to be looking backwards rather than forwards. It was developed to implement the general data protection regulation, which has been discussed over many years. We are seeing this week just how fast-moving the world is. These are the kind of ideas that should have been driving the Bill in the first place.
Exactly. My hon. Friend makes such a good point. The challenge with the way that Her Majesty’s Government have approached the Bill is that they have taken a particular problem—that we are heading for the exit door of Europe, so we had better ensure that we get a data-sharing agreement in place, or it will be curtains for Britain’s services exports—and said, “We’d better find a way of incorporating the GDPR into British law as quickly as possible.” They should have thought imaginatively and creatively about how we strengthen our digital economy, and how we protect freedoms, liberties and protections in this new world, going back to first principles and thinking through the consequences. What we have is not quite a cut-and-paste job—I will not describe it in that way—but neither is it the sophisticated exercise in public law making that my hon. Friend describes as more virtuous.
I want to give the Committee a couple of examples of why this is so serious, as sometimes a scenario or two can help. Let us take an individual whom we will call “Mr A”. He is a 56-year-old man applying for website development roles. Typically, if someone is applying for jobs in a particular sector, those jobs will be advertised online. In fact, many such roles are advertised only online, and they target users only in the age profile 26 to 35, through digital advertising or social media networks, whether that is Facebook, LinkedIn, or others. Because Mr A is not in the particular age bracket being targeted, he never sees the ad, as it will never pop up on his news feed, or on digital advertising aimed at him. He therefore does not apply for the role and does not know he is being excluded from applying for the role, all as a consequence of him being the wrong age. Since he is excluded from opportunities because of his age, he finds it much harder to find a role.
The Equality Act, which was passed with cross-party consensus, prohibits less favourable treatment because of age—direct discrimination—including in relation to recruitment practices, and protects individuals based on their age. The Act sets out a number of remedies for individuals who have been discriminated against in that way, but it is not clear how the Bill proposes to correct that sin. Injustices in the labour market are multiplying, and there is a cross-party consensus for a stronger defence of workers. In fact, the Member of Parliament for the town where I grew up, the right hon. Member for Harlow (Robert Halfon), has led the argument in favour of the Conservative party rechristening itself the Workers’ party, and the Labour party was founded on a defence of labour rights, so I do not think this is an especially contentious matter. There is cross-party consensus about the need to stand up for workers’ rights, particularly when wages are stagnating so dramatically.
We are therefore not divided on a point of principle, but the Opposition have an ambition to do something about this growing problem. The Bill could be corrected in a way that made a significant difference. There is not an argument about the rights that are already in place, because they are enshrined in the Equality Act, with which Members on both sides of the House agree. The challenge is that the law as it stands is deficient and cannot be applied readily or easily to automated decision making.
My right hon. Friend is making a powerful case about the importance of the Equality Act in respect of the Bill, but may I offer him another example? He mentioned the Amazon warehouse where people are tracked at work. We know that agencies compile lists of their more productive workers, whom they then use in other work, and of their less productive workers. That seems like a form of digital blacklisting, and we all know about the problems with blacklisting in the construction industry in the 1980s. I suggest that the new clauses are a great way of combating that new digital blacklisting.
My hon. Friend gives a brilliant example. The point is that employment agencies play an incredibly important role in providing workers for particular sectors of the economy, from hotels to logistics, distribution and construction. The challenge is that the areas of the economy that have created the most jobs in the 10 years since the financial crash are those where terms and conditions are poorest, casualisation is highest and wages are lowest—and they are the areas where productivity is poorest, too. The Government could take a different kind of labour market approach that enhanced productivity and wages, and shut down some of the bad practices and casualisation that are creating a problem.
As it happens, the Government have signed up to some pretty big ambitions in that area. Countries around the world recently signed up to the UN sustainable development goals. Goal 8 commits the Government to decent work and economic growth, and goal 10 commits them to reducing inequality. However, when I asked the Prime Minister what she was doing about that, my question was referred to Her Majesty’s Treasury and the answer that came back from the Chancellor was, “We believe in raising productivity and growth.” The way to raise productivity and growth is to ensure that there are good practices in the labour market, because it is poor labour market productivity that is holding us back as a country.
If digital blacklisting or casualisation were to spread throughout the labour market in the sectors that happen to be creating jobs, there would be no increase in productivity and the Government would be embarked on a self-defeating economic policy. Although these new clauses may sound technical, they have a bearing on a much more important plank of the Government’s economic development strategy.
Our arguments are based on principles that have widespread support on both sides of the House, and they are economically wise. Any costs of the new clauses will be more than outweighed by the benefits they will deliver. I commend them to the Minister and I hope she will take them on board.
I want to add some further comments in support of the new clauses.
The Science and Technology Committee, one of the two Committees that I sit on, has had a detailed debate on algorithmic fairness. It is important to understand what the new clauses seek to do. There is a nervousness about regulating algorithms or making them completely transparent, because there are commercial sensitivities in the code, whether or not it is published.
These new clauses seek to put the obligation on to the human beings who produce the algorithms to think about things such as equalities law, to ensure that we do not hardcode biases into them, as my hon. Friend the Member for Cambridge said on Second Reading. It is important to understand how the new clauses apply to the inputs, to what happens in the black box of the algorithm, and to the outputs. The inputs to an algorithm are the rules that a human codes and the data that they put into it for it to make a decision.
The new clauses seek to say that the human must be under a consistent legal obligation to understand the equalities impact of their coding and data entry into the black box of the algorithm, to avoid biases coming out at the other end. As algorithms are increasingly used, that is an important technical distinction to understand, and it is why the new clauses are very sensible. On that basis, I hope the Government will support them.
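The kind of obligation the new clauses describe can be made concrete with a small sketch: an audit of an algorithm’s outputs that compares selection rates across groups defined by a protected characteristic. The code below is a hypothetical illustration; the “four-fifths” threshold is a common heuristic for flagging disparate impact, not anything prescribed by the new clauses, and all names are invented.

```python
from collections import defaultdict

# Hypothetical sketch of a bias check that an Algorithmic Impact Assessment
# might run over an algorithm's outputs: compare the rate of positive
# outcomes across groups and flag disparity under the four-fifths heuristic.

def selection_rates(decisions: list[tuple[str, bool]]) -> dict[str, float]:
    """decisions: (group, was_selected) pairs produced by the algorithm."""
    totals, selected = defaultdict(int), defaultdict(int)
    for group, was_selected in decisions:
        totals[group] += 1
        selected[group] += was_selected
    return {g: selected[g] / totals[g] for g in totals}

def disparate_impact(decisions: list[tuple[str, bool]], threshold: float = 0.8) -> bool:
    """True if any group's selection rate falls below `threshold` times the
    highest group's rate: a signal that the system needs investigation."""
    rates = selection_rates(decisions)
    best = max(rates.values())
    return any(rate < threshold * best for rate in rates.values())

# Example: an automated shortlisting tool's outputs by age band.
outcomes = [("under_35", True)] * 40 + [("under_35", False)] * 10 \
         + [("over_50", True)] * 15 + [("over_50", False)] * 35
print(disparate_impact(outcomes))  # True: 30% selection rate vs 80%
```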
Thank you, Mr Streeter, and what a wonderful birthday present it is to be serving on the Committee.
It is a joy, actually, to be able to agree with the Opposition on the principle that equality applies not only to decisions made by human beings or with human input, but to decisions made solely by computers and algorithms. On that, we are very much agreed. The reason that we do not support the new clauses is that we believe that the Equality Act already protects workers against direct or indirect discrimination by computer or algorithm-based decisions. As the right hon. Member for Birmingham, Hodge Hill rightly said, the Act was passed with cross-party consensus.
The Act is clear that in all cases, the employer is liable for the outcome of any of their actions, or those of their managers or supervisors, or those that are the result of a computer, algorithm or mechanical process. If, during a recruitment process, applications from people with names that suggest a particular ethnicity were rejected for that reason by an algorithm, the employer would be liable for race discrimination, whether or not they designed the algorithm with that intention in mind.
The right hon. Gentleman placed a great deal of emphasis on advertising, and again we share his concerns that employers could seek to treat potential employees unfairly and unequally. The Equality and Human Rights Commission publishes guidance for employers to ensure that there is no discriminatory conduct and that the way employers advertise posts makes clear that there is fair and open access to employment opportunities.
The same principle applies in the provision of services. An automated process that intentionally or unintentionally denies a service to someone because of a protected characteristic will lay the service provider open to a claim under the Act, subject to any exceptions.
I am grateful to the Minister for giving way, not least because it gives me the opportunity to wish her a happy birthday. Could she remind the Committee how many prosecutions there have been for discriminatory advertising because employers chose to target their adverts?
If I may, I will write to the right hon. Gentleman with that precise number, but I know that the Equality and Human Rights Commission is very clear in its guidance that employers must act within the law. The law is very clear that there are to be no direct or indirect forms of discrimination.
The hon. Member for Cambridge raised the GDPR, and talked about looking forwards not backwards. Article 5(1)(a) requires processing of any kind to be fair and transparent. Recital 71 draws a link between ensuring that processing is fair and minimising discriminatory effects. Article 35 of the GDPR requires controllers to undertake data protection impact assessments for all high-risk activities, and article 36 requires a subset of those impact assessments to be sent to the Information Commissioner for consultation prior to the processing taking place. The GDPR also gives data subjects the tools to understand the way in which their data has been processed. Processing must be transparent, details of that processing must be provided to every data subject, whether or not the data was collected directly from them, and data subjects are entitled to a copy of the data held about them.
When automated decision-making is engaged there are yet more safeguards. Controllers must tell the data subject, at the point of collecting the data, whether they intend to make such decisions and, if they do, provide meaningful information about the logic involved, as well as the significance and the envisaged consequences for the data subject of such processing. Once a significant decision has been made, that must be communicated to the data subject, and they must be given the opportunity to object to that decision so that it is re-taken by a human being.
We would say that existing equality law and data protection law are remarkably technology-agnostic. Controllers cannot hide behind algorithms, but equally they should not be prevented from making use of them when they can do so in a sensible, fair and productive way.
Going back to the point raised by my right hon. Friend, I suspect that the number of cases will prove to be relatively low. The logic of what the Minister is saying would suggest that there is no algorithmic unfairness going on out there. I do not think that that is the case. What does she think?
I would be guided by the view of the Equality and Human Rights Commission, which oversees conduct in this area. I have no doubt that the Information Commissioner and the Equality and Human Rights Commission are in regular contact. If they are not, I very much hope that this will ensure that they are.
We are clear in law that there cannot be such discrimination as has been discussed. We believe that the framework of the law is there, and that the Information Commissioner’s Office and the Equality and Human Rights Commission, with their respective responsibilities, can help, advise and cajole, and, at times, enforce the law accordingly. I suspect that we will have some interesting times ahead of us with the release of the gender pay gap information. I will do a plug now, and say that any company employing more than 250 employees should abide by the law by 4 April. I look forward to reviewing the evidence from that exercise next month.
Our view is that new clauses 7 and 8 cover matters already dealt with in law, and that new clauses 9 to 11 would create an entirely new regulatory structure just for computer-assisted decision-making in the workplace, layered on top of the existing requirements of both employment and data protection law. We want the message to employers to be clear: there is no distinction between the types of decision-making. They are responsible for it, whether a human being was involved or not, and they must ensure that their decisions comply with the law.
Having explained our belief that the existing law meets the concerns raised by the right hon. Member for Birmingham, Hodge Hill, I hope he will withdraw the new clause.
It was in “Candide” that Voltaire introduced us to Dr Pangloss, from whom we get the word “Panglossian”, and we have heard a rather elegant and Panglossian description of a perfect world in which all is fine in the labour market. I am much more sceptical than the Minister. I do not think the current law is sufficiently sharp, and I am concerned that the consequence will be injustice for our constituents.
The Minister raised a line of argument that it is important for us to consider. The ultimate test of whether the law is good enough must be what is actually happening out there in the labour market. I do not think it is good enough; she thinks it is fine. On the nub of the argument, a few more facts might be needed on both sides, so we reserve the right to come back to the issue on Report. This has been a useful debate. I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
New Clause 13
Review of Electronic Commerce (EC Directive) Regulations
“(1) The Secretary of State shall lay before both Houses of Parliament a review of the application and operation of the Electronic Commerce (EC Directive) Regulations 2002 in relation to the processing of personal data.
(2) A review under subsection (1) shall be laid before Parliament by 31 January 2019.”—(Liam Byrne.)
This new clause would require the Secretary of State to review the application and operation of the Electronic Commerce (EC Directive) Regulations 2002 in relation to the processing of personal data and lay that review before Parliament by 31 January 2019.
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
This is not normally my practice, but let me raise another area that is subject to a measure of cross-party consensus. There is widespread recognition that the e-commerce directive, which is used to regulate information society service providers, is hopelessly out of date. It was agreed in 2000. In effect, it allows information society service providers to be treated as platforms rather than publishers. Since then, we have seen the growth of big tech and the new data giants that now dominate the digital economy, and they are misbehaving. Worse, they have become platforms for hate speech, social division and interference in democracy. It was intriguing to hear Mark Zuckerberg himself admit in the interview he gave yesterday that Facebook was indeed being used to try to corrupt elections. That is an extraordinary admission by the head of one of the most important firms in the world.
The Secretary of State for Digital, Culture, Media and Sport reminded us as recently as this morning that, as we come out of the European Union, we will have a new opportunity to update the e-commerce directive. The House must then put in place a new framework to regulate information society service providers in a new way. A debate is raging among our neighbours about what steps we need to take to shut down the hate speech that is dividing communities, and we need to get into that debate quickly. Germany recently passed laws that require companies such as Facebook to take down hate speech within a very short time window or face fines of up to €50 million, and Ireland has created a new regulator to provide a degree of oversight. We are falling behind some of our most important neighbours, who now lead this debate.
I began looking at this issue when I started researching new techniques in ISIS propaganda. In the excellent Scotland Yard counter-terrorism internet referral unit, I saw propaganda put together with the slickness of a pop video to incite people to commit the most heinous crimes, such as the one we commemorate today. Yet I think we all recognise that organisations such as Facebook and YouTube are simply not working quickly enough to take down that kind of material, which we do not want people to see. I congratulate The Times, which has run a forensic campaign to shine a light on some of that bad practice. It is good finally to see advertisers such as Unilever beginning to deliver a tougher message to social media platforms that enough is enough.
We know we have to modernise those regulations. The commercial world and politicians on both sides are saying, “Enough is enough.” We all fear the consequences of things going wrong with respect to the destabilisation of democracy in America—but not just in America. We have seen it across the Baltics, in France, in Germany, across southern Europe and in eastern Europe. Among our NATO allies, we can see a vulnerability to our enemies using social media platforms to sow division.
I agree with everything the right hon. Gentleman has said, except that I do not think the Bill is the place for his proposals. The e-commerce directive and the Electronic Commerce (EC Directive) Regulations 2002, which transpose it into UK law, regulate services that are
“normally provided for remuneration, at a distance, by means of electronic equipment…and at the individual request of a recipient of a service”.
Those services are known as information society services.
However, questions relating to the processing of personal data by information society services are excluded from the scope of the e-commerce directive and hence excluded from the scope of the 2002 regulations. That is because the processing of personal data is regulated by other instruments, including, from May, the GDPR. The review of the application and operation of the 2002 regulations solely in relation to the processing of personal data, as proposed by new clause 13, would therefore be a speedy review to undertake.
However, that does not address the substance of the right hon. Gentleman’s concern, which we have already discussed in a delegated legislation Committee earlier this month. As I said then, the Government are aware of his concern that the e-commerce directive, finalised in 2000, is now outdated, in particular with regard to its liability provisions.
Those provisions limit, in specified circumstances, the liability that service providers have for the content on their sites, including social media platforms where they act as hosts. Social media companies have made progress on a voluntary basis, removing some particularly harmful content more quickly and, in recent years, more consistently. However, as we have seen in the case of National Action and its abhorrent YouTube videos, and in many other lower-profile cases, there is a long way to go. We do not rule out legislation.
The Government have made it clear through our digital charter that we are committed to making the UK the safest place to be online, as well as the best place to grow a digital business. As the Prime Minister has said, when we leave the EU we will be leaving the digital single market, including the e-commerce directive. That gives us an opportunity to make sure that we get matters right for the modern age: supporting innovation and growth, and the use of modern technology, but doing so in a way that commands the confidence of citizens, protects their rights and makes those rights as enforceable online as they currently are offline.
The UK will be leaving the digital single market, but we will continue to work closely with the EU on digital issues as we build on our existing strong relationship in the future economic partnership. We will work closely with a variety of partners in Europe and further afield. Alongside that, our internet safety strategy will tackle the removal of harmful but legal content. Through the introduction of a social media code of practice and an annual transparency report, we will place companies under an obligation to respond quickly to user reports and to ensure that their moderation processes are fit for purpose, with statutory backing if required. We have demonstrated that approach with the introduction of age verification for online pornography.
There is an important debate to be had on the e-commerce directive and on platform liability, and we are committed to working with others, including other countries, to understand how we can make the best of existing frameworks and definitions. Consideration of the Bill in Committee and on Report are not the right places for that wide debate to be had. For those reasons, I request that the right hon. Gentleman withdraw the clause.
I admire the Minister’s concern and ambition for administrative tidiness. She reminds me of an old quote from Bevin, who once said, “If you are a purist, the place for you is not a Parliament; it is a monastery.”
In the case of the Minister, a nunnery, although Bevin was less enlightened than the hon. Lady. Here is a Bill; here is a new clause; the new clause is within scope. The object of the new clause is to deliver a Government objective, yet it is rejected. That logic is hard to follow. We have had the tremendous assurance, however, that there will be nothing less than a code of practice, so these huge data giants will be shaking in their boots in California when they wake up. They will be genuinely concerned and no doubt already planning how they can reform their ways and stop the malpractice that we have grown all too used to. I am afraid that this amounts to a collection of warm words, when what the country needs is action. With that in mind, I will push the new clause to a vote.
Question put, That the clause be read a Second time.
I beg to move, That the clause be read a Second time.
This is another entirely sensible new clause, which I hope the Government will take on board, either at this stage or on Report. Earlier in Committee, we rehearsed the debate about the reality and challenges of education providers now collecting, managing and often losing significant amounts of very personal data relating to children.
Any of us who has children at school will know the joys of ParentPay, which means that schools are collecting biometric data on our children. We know that schools are keeping exam results and all kinds of records and evaluations about our children online. Given the complexity of the GDPR, the costs and questions around implementing it, and the complexity of the education system, we urgently need a code of practice that schools can draw on to help them get the GDPR right, and to help our educators keep our children’s data safer than it is today.
In my argument, I will draw on the excellent contribution made on Second Reading by my noble Friend, Lord Knight, who said:
“Schools routinely use commercial apps for things such as recording behaviour, profiling children, cashless payments, reporting”
and so on. My noble Friend has long been an advocate of that kind of thing, but the point is that he knows, and the other place recognised, that the way school information systems operate means they are often cloud based and integrated into all sorts of other data systems. There will often be contracts in place with all sorts of education service providers, which will entail the transfer of data between, for example, a school and a third party. It could well be that that third party is based overseas. As my noble Friend said:
“Schools desperately need advice on GDPR compliance to allow them to comply with this Bill when it becomes law.”—[Official Report, House of Lords, 10 October 2017; Vol. 785, c. 185.]
Lord Storey rode in behind my noble Friend, saying that
“young people probably need more protection than at any other time in our recent history.”—[Official Report, House of Lords, 10 October 2017; Vol. 785, c. 170.]
That is not something that has been debated only by the other place. UNICEF recently published a working paper entitled “Privacy, protection of personal information and reputation rights” and said it was now
“evident that children’s privacy differs both in scope and application from adults’ privacy”
but that they experience more threats than any other group. The “Council of Europe Strategy for the Rights of the Child (2016-2021)” echoed the same sentiment and observed:
“Parents and teachers struggle to keep up with technological developments”.
I have a number of friends who are teachers and headteachers. They listen to me in horror when I explain that I am the shadow Minister for the Data Protection Bill, because they know this is looming and they are absolutely terrified of it. Why is that? Because they are good people and good educators; they go into teaching because they want to change the world and change children’s lives, and they recognise the new obligations that are coming, but they also recognise the realities of how their schools operate today. Those people know about the proliferation of data that they and their staff are collecting. They know about the dangers and risks of that data leaking—not least because most teachers I know who have some kind of pastoral care responsibility seem to spend half their time advising children about what not to do with social media apps and what not to post. They are often drawn into disputes that rage out of control on social media platforms such as Instagram.
Teachers are very alert to the dangers of this new world. They are doing a brilliant and innovative job of supporting children through it, but they are crying out now for good guidance to help them to implement the GDPR successfully.
I echo my right hon. Friend’s points. My daughter is seven years old. I have an app on my phone that, at any time of the day, will tell me what she is doing at school. Her attendance, reward system, and school meal requirements are all recorded on it, and I can access it at any time. The school she goes to wants to keep a connection with parents, so that parents can interact comfortably. The new clause would go a long way towards allowing schools to keep that link, because the default position of schools, as I am sure my right hon. Friend would agree, is to protect children, even if that means not sharing information in the way that they would like to.
That sounds like a terrifying application; my hon. Friend’s daughter very much has my sympathies. He is absolutely right. Lord Knight made this point with such power in the other place. The technology is advancing so quickly, and schools know that if they can monitor things in new, more forensic ways, that helps them to do their job of improving children’s education. However, it has costs and consequences too. I hope that Her Majesty’s Government will look sympathetically on the task of teachers, as they confront this 200-and-heaven-knows-what-page Bill.
Does my right hon. Friend share my concerns that, in response to a number of written parliamentary questions that I tabled, it became clear that the Government gave access to the national pupil database, which is controlled by the Government, to commercial entities, including newspapers such as The Daily Telegraph?
Yes. My hon. Friend has done an extraordinary job of exposing that minor scandal. I am surprised that it has not had more attention in the House, but once the Bill has passed, it is exactly the kind of behaviour that we can, I hope, begin to police rather more effectively.
I am sure that Ministers will recognise that there is a need for this. No doubt their colleagues in the Department for Education are absolutely all over it. I was talking to a headteacher in the Minister’s own constituency recently—an excellent headteacher, in an excellent school, who is a personal friend. The horror with which headteachers regard the arrival of the GDPR is something to behold. Heaven knows, our school leaders and our teachers have enough to do. I call on Ministers to make their task, their lives, and their mission that bit easier by accepting the new clause.
Our schools handle large volumes of sensitive data about the children they educate. Anyone who has any involvement with the education system, either personally through their families, on their mobile phone apps, or in a professional capacity as constituency MPs, is very conscious of the huge responsibilities that school leaders have in handling that data properly and well, and in accordance with the law. As data controllers in their own right, schools and other organisations in the education system will need to ensure that they have adequate data-handling policies in place to comply with their legal obligations under the new law.
Work is going on already. The Department for Education has a programme of advice and education for school leaders, covering blogs, a guidance video, speaking engagements and work to encourage system suppliers to be proactive in helping schools to become GDPR-compliant. Research is also being undertaken with parents on model privacy notices that will help schools to make parents and pupils more aware of the data about children used in the sector. The Department for Education is also shaping a toolkit that will bring together various pieces of guidance and best practice to address the specific needs of those who process education data. In parallel, the Information Commissioner has consulted on guidance specifically addressing the fair and lawful processing of children’s data. Everyone is very alive to the issue of protecting children and their data.
At this point, the Government want to support the work that is already under way and the provisions on guidance that are already in the Bill. Our concern is that legislating for a code now could be seen as a reason for schools to wait and see, rather than continuing their preparations for the new law. It may be that in due course the weight of argument swings in favour of a sector-specific code of practice. That can happen without provision in the Bill, because clause 128 provides that the Secretary of State may require the Information Commissioner to prepare additional codes of practice for the processing of personal data, and the commissioner can issue further guidance under her own steam, using her powers under article 57 of the GDPR, without needing any direction from the Secretary of State.
I hope that the ongoing work reassures the right hon. Gentleman and that he will withdraw the new clause at this stage.
I am reassured by that and I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
New Clause 17
Personal data ethics advisory board and ethics code of practice
‘(1) The Secretary of State must appoint an independent Personal Data Ethics Advisory Board (“the board”).
(2) The board’s functions, in relation to the processing of personal data to which the GDPR and this Act apply, are—
(a) to monitor further technical advances in the use and management of personal data and their implications for the rights of data subjects;
(b) to monitor the protection of the individual and collective rights and interests of data subjects in relation to their personal data;
(c) to ensure that trade-offs between the rights of data subjects and the use and management of personal data are made transparently, inclusively, and with accountability;
(d) to seek out good practices and learn from successes and failures in the use and management of personal data;
(e) to enhance the skills of data subjects and controllers in the use and management of personal data.
(3) The board must work with the Commissioner to prepare a data ethics code of practice for data controllers, which must—
(a) include a duty of care on the data controller and the processor to the data subject;
(b) provide best practice for data controllers and processors on measures which, in relation to the processing of personal data—
(i) reduce vulnerabilities and inequalities;
(ii) protect human rights;
(iii) increase the security of personal data; and
(iv) ensure that the access, use and sharing of personal data is transparent, and the purposes of personal data processing are communicated clearly and accessibly to data subjects.
(4) The code must also include guidance in relation to the processing of personal data in the public interest and the substantial public interest.
(5) Where a data controller or processor does not follow the code under this section, the data controller or processor is subject to a fine to be determined by the Commissioner.
(6) The board must report annually to the Secretary of State.
(7) The report in subsection (6) may contain recommendations to the Secretary of State and the Commissioner relating to how they can improve the processing of personal data and the protection of data subjects’ rights by improving methods of—
(a) monitoring and evaluating the use and management of personal data;
(b) sharing best practice and setting standards for data controllers; and
(c) clarifying and enforcing data protection rules.
(8) The Secretary of State must lay the report made under subsection (6) before both Houses of Parliament.
(9) The Secretary of State must, no later than one year after the day on which this Act receives Royal Assent, lay before both Houses of Parliament draft regulations in relation to the functions of the Personal Data Ethics Advisory Board as listed in subsections (2), (3), (4), (6) and (7) of this section.
(10) Regulations under this section are subject to the affirmative resolution procedure.’—(Darren Jones.)
This new clause would establish a statutory basis for a Data Ethics Advisory Board.
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
New clause 17 is in my name and that of my right hon. Friend the Member for Birmingham, Hodge Hill. I do not take it personally that my other hon. Friends have not signed up to it; that was probably my fault for not asking them to do so in advance.
The new clause would put on a statutory footing the data and artificial intelligence ethics unit, which I am very pleased the Government have now funded and established, through the spring statement, in the Minister’s Department. It comes off the back of conversations with the Information Commissioner in Select Committee about the differing roles of enforcing legislation and of having a public debate about what is right and wrong and where the boundaries lie in this ever-changing space. The commissioner was very clear that we need to have that debate with the public, but that it is not for her to lead it. The ICO is an enforcer of legislation. The commissioner has a lot on her plate and is challenged by her own resources as it is. She felt that the new unit in the Department would be a good place to have the debate about technology ethics, and I support that assertion.
With no disrespect to any colleagues, I do not think that the House of Commons, and perhaps even the Select Committees to a certain extent, necessarily has the time, energy or resource to get into the real detail of some of the technology ethics questions, nor to take them out to the public, who are the people we need to be having the debate with.
The new clause would therefore establish in law the monitoring, understanding and public debate obligations that I, the ICO and others agree ought to sit with the new data ethics unit, while making it clear that enforcement is reserved for the Information Commissioner. I tabled the new clause because, although I welcome the Government’s commitment to the data and AI ethics unit, I feel that there is potential for drift. The new clause would anchor the unit’s remit in technology ethics, so that it understands and communicates the ethical issues and does not get sidetracked into other matters, although it may take those on in addition.
Also, I recognise that the Minister and the Secretary of State supported the recommendation made previously under the Cameron Government and I welcome that, but of course, with an advisory group within the Department, it may be a future Minister’s whim that they no longer wish to be advised on these issues, or it may be the whim of the Treasury—with, potentially, budget cuts—that it no longer wishes to fund the people doing the work. I think that that is not good enough and that putting this provision in the Bill would give some security to the unit for the future.
I will refer to some of the comments made about the centre for data ethics and innovation, which I have been calling the data and AI ethics unit. When it was first discussed, in the autumn Budget of November 2017, the Chancellor of the Exchequer said that the unit would be established
“to enable and ensure safe, ethical and ground-breaking innovation in AI and data-driven technologies. This world-first advisory body will work with government, regulators and industry to lay the foundations for AI adoption”.
Although that is a positive message, it says to me that the unit’s job is to lay the foundations for AI adoption. I agree with that as an aim, but it does not put at the unit’s core the understanding and communication of the ethical challenges that we need to grapple with and legislate for.
I move on to the recruitment advertising from January of this year for personnel to run the unit, which said that the centre will be at the heart of plans to make the UK the best place in the world for AI businesses. Again, that is a positive statement, but one about AI business adoption in this country, not ethical requirements. It also said that the centre would advise on ethical and innovative uses of data-driven technology. Again, that is positive, but it does not quite get to the heart of understanding, communicating and debating the ethics.
My concern is that, while all of this is very positive—and I agree with the Government that we need to maintain our position as a world leader in artificial intelligence and, especially as we go through the regrettable process of leaving the European Union and the single market, to hold on to the strengths we have in the British economy—this week has shown that there is a need for an informed public debate on ethics. As no doubt all members of the Committee have read in my New Statesman article of today, one of the issues we have as the voice of our constituents in Parliament is that, for our constituents to understand or take a view on what is right or wrong in this quickly developing space, we all need to understand it in the first place—to understand what is happening with our data and in the technology space, to understand what is being done with it and, having understood it, then to take a view about it. The Cambridge Analytica scandal has been so newsworthy because the majority of people understandably had no idea that all this was happening with their data. How we legislate for and set ethical frameworks must first come from a position of understanding.
That is why the new clause sets out that there should be an independent advisory board. The use of such boards is commonplace across Departments and I hope that would not be a contentious question. Subsection (2) talks about some of the things that that board should do. The Minister will note that the language I have used is quite careful in looking at how the board should monitor developments, monitor the protection of rights and look out for good practice. It does not seek to step on the toes of the Information Commissioner or the powers of the Government, but merely to understand, educate and inform.
The new clause goes on to suggest that the new board would work with the commissioner to put together a code of practice for data controllers. A code of practice grounded in technology ethics is important because it says to every data controller, regardless of the type of work they do, that we require ethical boundaries to be set and understood in the culture of what we do with big data analytics in this country. In working with the commissioner, the board would add great value to the way we work with people’s personal data by setting out that code of practice.
I hope that the new clause adds value to the work that the Minister’s Department is already doing. My hope is that adding it to the Bill—albeit that current Parliaments cannot, of course, bind their successors, and it could be legislated away in future—would give a solid grounding to the idea that we take technology ethics seriously; that we seek to understand these issues properly, not as politicians or busy civil servants but through experts who can be out with our stakeholders understanding the public policy consequences; and that we seek to have a proper debate with the public, working with enforcers such as the ICO to set, in this wild west, the boundaries of what is and is not acceptable. I commend the new clause to the Committee and hope that the Government will support it.
I thank the hon. Gentleman for raising this very important subject. He is absolutely right: data analytics have the potential to transform whole sectors of society and the economy—law enforcement and healthcare, to name but two. I agree with him that a public debate around these issues is required, and that is one of the reasons why the Government are creating the centre for data ethics and innovation, which he mentioned. The centre will advise the Government and regulators on how they can strengthen and improve the way that data and AI are governed, as well as supporting the innovative and ethical use of data.
I thank the Minister for her co-operative words and for the invitation to be part of this developing area of public policy. Having already plugged my New Statesman article, I will plug one part of it: the news that, having worked with some of the all-party parliamentary groups, we will launch a commission on technology ethics with one of the Minister’s colleagues, whose constituency I cannot quite remember, I am afraid, so I cannot make reference to him. But he is excellent.
We look forward to working with industry, stakeholders and politicians on a cross-party basis to get into the debate about technology ethics. I accept the Minister’s warm words about co-operating positively on this issue, and I hope the outcomes of the commission can help to influence the work of the unit, or centre, and the Government’s response to it.
I would like this new unit to be given a statutory basis, to show its importance. It is vital that it has clout across Government and across Departments, so that it does not depend on having Ministers who are willing to take part in and listen to this debate, but continues under successive Ministers, should the current Minister be promoted, and under future Governments too. However, in return for the Minister’s warm words of co-operation, I am happy not to press the new clause to a vote today.
Very briefly, I declare an interest as the chair of the all-party parliamentary group on data analytics. This is a subject, of course, that is very dear to our hearts. I will just say that there is a great deal of common ground on it. I commend my hon. Friend the Member for Bristol North West for trying to put it into the Bill, because I, too, think it needs to be put on a statutory basis. However, I will just draw attention to a lot of the very good work that has been done by a whole range of people in bringing forward the new structures.
I will just say again that in general I think we are heaping a huge amount of responsibility on the Information Commissioner; frankly, we are now almost inviting her to save the world. She and her office will need help. So an additional body, with resources, is required.
The Royal Society and the British Academy have done a lot of work on this issue over the last few years. I will conclude by referring back to a comment made by the hon. Member for Gordon, because it is worth saying that the Royal Society and the British Academy state in the conclusions of their report:
“It is essential to have a framework that engenders trust and confidence, to give entrepreneurs and decision-makers the confidence to act now, and to realise the potential of new applications in a way that reflects societal preferences.”
That is exactly the kind of thing we are trying to achieve. This body is essential and it needs to be set up as quickly as possible.
I beg to ask leave to withdraw the new clause.
Clause, by leave, withdrawn.
New Clause 20
Automated number plate recognition (No. 2)
“(1) Vehicle registration marks captured by automated number plate recognition systems are personal data.
(2) The Secretary of State shall issue a code of practice in connection with the operation by the police of automated number plate recognition systems.
(3) Any code of practice under subsection (2) shall conform to section 67 of the Police and Criminal Evidence Act 1984.”—(Liam Byrne.)
This new clause requires the Secretary of State to issue a code of practice in connection with the operation by the police of automated number plate recognition systems; vehicle registration marks captured by such systems are to be considered personal data, in line with the opinion of the Information Commissioner.
Brought up, and read the First time.
I beg to move, That the clause be read a Second time.
I will touch on this new clause only very briefly, because I hope the Minister will put my mind at rest with a simple answer. For some time, there has been concern that data collected by the police through automatic number plate recognition technology is not adequately ordered, organised or policed by a code of practice. A code of practice is probably required to put the police well and truly within the boundaries of the Police and Criminal Evidence Act 1984, the Data Protection Act 1998 and the Bill.
With this new clause, we are basically asking the Secretary of State, under subsection (2), to issue a code of practice in connection with the operation by the police of ANPR systems, and we ask that it conform to section 67 of the Police and Criminal Evidence Act 1984. I hope the Minister will just say that a code of practice is on the way, so that we can safely withdraw the new clause.
I hope Committee members have had the chance to see my response to the questions of the hon. Member for Sheffield, Heeley on Tuesday about ANPR, other aspects of surveillance and other types of law enforcement activity.
I assure the right hon. Member for Birmingham, Hodge Hill that ANPR data is personal data and is therefore caught by the provisions of the GDPR and the Bill. We recognise the need to ensure the use of ANPR is properly regulated. Indeed, ANPR systems are governed by not one but two existing codes of practice. The first is the code issued by the Information Commissioner, exercising her powers under section 51 of the Data Protection Act 1998. It is entitled “In the picture: A data protection code of practice for surveillance cameras and personal information”, and was published in June 2017. It is clear that it covers ANPR. It also refers to data protection impact assessments, which we debated last week. It clearly states that where the police and others use or intend to use an ANPR system, it is important that they
“undertake a privacy impact assessment to justify its use and show that its introduction is proportionate and necessary.”
The second code is issued under section 29 of the Protection of Freedoms Act 2012, which required the Secretary of State to issue a code of practice containing guidance about surveillance camera systems. The “Surveillance camera code of practice”, published in June 2013, already covers the use of ANPR systems by the police and others. It sets out 12 guiding principles for system operators, and privacy is very much a part of that. The Protection of Freedoms Act established the office of the Surveillance Camera Commissioner, who has a number of statutory functions in relation to the code, including keeping its operation under review.
In addition, a published memorandum of understanding between the Surveillance Camera Commissioner and the Information Commissioner sets out how they will work together. We also have the general public law principles of the Human Rights Act 1998 and the European convention on human rights. I hope that the two codes I have outlined, the Protection of Freedoms Act and the Human Rights Act reassure the right hon. Gentleman, and that he will withdraw his new clause.
I am indeed mollified. I beg to ask leave to withdraw the clause.
Clause, by leave, withdrawn.
New Clause 21
Targeted dissemination disclosure notice for third parties and others (No. 2)
“In Schedule 19B to the Political Parties, Elections and Referendums Act 2000 (Power to require disclosure), after paragraph 10 (documents in electronic form) insert—
10A (1) This paragraph applies to the following organisations and individuals—
(a) a recognised third party (within the meaning of Part 6);
(b) a permitted participant (within the meaning of Part 7);
(c) a regulated donee (within the meaning of Schedule 7);
(d) a regulated participant (within the meaning of Schedule 7A);
(e) a candidate at an election (other than a local government election in Scotland);
(f) the election agent for such a candidate;
(g) an organisation or a person notified under sub-paragraph (2) of this paragraph;
(h) an organisation or individual formerly falling within any of paragraphs (a) to (g); or
(i) a person who is the treasurer, director or another officer of an organisation to which this paragraph applies, or who has been at any time in the period of five years ending with the day on which the notice is given.
(2) The Commission may under this paragraph issue at any time a targeted dissemination disclosure notice, requiring disclosure of any settings used to disseminate material which it believes were intended to have the effect, or were likely to have the effect, of influencing public opinion in any part of the United Kingdom, ahead of a specific election or referendum, where the platform for dissemination allows for targeting based on demographic or other information about individuals, including information gathered by information society services.
(3) This power shall not be available in respect of registered parties or their officers, save where they separately and independently fall into one or more of categories (a) to (i) of sub-paragraph (1).
(4) A person or organisation to whom such a targeted dissemination disclosure notice is given shall comply with it within such time as is specified in the notice.”
This new clause would amend the Political Parties, Elections and Referendums Act 2000 to allow the Electoral Commission to require disclosure of settings used to disseminate material where the platform for dissemination allows for targeting based on demographic or other information about individuals.—(Liam Byrne.)
Brought up, and read the First time.
With this it will be convenient to discuss new clause 22—Election material: personal data gathered by information society services—
“In section 143 of the Political Parties, Elections and Referendums Act 2000 (Details to appear on electoral material), leave out subsection (1)(b) and insert—
(b) in the case of any other material, including material disseminated through the use of personal data gathered by information society services, any requirements falling to be complied with in relation to the material by virtue of regulations under subsection (6) are complied with.”
This new clause would amend the Political Parties, Elections and Referendums Act 2000 to ensure that “any other material” clearly can be read to include election material disseminated through the use of personal data gathered by information society services.
I am happy to end on a note of cross-party consensus. We agree that we need to modernise our hopelessly outdated election laws. The news a couple of hours ago that the Information Commissioner’s application for a search warrant at Cambridge Analytica has been deferred—suspended until tomorrow—underlines the fact that the laws we have today for investigating malpractice that may impinge on the health of our democracy are hopelessly inadequate. The Information Commissioner declared to the world—for some reason on live television on Monday—that she was seeking a warrant to get into Cambridge Analytica’s office. Five days later there is still no search warrant issued by a court. Indeed, the court has adjourned the case until tomorrow.
I suspect that Cambridge Analytica has now had quite enough notice to do whatever it likes to the evidence that the Information Commissioner sought. This basket of clauses seeks to insert common-sense provisions to update the law in a way that will ensure that the data protection regime we put in place safeguards the health and wellbeing of our democracy. We need those because of what we now know about allegedly bad companies such as Cambridge Analytica, and because of what we absolutely know about bad countries such as Russia. We have been slow to wake up to the reality that, since 2012, Russia has been operating a new generation of active measures that seek to divide and rule its enemies.
There is no legal definition of hybrid war, so there is no concept of just war when it comes to hybrid war. There is no Geneva convention for hybrid war that defines what is good and bad, legal and illegal, but most legal scholars agree that a definition basically touches on a form of intervening against enemies in a way that is deniable and sometimes not traceable. It contains a basket of measures and includes the kind of tactics that we saw deployed in Crimea and Ukraine, which were of course perfected after the invasion of Georgia. We see it in the Baltics, and now we see it not just in America but across western Europe as well.
Such a technique—a kind of warcraft of active measures—has a very long history in Russia. Major-General Kalugin, the KGB’s highest-ranking defector, once described the approach as the “heart and soul” of Soviet intelligence. The challenge today is that that philosophy was comprehensively updated by General Gerasimov, the chief of the Russian general staff, and it came alongside a very different world view presented by President Putin after his re-election as President in 2012 and in his first state of the nation address in 2013. It was in that address that President Putin attacked what he called a de-Christianised, morally ambivalent west. He set out pretty categorically a foreign policy of contention rather than co-operation.
Since 2012, we have seen what is basically a history of tactical opportunism. Somewhat unlike in the Soviet era, we now have sometimes authorised groups and sometimes rogue groups seeking openings where they can and putting disruptive measures in place. They are most dangerous when they target the messiness of digital democracy. Here we have a kind of perfection of what I have called in the past a dark social playbook—for example, hackers such as Cozy Bear or Fancy Bear attacked the Democratic National Committee during the American elections.
We also have a partnership with useful idiots such as WikiLeaks, and an unholy alliance with what are politely called fake news sites, such as Westmonster, or indeed Russia Today or Breitbart, which spread hatred. We have a spillover into Twitter. Once a row is brewing on Twitter, we get troll farms such as the Internet Research Agency in St Petersburg kicking in; half of the tweets about NATO in the Baltics are delivered by robo-trolls out of Russia. It is on an absolutely enormous scale. Once the row is cooking on Twitter, the material is imported into Facebook groups—private groups and dark groups—and it is perfectly possible to switch on dark money behind ads circulating the hate material to thousands upon thousands, if not millions, of people.
We know that that was standard practice in the German and French elections. There is a risk—we do not know what the risk is because the Government will not launch an inquiry—that such activity was going on in the Brexit campaign. I anticipate that there will be more revelations about that this weekend. However, the challenge is that our election law is now hopelessly out of date.
I will be brief in answering some of the serious matters raised by the right hon. Gentleman. The Information Commissioner, as the data regulator, is investigating alleged abuses as part of a broader investigation into the use of personal data during political campaigns. I have said many times that the Bill will add significantly to the commissioner’s powers to conduct investigations, and I have confirmed that we keep an open mind and are considering actively whether further powers are needed in addition to those set out in the Bill.
The Electoral Commission is the regulator of political funding and spending. The commission seeks to bring transparency to our electoral system by enforcing rules on who can fund and how money can be spent, but new clause 21 is about sending the commission into a whole new field: that of personal data regulation. That field is rightly occupied by the Information Commissioner. We can debate whether she needs more powers in the light of the current situation at Cambridge Analytica, and as I have said we are reviewing the Bill.
While the Electoral Commission already has the power to require the disclosure of documents in relation to investigations under its current remit, new clause 21 would provide the commission with new powers to require the disclosure of the settings used to disseminate material. However, understanding how personal data is processed is outside the commission’s remit.
The right hon. Gentleman suggested that his amendment would help with transparency on who is seeking to influence elections, which is very much needed in the current climate. The Government take the security and integrity of democratic processes very seriously. It is absolutely unacceptable for any third country to interfere in our democratic elections or referendums.
On new clause 22, the rules on imprints in the Political Parties, Elections and Referendums Act 2000 are clear. The current rules apply to printed election material no matter how it is targeted, and the Secretary of State has the power under section 143 to make regulations covering imprints on other types of material, including online material. New clause 22 would therefore not extend the type of online material covered by such regulations, and we believe it is unnecessary. The law already covers printed election material disseminated through the use of personal data gathered by whatever means, and the Government will provide further clarity on extending those rules to online material in due course by consulting on making regulations under the power in section 143(6).
On that basis, I ask the right hon. Gentleman to withdraw his new clause.
That is a deeply disappointing answer. I was under the impression that the Secretary of State said in interviews today that he is open-minded about the UK version of the Honest Ads Act that we propose. That appears to be in some contrast to the answer that the Minister offered.
What this country has today is an Advertising Standards Authority that does not regulate political advertising; Ofcom, which does not regulate video when it is online; an Electoral Commission without the power to investigate digital campaigning; and an Information Commissioner who cannot get a search warrant. Worse, we have a Financial Conduct Authority that, because it does not have a data sharing gateway with the Electoral Commission, cannot share information about the financial background of companies that might have been laundering money going into political and referendum campaigns. The law is hopelessly inadequate. Through that great hole, our enemies are driving a coach and horses, which is having a huge impact on the health and wellbeing of our democracy.
That is not a day-to-day concern in Labour constituencies, but it is for the Conservative party. Voter Consultancy Ltd took out targeted dark social ads aimed at Conservative Members, accusing some of them of being Brexit mutineers when they had the temerity to vote for common sense in a vote on Brexit in this House. Voter Consultancy Ltd, for those who have not studied its financial records at Companies House, as I have, is a dormant company. It has no accounts filed. There is no cash flowing through the books. The question that provokes is: where does the money come from for the dark social ads attacking Conservative Members? We do not know. It is a matter of public concern that we should.
The law is out of date and needs to be updated. I will not press the matter to a vote this afternoon because I hope to return to it on Report, but I hope that between now and then the Minister and the Secretary of State reflect on the argument and talk to Mark Sedwill, the National Security Adviser, about why the national security strategy does not include an explicit objective to defend the integrity of our democracy. I hope that that change is made and that, as a consequence, further amendments will be tabled to ensure that our democracy is protected against the threats we know are out there.
I beg to ask leave to withdraw the motion.
Clause, by leave, withdrawn.
Question proposed, That the Chair do report the Bill, as amended, to the House.
On a point of order, Mr Streeter. I wanted to thank you, and Mr Hanson in his absence, as well as, in the House of Lords, my noble Friends Lord Ashton, Baroness Williams, Lord Keen, Baroness Chisholm and Lord Young, and the Opposition and Cross-Bench peers. I also thank the Under-Secretary of State for the Home Department, my hon. Friend the Member for Louth and Horncastle, and the Opposition Front Bench Members—the right hon. Member for Birmingham, Hodge Hill, with whom it has been a pleasure debating in the past two weeks, and the hon. Member for Sheffield, Heeley, who was not able to be in her place this afternoon.
I offer great thanks to both Whips. It was the first Bill Committee for my hon. Friend the Member for Selby and Ainsty in his capacity as Whip, and my first as Minister, and it has been a pleasure to work with him. I also thank the hon. Member for Ogmore. My hon. Friend the Under-Secretary and I are grateful to our Parliamentary Private Secretary, my hon. Friend the Member for Mid Worcestershire, who has worked terribly hard throughout the proceedings, as indeed have the Clerks, the Hansard writers, the Doorkeepers and the police. Without the officials of my Department and, indeed, the Home Office, we would all have been bereft, and I am most grateful to all the officials.
Question put and agreed to.
Bill, as amended, accordingly to be reported.