Data Protection Bill [Lords] (Second sitting) Debate

Department: Home Office
Tuesday 13th March 2018

Public Bill Committees

Louise Haigh

Given that the Minister asked so nicely, I will. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Amendments made: 87, in schedule 1, page 127, line 30, at end insert—

“( ) The reference in sub-paragraph (4)(b) to a data subject withholding consent does not include a data subject merely failing to respond to a request for consent.”

This amendment clarifies the intended effect of the safeguard in paragraph 15(4) of Schedule 1 (processing necessary for an insurance purpose).

Amendment 88, in schedule 1, page 127, line 39, at end insert—

“( ) is of data concerning health which relates to a data subject who is the parent, grandparent, great-grandparent or sibling of a member of the scheme,”.

This amendment provides that the condition in paragraph 16 of Schedule 1 (occupational pension schemes) can only be relied on in connection with the processing of data concerning health relating to certain relatives of a member of the scheme.

Amendment 89, in schedule 1, page 128, line 6, at end insert—

“( ) The reference in sub-paragraph (2)(b) to a data subject withholding consent does not include a data subject merely failing to respond to a request for consent.”

This amendment clarifies the intended effect of the safeguard in paragraph 16(2) of Schedule 1 (processing necessary for determinations in connection with occupational pension schemes).

Amendment 90, in schedule 1, page 131, line 14, at end insert—

“( ) If the processing consists of the disclosure of personal data to a body or association described in sub-paragraph (1)(a), or is carried out in preparation for such disclosure, the condition in sub-paragraph (1) is met even if, when the processing is carried out, the controller does not have an appropriate policy document in place (see paragraph 5 of this Schedule).”

This amendment provides that when processing consists of the disclosure of personal data to a body or association that is responsible for eliminating doping in sport, or is carried out in preparation for such disclosure, the condition in paragraph 22 of Part 2 of Schedule 1 (anti-doping in sport) is met even if the controller does not have an appropriate policy document in place when the processing is carried out.

Amendment 91, in schedule 1, page 133, line 17, leave out from “interest” to end of line 21.—(Margot James.)

This amendment removes provisions from paragraph 31 of Schedule 1 (extension of conditions in Part 2 of Schedule 1 referring to substantial public interest) which are unnecessary because they impose requirements which are already imposed by paragraph 5 of Schedule 1.

The Minister of State, Department for Digital, Culture, Media and Sport (Margot James)

I beg to move amendment 92, page 134, line 18 [Schedule 1], leave out “on the day” and insert “when”.

This amendment is consequential on Amendment 71.

The Chair

With this it will be convenient to discuss the following:

Government amendments 107, 108, 111, 113, 114, 21, 29 to 40, 43 to 46, 118 to 121, 48, 49, 53, 55, 56, 123 to 125, 59 and 71.

Margot James

Following engagement with local government stakeholders, we have recognised that the maximum time period permitted for responses to subject access requests set out in parts 3 and 4 of the Data Protection Bill subtly differs from that permitted under the GDPR and part 2 of the Bill. That is because the GDPR and, by extension, part 2 rely on European rules for calculating time periods, whereas parts 3 and 4 implicitly rely on a more usual domestic approach. European law, which applies to requests under part 2, says that when one is considering a time period of, say, seven days, the day on which the request is received is discounted from the calculation of that time period. In contrast, the usual position under UK law, which applies to requests under parts 3 and 4 of the Bill, is that the same seven-day period to respond would begin on the day on which the request was received. In a data protection context, that has the effect of providing those controllers responding to requests under parts 3 and 4 with a time period that is one day shorter in which to respond.
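The two counting conventions the Minister describes can be sketched in a few lines of Python. This is an illustration only: the function names are invented for the example, and it ignores refinements such as weekends or extensions, which the Bill and the GDPR handle separately.

```python
from datetime import date, timedelta

def deadline_eu(received: date, days: int) -> date:
    """European convention: the day of receipt is excluded, so a
    period of N days ends N days after the day of receipt."""
    return received + timedelta(days=days)

def deadline_uk(received: date, days: int) -> date:
    """Traditional domestic convention: the period begins on the day
    of receipt itself, so the controller has one fewer full day."""
    return received + timedelta(days=days - 1)

request = date(2018, 3, 13)
print(deadline_eu(request, 7))  # 2018-03-20
print(deadline_uk(request, 7))  # 2018-03-19
```

For the same seven-day period, the domestic convention produces a deadline one day earlier, which is exactly the one-day discrepancy the amendments remove by applying the European approach Bill-wide.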

To provide consistency across the Bill, we have decided to include a Bill-wide provision that applies the European approach to all time periods throughout the Bill, thus ensuring consistency with the directly applicable GDPR. Having a uniform approach to time periods is particularly helpful for bodies with law enforcement functions, which will process personal data under different regimes under the Bill. Without these amendments, different time periods would apply, depending on which regime they were processing under. Ensuring consistency for calculating time periods will also assist the information commissioner with her investigatory activities and enforcement powers, for example by avoiding the confusion and potential disputes that could arise relating to her notices or requests for information.

Amendment 71 provides for a number of exemptions to the European approach where deviating from our standard approach to time periods would be inappropriate. For example, where the time period refers to the process of parliamentary approval of secondary legislation, it would clearly not be appropriate to deviate from usual parliamentary time periods. The unfortunate number of amendments in this group arises from the need to modify existing language on time periods, currently worded to follow the usual UK approach, so that it applies the EU rules instead. I hope that this has provided the Committee with sufficient detail on the reasons for tabling this group of amendments.

Amendment 92 agreed to.

Question proposed, That the schedule, as amended, be the First schedule to the Bill.

Liam Byrne

We had a useful debate this morning about the whys and wherefores of whether the article 8 right to privacy should be incorporated into the Bill. Although we were disappointed by the Minister’s reply, what I thought was useful in the remarks she made was a general appreciation of the importance of strong data rights if the UK is to become a country with a strong environment of trust within which a world of digital trade can flourish.

I will briefly alert the Minister to a debate we want to have on Report. The reality is that we feel schedule 1 is narrowly drawn. It is an opportunity that has been missed, and it is an opportunity for the Minister to come back on Report with a much more ambitious set of data rights for what will be a digital century. When we look around the world at the most advanced digital societies, we can see that a strong regime of data rights is common to them all.

I was recently in Estonia, which I hope the Minister will have a chance to visit if she has not done so already. Estonia likes to boast of its record as the world’s most advanced digital society; it is a place where 99% of prescriptions are issued online, 95% of taxes are paid online and indeed a third of votes are cast online. It is a country where the free and open right to internet access is seen as an important social good, and a good example of a country that has really embraced the digital revolution and translated that ambition into a set of strong rights.

The Government are not averse to signing declaratory statements of rights that they then interpret into law. They are a signatory to the UN universal declaration of human rights and the UN convention on the rights of the child; the Human Rights Act 1998 is still in force—I have not yet heard of plans to repeal it—and of course the Equality Act 2010 was passed with cross-party support. However, those great statements of rights, the oldest of which date back to 1215, were basically there to correct and guard against dangerous imbalances of power. Things have moved on since 1215 and the worries that the barons had about King John. We are no longer as concerned as people were in 1215 about taking all the fish weirs out of the Thames, for example.

--- Later in debate ---
The Chair

To make matters clear to hon. Members and in particular those who are new to the Committee, the right hon. Member for Birmingham, Hodge Hill tabled a number of amendments—171 to 175, 177 and 178—that were not selected because they were tabled only yesterday. We need to have several days’ notice before selection can be considered. Had they been tabled earlier, we could have debated and voted on those amendments now. I have given the right hon. Gentleman leeway to widen his arguments about schedule 1, and it is up to him whether he wishes to table those amendments on Report. He is perfectly in order to do so. The debate today is on schedule 1, and the points that the right hon. Gentleman has made in relation to potential amendments are a heads-up for the future or for the Minister to respond to at this point.

Margot James

The right hon. Member for Birmingham, Hodge Hill covered a lot of important ground. He mentioned the digital charter. We are bringing forward the digital charter and we do not intend for it to be set in stone. We recognise that this is a fast-changing environment and so it is deliberately something that will evolve over time. We both share the concerns that he expressed with regard to fake news and the rights and protections needed for children and young people who, as he says, make up a third of internet users. We will address many of the things he highlighted as part of our internet safety strategy, and I look forward to debating them further with him on Report.

To add to what we have already discussed under schedule 1, article 9 of the GDPR limits the processing of special categories of data. Those special categories are listed in article 9(1) and include personal data revealing racial or ethnic origin, health, political opinions and religious beliefs. Some of the circumstances in which article 9 says that special category data can be processed have direct effect, but others require the UK to make related provision.

Clause 10 introduces schedule 1 to the Bill, which sets out in detail how the Bill intends to use the derogations in article 9 and the derogation in article 10 relating to criminal convictions data to permit particular processing activities. To ensure that the Bill is future-proof, clause 10 includes a delegated power to update schedule 1 using secondary legislation. Many of the conditions substantively replicate existing processing conditions in the 1998 Act and hon. Members may wish to refer to annexe B to the explanatory notes for a more detailed analysis on that point.

Darren Jones

I want to make one point about schedule 1. Amendment 9, which was made this morning, allows democratic engagement to be a purpose under article 6(1)(e) of the GDPR—namely, that consent is not required for processing that is necessary in the public interest, in the exercise of official authority, or for the purposes of democratic engagement. I wonder whether the definitions of political parties and politicians under schedule 1 could be used to restrict that amendment, so that organisations other than political parties and politicians are not able to process data in the public interest for democratic engagement without consent. For example, if Leave.EU or Open Britain wanted to process our personal data, they ought to do so with consent, not using the same public interest for democratic engagement purposes as politicians or parties.

Margot James

I understand the hon. Gentleman’s concerns. The GDPR requires data controllers to have a legal basis laid down in law, which can take the form, for example, of a statutory power or duty, or a common-law power. Any organisation that does not have such a legal basis would have to rely on one of the other processing conditions in article 6. With regard to the amendment that was agreed to this morning, we think that further restricting clause 8 might risk excluding bodies with a lawful basis for processing. However, the hon. Gentleman is free to raise the issue again on Report.

Question put and agreed to.

Schedule 1, as amended, accordingly agreed to.

Clauses 11 to 13 ordered to stand part of the Bill.

Clause 14

Automated decision-making authorised by law: safeguards

Liam Byrne

I beg to move amendment 153, in clause 14, page 7, line 30, at end insert—

“(1A) A decision that engages an individual’s rights under the Human Rights Act 1998 does not fall within Article 22(2)(b) of the GDPR (exception from prohibition on taking significant decisions based solely on automated processing for decisions that are authorised by law and subject to safeguards for the data subject’s rights, freedoms and legitimate interests).”

This amendment would clarify that the exemption from the prohibition on taking significant decisions based solely on automated processing does not apply to purely automated decisions that engage an individual’s human rights.

--- Later in debate ---
Brendan O'Hara (Argyll and Bute) (SNP)

I will speak to amendments 130, 133 and 135, which appear in my name and that of my hon. Friend the Member for Cumbernauld, Kilsyth and Kirkintilloch East. Our amendments seek to provide protection for individuals who are subject to purely automated decision making, specifically where we believe that it could have an adverse impact on their fundamental rights. The amendments would require that where human rights are or possibly could be impacted by automated decisions, ultimately there are always human decision makers. The amendments would instil that vital protection of human rights with regard to the general processing of personal data.

The amendments seek to clarify the meaning of a decision that is based solely on automated processing, which is a decision that lacks meaningful human input. That reflects the intent of the GDPR, and provides clarification that purely administrative human approval of an automated decision does not make that decision a human one. It is simply not enough for human beings to process the information in a purely administrative fashion, but to have absolutely no oversight or accountability for the decision that they process. We strongly believe that automated decision making without human intervention should be subject to strict limitations to ensure fairness, transparency and accountability, and to safeguard against discrimination. As it stands, there are insufficient safeguards in the Bill.

As the right hon. Member for Birmingham, Hodge Hill said, we are not talking about every automated decision. We are not talking about a tech company or an online retailer that suggests alternatives that someone may like based on the last book they bought or the last song they downloaded. It is about decisions that can be made without human oversight that will or may well have long-term, serious consequences for an individual’s health, financial status, employment or legal status. All too often, I fear that automated decisions involve an opaque, unaccountable process that uses algorithms that are neither as benign nor as objective as we had hoped they would be, or indeed, as we thought they were when we first encountered them.

We are particularly concerned about elements of the Bill that allow law enforcement agencies to make purely automated decisions. That is fraught with danger and at odds with the Data Protection Act 1998, as well as article 22 of the GDPR, which states:

“The data subject shall have the right not to be subject to a decision based solely on automated processing”.

Although there are provisions in the GDPR for EU member states to opt out of that, the opt-out does not apply if the data subject’s rights, freedoms or legitimate interests are undermined.

I urge the Government to look again at the parts of the Bill about automated decision making, to ensure that when it is carried out, a human being will have to decide whether it is reasonable and appropriate to continue on that course. That human intervention will provide transparency and accountability, and it will ensure that the state does not infringe on an individual’s freedoms—those fundamental rights of liberty and privacy—which are often subjective. Because they are subjective, they are beyond the scope of an algorithm.

There are serious human rights, accountability and transparency issues around fully automated decision making as the Bill stands. Amendment 130 says that any human involvement has to be “meaningful”. We define meaningful human oversight as being significant, of consequence and purposeful. As I have said, that is far beyond the scope of an algorithm. If an individual’s rights are to be scrutinised and possibly fundamentally affected, it is an issue of basic fairness that the decision is made, or at least overseen, by a sentient being. I hope the Government accept the amendments in the faith in which they were tabled.

Margot James

The amendments relate to automated decision making under the GDPR and the Bill. It is a broad category, which includes everything from trivial matters such as music playlists, as mentioned by the hon. Member for Argyll and Bute, and quotes for home insurance, to the potentially more serious issues of recruitment, healthcare and policing, outlined by the right hon. Member for Birmingham, Hodge Hill, where existing prejudices could be reinforced. We are establishing a centre, the office for artificial intelligence and data ethics, and are mindful of these important issues. We certainly do not dismiss them whatsoever.

Article 22 of the GDPR provides a right not to be subject to a decision based solely on automatic processing of data that results in legal or similarly significant effects on the data subject. As is set out in article 22(2)(b), that right does not apply if the decision is authorised by law, so long as the data subject’s rights, freedoms and legitimate interests are safeguarded.

The right hon. Member for Birmingham, Hodge Hill, mentioned those safeguards, but I attribute far greater meaning to them than he implied in his speech. The safeguards embed transparency, accountability and a right to request that the decision be retaken, and for the data subject to be notified should a decision be made solely through artificial intelligence.

Liam Byrne

The Minister must realise that she is risking an explosion in the number of decisions that have to be taken to Government agencies or private sector companies for review. The justice system is already under tremendous pressure. The tribunal system is already at breaking point. The idea that we should overload it further is pretty optimistic. On facial recognition at public events, for example, it would be possible under the provisions that she is proposing for the police to use facial recognition technology automatically to process those decisions and, through a computer, to have spot interventions ordered to police on the ground. The only way to stop that would be to have an ex post facto review, but that would be an enormous task.

Margot James

The right hon. Gentleman should be aware that just because something is possible, it does not mean that it is automatically translated into use. His example of facial recognition and what the police could do with that technology would be subject to controls within the police and to scrutiny from outside.

Louise Haigh

The case that my right hon. Friend raises is certainly not hypothetical. The Metropolitan police have been trialling facial recognition scanning at the Notting Hill carnival for the last three years with apparently no legal basis and very little oversight. We will move on to those issues in the Bill. That is exactly why the amendments are crucial in holding law enforcement agencies to account.

Margot James

As the hon. Lady says, the police are trialling those things. I rest my case—they have not put them into widespread practice as yet.

Returning to the GDPR, we have translated the GDPR protections into law through the Bill. As I said, the data subject has the right to request that the decision be retaken with the involvement of a sentient individual. That will dovetail with other requirements. By contrast, the amendments are designed to prevent any automated decision-making from being undertaken under article 22(2)(b) if it engages the rights of the data subject under the Human Rights Act 1998.

Liam Byrne

Will the Minister explain to the Committee how a decision to stop and search based on an automated decision can be retaken? Once the person has been stopped and searched, how can that activity be undone?

Margot James

I am not going to get into too much detail. The hon. Member for Sheffield, Heeley mentioned an area and I said that it was just a trial. She said that facial recognition was being piloted. I do not dispute that certain things cannot be undone. Similar amendments were tabled in the other place. As my noble Friend Lord Ashton said there, they would have meant that practically all automated decisions under the relevant sections were prohibited, since it would be possible to argue that any decision based on automatic decision making at the very least engaged the data subject’s right to have their private life respected under article 8 of the European convention on human rights, even if it was entirely lawful under the Act.

--- Later in debate ---
Brendan O'Hara

In that case, I will not press the amendment now.

Margot James

I beg to move Government amendment 10, in clause 14, page 8, line 4, leave out “21 days” and insert “1 month”.

Clause 14(4)(b) provides that where a controller notifies a data subject under Clause 14(4)(a) that the controller has taken a “qualifying significant decision” in relation to the data subject based solely on automated processing, the data subject has 21 days to request the controller to reconsider or take a new decision not based solely on automated processing. This amendment extends that period to one month.

The Chair

With this it will be convenient to discuss Government amendments 11, 12, 23, 24, 27, 28, 41 and 42.

Margot James

Amendments 10, 11 and 12 relate to clause 14, which requires a data controller to notify a data subject of a decision based solely on automatic processing as soon as is reasonably practicable. The data subject may then request that the data controller reconsider such a decision and take a new decision not based solely on automated processing.

The purpose of the amendments is to bring clause 14 into alignment with the directly applicable time limits in article 12 of the GDPR, thereby ensuring that both data subjects and data controllers have easily understandable rights and obligations. Those include giving the data subject longer to request that a decision be reconsidered, requiring that the controller action the request without undue delay and permitting an extension of up to two months where necessary.

Furthermore, to ensure that there is consistency across the different regimes in the Bill—not just between the Bill and the GDPR—amendments 23, 24, 41 and 42 extend the time limit provisions for making and responding to requests in the other regimes in the Bill. That is for the simple reason that it would not be right to have a data protection framework that applies one set of time limits to one request and a different set of time limits to another.

In a similar vein, amendments 27 and 28 amend part 3 of the Bill, concerning law enforcement processing, to ensure that controllers can charge for manifestly unfounded or excessive requests for retaking a decision, as is permitted under article 12 of the law enforcement directive. To prevent abuse, amendment 28 provides that it is for the controller to show that the request was manifestly unfounded or excessive.

Liam Byrne

It would be useful if the Minister could say a little more about the safeguards around the controllers charging reasonable fees for dealing with requests.

It is quite easy to envisage situations where algorithms take decisions and we have some ex post facto review: a citizen seeks to overturn the decision, believing they are acting reasonably, but the commercial interest of the company that has taken the automated decision means that it wants to create disincentives for that rigmarole to unfold. That creates the risk of unequal access to justice in these decisions.

If the Minister is not prepared to countenance the sensible safeguards that we have proposed, she must say how she will guard against another threat to access to justice.

Margot James

The right hon. Gentleman asks a reasonable question. I did not mention that data subjects have the right of complaint to the Information Commissioner if the provisions are being abused. I also did not mention another important safeguard: the burden of proof is on the data controller to show that a request is manifestly unfounded or excessive, and the data subject has the right to involve the Information Commissioner if he or she contests the data controller’s judgment on that point. I hope that satisfies the right hon. Gentleman.

Amendment 10 agreed to.

Amendments made: 11, in clause 14, page 8, leave out line 10 and insert “within the period described in Article 12(3) of the GDPR—”

This amendment removes provision from Clause 14(5) dealing with the time by which a controller has to respond to a data subject’s request under Clause 14(4)(b) and replaces it with a requirement for the controller to respond within the time periods set out in Article 12(3) of the GDPR, which is directly applicable.

Amendment 12, in clause 14, page 8, line 16, at end insert—

‘(5A) In connection with this section, a controller has the powers and obligations under Article 12 of the GDPR (transparency, procedure for extending time for acting on request, fees, manifestly unfounded or excessive requests etc) that apply in connection with Article 22 of the GDPR.” —(Margot James.)

This amendment inserts a signpost to Article 12 of the GDPR which is directly applicable and which confers powers and places obligations on controllers to whom Clause 14 applies.

Clause 14, as amended, ordered to stand part of the Bill.

Clause 15

Exemptions etc.

Margot James

I beg to move amendment 13, in clause 15, page 8, line 31, after “21” insert “and 34”

This amendment is consequential on Amendment 94.

The Chair

With this it will be convenient to discuss Government amendments 14, 93 to 106, 109, 110 and 112.

Margot James

Schedule 2 allows for particular rights or obligations contained in the GDPR to be disapplied in particular circumstances, where giving effect to that right or obligation would lead to a perverse outcome. To do that, it makes use of a number of derogations in the GDPR, including articles 6(3) and 23(1).

Amendments 93, 95 and 109 permit article 19 of the GDPR to be disapplied for the purposes in parts 1, 2 and 5 of schedule 2.

When a data controller corrects or deletes personal data following a request from a data subject, article 19 of the GDPR requires them to inform all persons to whom the personal data has been disclosed. Additionally, if requested, the data controller must inform the data subject about those persons to whom the data has been disclosed. Following the introduction of the Bill, we have had further representations from a range of stakeholders, including the banking industry, regulators and the media sector, about the problems that article 19 might create in very particular circumstances.

The amendments will ensure that, for example, where a bank may have shared personal data about one of its customers with the National Crime Agency because of a suspected fraud, it will not have to tell the data subject about that disclosure when the customer changes their address with the bank. That will ensure that the data subject is not tipped off about the suspected fraud investigation.

Several amendments in the group are designed to ensure that a valuable provision of the GDPR—article 34—does not have unintended consequences for controllers who do the right thing by seeking to prevent or detect crime, assist with the assessment or collection of tax or uncover abuses in our society. Article 34 requires data controllers to inform a data subject if there has been a data breach that is likely to result in a high risk to the rights and freedoms of an individual. In normal operation, this is an important article, which we hope will prompt a step change in the way organisations think about cyber-security.

However, article 23(1) enables member states to create laws to restrict the scope of the obligations and rights for which article 34 provides in the minority of cases where it conflicts with other important objectives of general public interest. The amendments seek to do that in the Bill. Amendment 94 responds to the concerns of the finance sector that compliance with article 34 may result in persons under investigation for financial crime being tipped off. Amendment 110 serves a similar purpose for media organisations.

Article 85(2) creates scope for member states to provide exemptions from chapter 4 of the GDPR, which includes article 34, if they are necessary to reconcile the right to the protection of personal data with the freedom of expression. The amendment is intended to ensure that processing data for a special purpose that is in the public interest is not prejudiced—for example, by a controller having to notify the data subject of a breach in relation to pre-publication undercover footage. Importantly, data controllers will still be required, for the first time, to report a breach to the Information Commissioner under article 33 of the GDPR. That will ensure that she is well placed to take all the necessary steps to ensure data subjects’ rights are respected, including by monitoring compliance with these new exemptions.

On the more general question of who can make use of the exemptions in schedule 2 and when, amendment 96 broadens the exemption in paragraph 7 of the schedule, which relates to the protection of members of the public. As drafted, the exemption applies to personal data processed for the purposes of discharging a function that is designed to protect members of the public against dishonesty, malpractice or incompetence by persons who carry out activities that bring them into contact with members of the public. We have identified an issue with that wording: a number of public office holders, including police staff, do not carry out activities that necessarily bring them into contact with members of the public. Amendment 96 broadens the scope of the exemption to include processing in relation to individuals who work for those organisations in a behind-the-scenes capacity.

We have also had representations from several regulators on the need to make additional provisions to protect the integrity of their activities. Amendment 97 provides the UK’s Comptroller and Auditor General, and their counterpart in each of the devolved Administrations, with an exemption from certain GDPR provisions where these are likely to prejudice their statutory functions. That will prevent certain individuals who suspect they may be under scrutiny from trying to use their rights under the GDPR, such as article 15 (confirmation of processing) as a way of confirming that their data is being processed, or from using article 17 (right to erasure) and article 18 (restriction of processing) to undermine the effectiveness of an audit.

--- Later in debate ---
The Chair

I have just had a request to remove jackets, because of the warm temperature in the room. I give my permission to do so. I call the Minister.

Margot James

Thank you, Mr Hanson. I agree with the tribute paid by the right hon. Member for Birmingham, Hodge Hill to the custodians of some of the most wonderful archives in the world. I will comment on his proposals with regard to such archives shortly, but I hope that recent debates have left no doubt in hon. Members’ minds that the Government are absolutely committed to preserving the freedom of the press, and maintaining the balance between privacy and freedom of expression in our existing law, which has served us well for so many years.

As set out in the Bill, media organisations can already process data for journalistic purposes, which includes media archiving. As such, we believe that amendment 170 is unnecessary and could be unhelpful. I agree with the right hon. Gentleman that it is crucial that the media can process data and maintain media archives. In the House of Lords, my noble Friend Lord Black of Brentwood explained very well the value of media archives. He said:

“Those records are not just the ‘first draft of history’; they often now comprise the only record of significant events, which will be essential to historians and others in future, and they must be protected.”—[Official Report, House of Lords, 10 October 2017; Vol. 785, c. 175.]

Indeed, recital 153 indicates that processing for special purposes includes news archiving and press libraries. Paragraph 24 of schedule 2 sets out the range of derogations that apply to processing for journalistic purposes. That includes, for example, exemption from complying with requests for the right to be forgotten. That means that where the exemption applies, data subjects would not have grounds to request that data about them be deleted. It is irrelevant whether the data causes substantial damage or distress.

However, if media organisations are archiving data for other purposes—for example, in connection with subscriber data—it is only right that they are subject to the safeguards set out in article 89(1), and the Bill provides for that accordingly. For that reason, I hope that the right hon. Gentleman agrees to reconsider his approach and withdraw his amendment.

Liam Byrne

I am happy to withdraw the amendment, although I would say to the Minister that the helpful words we have heard this afternoon will not go far enough to satisfy the objections that we heard from organisations. We reserve the right to come back to this matter on Report. We will obviously consult the organisations that helped us to draft the amendment, and I urge her to do the same. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Schedule 2, as amended, agreed to.

Schedule 3

Exemptions etc from the GDPR: health, social work, education and child abuse data

Amendments made: 111, in schedule 3, page 160, line 21, leave out

“with the day on which”

and insert “when”.

This amendment is consequential on Amendment 71.

Amendment 112, in schedule 3, page 162, line 3, leave out paragraph 16 and insert—

“16 (1) This paragraph applies to a record of information which—

(a) is processed by or on behalf of the Board of Governors, proprietor or trustees of, or a teacher at, a school in Northern Ireland specified in sub-paragraph (3),

(b) relates to an individual who is or has been a pupil at the school, and

(c) originated from, or was supplied by or on behalf of, any of the persons specified in sub-paragraph (4).

(2) But this paragraph does not apply to information which is processed by a teacher solely for the teacher’s own use.

(3) The schools referred to in sub-paragraph (1)(a) are—

(a) a grant-aided school;

(b) an independent school.

(4) The persons referred to in sub-paragraph (1)(c) are—

(a) a teacher at the school;

(b) an employee of the Education Authority, other than a teacher at the school;

(c) an employee of the Council for Catholic Maintained Schools, other than a teacher at the school;

(d) the pupil to whom the record relates;

(e) a parent, as defined by Article 2(2) of the Education and Libraries (Northern Ireland) Order 1986 (S.I. 1986/594 (N.I. 3)).

(5) In this paragraph, “grant-aided school”, “independent school”, “proprietor” and “trustees” have the same meaning as in the Education and Libraries (Northern Ireland) Order 1986 (S.I. 1986/594 (N.I. 3)).”

This amendment expands the types of records that are “educational records” for the purposes of Part 4 of Schedule 3.

Amendment 113, in schedule 3, page 164, line 7, leave out

“with the day on which”

and insert “when”.—(Margot James.)

This amendment is consequential on Amendment 71.

Schedule 3, as amended, agreed to.

Schedule 4 agreed to.

Clause 16

Power to make further exemptions etc by regulations

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Liam Byrne

We agree that the clause offers Ministers a rather sweeping power to introduce new regulations. Over the course of what has been quite a short day in Committee we have heard many reasons to be alarmed about equipping Ministers with such sweeping powers. We proposed an amendment to remove the clause, which I think was not selected because we have this stand part debate. What we need to hear from the Minister are some pretty good arguments as to why Ministers should be given unfettered power to introduce such regulations without the effective scrutiny and oversight of right hon. and hon. Members in this House.

Margot James

I am glad that the right hon. Gentleman feels we have had a short day in Committee. In answer to his questions and those of the hon. Gentleman, the order-making powers in clauses 16 and 113 allow the Secretary of State to keep the list of exemptions in schedules 2 to 4 and 11 up to date. As I mentioned when we discussed order-making powers in relation to clause 10 and schedule 1, we carefully reviewed the use of such powers in the Bill following recommendations from the Delegated Powers and Regulatory Reform Committee. We think an appropriate balance has now been struck. It might be helpful if I explain the reasons for our thinking.

Clause 16 includes order-making powers to ensure that the Secretary of State can update from time to time the particular circumstances in which data subjects’ rights can be disapplied. That might be necessary if, for example, the functions of a regulator are expanded and exemptions are required to ensure that those new functions cannot be prejudiced by a data subject exercising his or her right to object to the processing.

We believe it is very important that the power to update the schedules is retained. Several of the provisions in schedules 2 to 4 did not appear in the Data Protection Act 1998 and have been added to the Bill to address specific requirements that have arisen over the last 20 years.

For example, the regulatory landscape has changed dramatically since the 1998 Act. Organisations such as the Bank of England, the Financial Conduct Authority and the National Audit Office have taken on a far broader range of regulatory functions, and that is reflected in the various amendments we have tabled to paragraphs 7 to 9 of schedule 2, to provide for a broader range of exemptions. No doubt, there will be further changes to the regulatory landscape in the years to come. Of course, other exemptions in schedule 2 have been carried over from the 1998 Act, or indeed from secondary legislation made under that Act, with little change. That does not mean, however, that they will never need to be amended in the future. Provisions made under the 1998 Act could be amended via secondary legislation, so it would seem remiss not to afford ourselves that same degree of flexibility now. If we have to wait for primary legislation to make any changes, it could result in a delay of months or possibly years to narrow or widen an exemption, even where a clear deficiency had been identified. We cannot predict the future, and it is important that we retain the power to update the schedules quickly when the need arises.

Importantly, any regulations made under either clause would be subject to the affirmative resolution procedure. There would be considerable parliamentary oversight before any changes could be made using these powers. Clause 179 requires the Secretary of State to consult the Information Commissioner and other interested parties that he considers appropriate before any changes are made.

I hope that that reassures Members that we have considered the issue carefully. I commend clause 16 to the Committee.

Question put, That the clause stand part of the Bill.

The Committee proceeded to a Division.

--- Later in debate ---
Accreditation of certification providers
Margot James

I beg to move amendment 15, in clause 17, page 10, line 16, leave out “authority” and insert “body”.

This amendment corrects the reference in Clause 17(7) to the “national accreditation authority” by amending it to refer to the “national accreditation body”, which is defined in Clause 17(8).

Clause 17 relates to the certification of data controllers. This is a relatively new concept and will take time to bed in, but it could also be a significant step forward in ensuring that data subjects can have confidence in controllers and processors and, perhaps even more important, that controllers and processors can have confidence in each other. It is likely to be particularly relevant in the context of cloud computing and other business-to-business platforms where individual audits are often not feasible in practice.

Before they can audit controllers, certification bodies must be accredited, either by the Information Commissioner or by the national accreditation body, UKAS. Clause 17 and schedule 5 set out how the process will be managed. Unfortunately, there is a typographical error in clause 17. It refers erroneously to the “national accreditation authority” in subsection (7), when it should refer to the “national accreditation body”. Amendment 15 corrects that error.

Amendment 15 agreed to.

Clause 17, as amended, ordered to stand part of the Bill.

Schedule 5

Accreditation of certification providers: reviews and appeals

Amendment made: 114, in schedule 5, page 170, line 21, leave out “In this paragraph” and insert—

“Meaning of “working day”

7 In this Schedule”

This amendment applies the definition of “working day” for the purposes of the whole of Schedule 5. There are references to “working days” in paragraphs 5(2) and 6(3) of that Schedule.—(Margot James.)

Schedule 5, as amended, agreed to.

Clause 18 ordered to stand part of the Bill.

Clause 19

Processing for archiving, research and statistical purposes: safeguards

Amendment made: 16, in clause 19, page 12, line 2, leave out “(d)” and insert “(e)”.—(Margot James.)

This amendment amends the definition of “relevant NHS body” in this Clause by adding special health and social care agencies established under Article 3 of the Health and Personal Social Services (Special Agencies) (Northern Ireland) Order 1990 (which fall within paragraph (e) of section 1(5) of the Health and Social Care (Reform) Act (Northern Ireland) 2009).

Clause 19, as amended, ordered to stand part of the Bill.

Clauses 20 to 22 ordered to stand part of the Bill.

Ordered, That further consideration be now adjourned.—(Nigel Adams.)