Data Protection and Digital Information (No. 2) Bill Debate

Department: Department for Science, Innovation & Technology

Data Protection and Digital Information (No. 2) Bill

2nd reading
Monday 17th April 2023


Commons Chamber
Jim Shannon (Strangford) (DUP)

It is a pleasure to add some comments and make a contribution, and to have heard all the right hon. and hon. Members’ speeches as I have sat here tonight. I understand there will not be any votes on the Bill, but if there had been, my party would have supported the Government, because I think the intention of the Minister and the Government is to try to find the correct way forward. I hope that whatever tweaking is needed can happen in a positive way that addresses those issues. It is always good to speak in any debate in this House, and as this is the first one after the recess, I am very pleased to be a part of it. I have spoken on data protection and its importance in the House before, and I again wish to make a contribution, specifically on medical records and the protection of health data held by GP surgeries, which I hope to address with some questions for the Minister at the end.

Realistically, data protection is all around us. I know all too well from my constituency office that there are guidelines and procedures that my staff and I must follow, and we do follow them very stringently. It is important that businesses, offices, healthcare facilities and so on are aware of the guidelines they must follow, hence the necessity of this Bill. As I have said, if there had been a vote, we would have supported the Government, but it seems that that will not be the case tonight. Exposed data has the full potential to fall into the wrong hands, posing dangers to people and organisations, so it is great to be here to discuss how we can prevent that, with the Government presenting the legislation tonight and taking it through Committee when the time comes.

I have recently had some issues with data protection—this is a classic example of how mistakes can happen and how important data can end up in the wrong place—when in two instances the Independent Parliamentary Standards Authority accidentally published personal information about me and my staff online. It did not do it on purpose—it was an accident, and it did retrieve the data very quickly—but it has happened on two occasions at a time of severe threat in Northern Ireland and a level of threat on the mainland as well. Although the matter was quickly resolved, it is a classic example of the dangers posed to individuals.

I am sure Members are aware that the threat level in Northern Ireland has been increased. Despite there being external out-of-office security for Members, I have recently installed CCTV cameras in my office for the security of my staff, which, though not as great in comparison, is my responsibility. I have younger staff members in their 20s who live on their own, and staff who are parents of young children, and they deserve to know that they are safe. Anxieties have been raised because of the data disclosure, and I imagine that many others have experienced something similar.

I want to focus on issues about health. Ahead of this debate, I have been in touch with the British Medical Association, which raised completely valid concerns with me about the protection of health data. I have a number of questions to ask the Minister, if I may. The BMA’s understanding of the Bill is that the Secretary of State or the Minister will have significant discretionary powers to transfer large quantities of health information to third countries with minimal consultation or transparent assessment about how the information will benefit the UK. That is particularly worrying for me, and it should be worrying for everyone in this House. I am sure the Minister will give us some clarification and some reassurance, if that is possible, or tell us that this will not happen.

There is also concern about the Secretary of State having the power to transfer the same UK patients’ health data to a third country if it is thought that that would benefit the UK’s economic interests. I would be very disturbed, and quite annoyed and angry, if such a direction were allowed. Again, the Minister may wish to comment on that at the end of the debate. I would be grateful if the Minister and his Department provided some clarity for the BMA about what the consultation process will be if information is to be shared with third countries or organisations.

There have also been concerns about whether large tech and social media companies are storing data correctly and properly upholding individuals’ rights and privacy. We must always represent our constituents, and the Bill must ensure that the onus of care is placed on tech companies and organisations to store data legally, safely and correctly. The safety and protection of data is paramount. We could not possibly vote for a Bill that undermined trust, furthered economic instability and eroded fundamental rights. Safeguards must be in place to protect people’s privacy, and that starts in the House today with this Bill. Can the Minister assure me and the BMA that our data will be protected and not shared willy-nilly with Tom, Dick and Harry? As I have said, protection is paramount, and we need to have it in place.

To conclude, we have heard numerous stories both from our constituents and in this place about the risks of ill-stored and unprotected data. The Bill must aim to retain high data protection standards without creating unnecessary barriers for individuals and businesses. I hope that the Minister and his Department can answer the questions we may have to ensure that the UK can be a frontrunner in safe and efficient data protection. We all want that goal. Let us make sure we go in the right direction to achieve it.

Madam Deputy Speaker (Dame Rosie Winterton)

I call the shadow Minister.

Data Protection and Digital Information Bill

Bill to be considered
Madam Deputy Speaker (Dame Rosie Winterton)

Mr Speaker has selected the recommittal motion in the name of Sir Chris Bryant. I call him to move the motion.

--- Later in debate ---
Sir John Whittingdale

I beg to move, That the clause be read a Second time.

Madam Deputy Speaker (Dame Rosie Winterton)

With this it will be convenient to discuss the following:

Government new clause 48—Processing of personal data revealing political opinions.

Government new clause 7—Searches in response to data subjects’ requests.

Government new clause 8—Notices from the Information Commissioner.

Government new clause 9—Court procedure in connection with subject access requests.

Government new clause 10—Approval of a supplementary code.

Government new clause 11—Designation of a supplementary code.

Government new clause 12—List of recognised supplementary codes.

Government new clause 13—Change to conditions for approval or designation.

Government new clause 14—Revision of a recognised supplementary code.

Government new clause 15—Applications for approval and re-approval.

Government new clause 16—Fees for approval, re-approval and continued approval.

Government new clause 17—Request for withdrawal of approval.

Government new clause 18—Removal of designation.

Government new clause 19—Registration of additional services.

Government new clause 20—Supplementary notes.

Government new clause 21—Addition of services to supplementary notes.

Government new clause 22—Duty to remove services from the DVS register.

Government new clause 23—Duty to remove supplementary notes from the DVS register.

Government new clause 24—Duty to remove services from supplementary notes.

Government new clause 25—Index of defined terms for Part 2.

Government new clause 26—Powers relating to verification of identity or status.

Government new clause 27—Interface bodies.

Government new clause 28—The FCA and financial services interfaces.

Government new clause 29—The FCA and financial services interfaces: supplementary.

Government new clause 30—The FCA and financial services interfaces: penalties and levies.

Government new clause 31—Liability and damages.

Government new clause 32—Other data provision.

Government new clause 33—Duty to notify the Commissioner of personal data breach: time periods.

Government new clause 34—Power to require information for social security purposes.

Government new clause 35—Retention of information by providers of internet services in connection with death of child.

Government new clause 36—Retention of biometric data and recordable offences.

Government new clause 37—Retention of pseudonymised biometric data.

Government new clause 38—Retention of biometric data from INTERPOL.

Government new clause 39—National Underground Asset Register.

Government new clause 40—Information in relation to apparatus.

Government new clause 41—Pre-commencement consultation.

Government new clause 42—Transfer of certain functions of Secretary of State.

New clause 1—Processing of data in relation to a case-file prepared by the police service for submission to the Crown Prosecution Service for a charging decision

“(1) The 2018 Act is amended in accordance with subsection (2).

(2) In the 2018 Act, after section 40 insert—

“40A Processing of data in relation to a case-file prepared by the police service for submission to the Crown Prosecution Service for a charging decision

(1) This section applies to a set of processing operations consisting of the preparation of a case-file by the police service for submission to the Crown Prosecution Service for a charging decision, the making of a charging decision by the Crown Prosecution Service, and the return of the case-file by the Crown Prosecution Service to the police service after a charging decision has been made.

(2) The police service is not obliged to comply with the first data protection principle except insofar as that principle requires processing to be fair, or the third data protection principle, in preparing a case-file for submission to the Crown Prosecution Service for a charging decision.

(3) The Crown Prosecution Service is not obliged to comply with the first data protection principle except insofar as that principle requires processing to be fair, or the third data protection principle, in making a charging decision on a case-file submitted for that purpose by the police service.

(4) If the Crown Prosecution Service decides that a charge will not be pursued when it makes a charging decision on a case-file submitted for that purpose by the police service it must take all steps reasonably required to destroy and delete all copies of the case-file in its possession.

(5) If the Crown Prosecution Service decides that a charge will be pursued when it makes a charging decision on a case-file submitted for that purpose by the police service it must return the case-file to the police service and take all steps reasonably required to destroy and delete all copies of the case-file in its possession.

(6) Where the Crown Prosecution Service decides that a charge will be pursued when it makes a charging decision on a case-file submitted for that purpose by the police service and returns the case-file to the police service under subsection (5), the police service must comply with the first data protection principle and the third data protection principle in relation to any subsequent processing of the data contained in the case-file.

(7) For the purposes of this section—

(a) The police service means—

(i) constabulary maintained by virtue of an enactment, or

(ii) subject to section 126 of the Criminal Justice and Public Order Act 1994 (prison staff not to be regarded as in police service), any other service whose members have the powers or privileges of a constable.

(b) The preparation of, or preparing, a case-file by the police service for submission to the Crown Prosecution Service for a charging decision includes the submission of the file.

(c) A case-file includes all information obtained by the police service for the purpose of preparing a case-file for submission to the Crown Prosecution Service for a charging decision.””

This new clause adjusts Section 40 of the Data Protection Act 2018 to exempt the police service and the Crown Prosecution Service from the first and third data protection principles contained within the 2018 Act so that they can share unredacted data with one another when making a charging decision.

New clause 2—Common standards and timeline for implementation

“(1) Within one month of the passage of this Act, the Secretary of State must by regulations require those appointed as decision-makers to create, publish and update as required open and common standards for access to customer data and business data.

(2) Standards created by virtue of subsection (1) must be interoperable with those created as a consequence of Part 2 of the Retail Banking Market Investigation Order 2017, made by the Competition and Markets Authority.

(3) Regulations under section 66 and 68 must ensure interoperability of customer data and business data with standards created by virtue of subsection (1).

(4) Within one month of the passage of this Act, the Secretary of State must publish a list of the sectors to which regulations under section 66 and section 68 will apply within three years of the passage of the Act, and the date by which those regulations will take effect in each case.”

This new clause, which is intended to be placed in Part 3 (Customer data and business data) of the Bill, would require interoperability across all sectors of the economy in smart data standards, including the Open Banking standards already in effect, and the publication of a timeline for implementation.

New clause 3—Provision about representation of data subjects

“(1) Section 190 of the Data Protection Act 2018 is amended as follows.

(2) In subsection (1), leave out “After the report under section 189(1) is laid before Parliament, the Secretary of State may” and insert “The Secretary of State must, within three months of the passage of the Data Protection and Digital Information Act 2024,”.”

This new clause would require the Secretary of State to exercise powers under s190 DPA2018 to allow organisations to raise data breach complaints on behalf of data subjects generally, in the absence of a particular subject who wishes to bring forward a claim about misuse of their own personal data.

New clause 4—Review of notification of changes of circumstances legislation

“(1) The Secretary of State must commission a review of the operation of the Social Security (Notification of Changes of Circumstances) Regulations 2010.

(2) In conducting the review, the designated reviewer must—

(a) consider the current operation and effectiveness of the legislation;

(b) identify any gaps in its operation and provisions;

(c) consider and publish recommendations as to how the scope of the legislation could be expanded to include non-public sector, voluntary and private sector holders of personal data.

(3) In undertaking the review, the reviewer must consult—

(a) specialists in data sharing;

(b) people and organisations who campaign for the interests of people affected by the legislation;

(c) people and organisations who use the legislation;

(d) any other persons and organisations the review considers appropriate.

(4) The Secretary of State must lay a report of the review before each House of Parliament within six months of this Act coming into force.”

This new clause requires a review of the operation of the “Tell Us Once” programme, which seeks to provide simpler mechanisms for citizens to pass information regarding births and deaths to government, and consideration of whether the progress of “Tell Us Once” could be extended to non-public sector holders of data.

New clause 5—Definition of “biometric data”

“Article 9 of the UK GDPR is amended by the omission, in paragraph 1, of the words “for the purpose of uniquely identifying a natural person”.”

This new clause would amend the UK General Data Protection Regulation to extend the protections currently in place for biometric data for identification to include biometric data for the purpose of classification.

New clause 43—Right to use non-digital verification services

“(1) This section applies when an organisation—

(a) requires an individual to use a verification service, and

(b) uses a digital verification service for that purpose.

(2) The organisation—

(a) must make a non-digital alternative method of verification available to any individual required to use a verification service, and

(b) must provide information about digital and non-digital methods of verification to those individuals before verification is required.”

This new clause, which is intended for insertion into Part 2 of the Bill (Digital verification services), creates the right for data subjects to use non-digital identity verification services as an alternative to digital verification services, thereby preventing digital verification from becoming mandatory in certain settings.

New clause 44—Transfer of functions to the Investigatory Powers Commissioner’s Office

“The functions of the Surveillance Camera Commissioner are transferred to the Investigatory Powers Commissioner.”

New clause 45—Interoperability of data and collection of comparable healthcare statistics across the UK

“(1) The Health and Social Care Act 2012 is amended as follows.

(2) After section 250, insert the following section—

“250A Interoperability of data and collection of comparable healthcare statistics across the UK

(1) The Secretary of State must prepare and publish an information standard specifying binding data interoperability requirements which apply across the whole of the United Kingdom.

(2) An information standard prepared and published under this section—

(a) must include guidance about the implementation of the standard;

(b) may apply to any public body which exercises functions in connection with the provision of health services anywhere in the United Kingdom.

(3) A public body to which an information standard prepared and published under this section applies must have regard to the standard.

(4) The Secretary of State must report to Parliament each year on progress on the implementation of an information standard prepared in accordance with this section.

(5) For the purposes of this section—

“health services” has the same meaning as in section 250 of this Act, except that for “in England” there is substituted “anywhere in the United Kingdom”, and “the health service” in parts of the United Kingdom other than England has the meaning given by the relevant statute of that part of the United Kingdom;

“public body” has the same meaning as in section 250 of this Act.”

(3) In section 254 (Powers to direct NHS England to establish information systems), after subsection (2), insert—

“(2A) The Secretary of State must give a direction under subsection (1) directing NHS England to collect and publish information about healthcare performance and outcomes in all parts of the United Kingdom in a way which enables comparison between different parts of the United Kingdom.

(2B) Before giving a direction by virtue of subsection (2A), the Secretary of State must consult—

(a) the bodies responsible for the collection and publication of official statistics in each part of the United Kingdom,

(b) Scottish Ministers,

(c) Welsh Ministers, and

(d) Northern Ireland departments.

(2C) The Secretary of State may not give a direction by virtue of subsection (2A) unless a copy of the direction has been laid before, and approved by resolution of, both Houses of Parliament.

(2D) Scottish Ministers, Welsh Ministers and Northern Ireland departments must arrange for the information relating to the health services for which they have responsibility described in the direction given by virtue of subsection (2A) to be made available to NHS England in accordance with the direction.

(2E) For the purposes of a direction given by virtue of subsection (2A), the definition of “health and social care body” given in section 259(11) applies as if for “England” there were substituted “the United Kingdom”.””

New clause 46—Assessment of impact of Act on EU adequacy

“(1) Within six months of the passage of this Act, the Secretary of State must carry out an assessment of the impact of the Act on EU adequacy, and lay a report of that assessment before both Houses of Parliament.

(2) The report must assess the impact on—

(a) data risk, and

(b) small and medium-sized businesses.

(3) The report must quantify the impact of the Act in financial terms.”

New clause 47—Review of the impact of the Act on anonymisation and the identifiability of data subjects

“(1) Within six months of the passage of this Act, the Secretary of State must lay before Parliament the report of an assessment of the impact of the measures in the Act on anonymisation and the identifiability of data subjects.

(2) The report must include a comparison between the rights afforded to data subjects under this Act and those afforded to data subjects by the EU General Data Protection Regulation.”

Amendment 278, in clause 5, page 6, line 15, leave out paragraphs (b) and (c).

This amendment and Amendment 279 would remove the power for the Secretary of State to create pre-defined and pre-authorised “recognised legitimate interests” for data processing. Instead, the current test would continue to apply in which personal data can only be processed in pursuit of a legitimate interest, as balanced with individual rights and freedoms.

Amendment 279, page 6, line 23, leave out subsections (4), (5) and (6).

See explanatory statement to Amendment 278.

Amendment 230, page 7, leave out lines 1 and 2 and insert—

“8. The Secretary of State may not make regulations under paragraph 6 unless a draft of the regulations has been laid before both Houses of Parliament for the 60-day period.

8A. The Secretary of State must consider any representations made during the 60-day period in respect of anything in the draft regulations laid under paragraph 8.

8B. If, after the end of the 60-day period, the Secretary of State wishes to proceed to make the regulations, the Secretary of State must lay before Parliament a draft of the regulations (incorporating any changes the Secretary of State considers appropriate pursuant to paragraph 8A).

8C. Draft regulations laid under paragraph 8B must, before the end of the 40-day period, have been approved by a resolution of each House of Parliament.

8D. In this Article—

“the 40-day period” means the period of 40 days beginning on the day on which the draft regulations mentioned in paragraph 8B are laid before Parliament (or, if they are not laid before each House of Parliament on the same day, the later of the days on which they are laid);

“the 60-day period” means the period of 60 days beginning on the day on which the draft regulations mentioned in paragraph 8 are laid before Parliament (or, if they are not laid before each House of Parliament on the same day, the later of the days on which they are laid).

8E. When calculating the 40-day period or the 60-day period for the purposes of paragraph 8D, ignore any period during which Parliament is dissolved or prorogued or during which both Houses are adjourned for more than 4 days.”

This amendment would make regulations made in respect of recognised legitimate interest subject to a super-affirmative Parliamentary procedure.

Amendment 11, page 7, line 12, at end insert—

““internal administrative purposes”, in relation to special category data, means the conditions set out for lawful processing in paragraph 1 of Schedule 1 of the Data Protection Act 2018.”

This amendment clarifies that the processing of special category data in employment must follow established principles for reasonable processing, as defined by paragraph 1 of Schedule 1 of the Data Protection Act 2018.

Government amendment 252.

Amendment 222, page 10, line 8, leave out clause 8.

Amendment 3, in clause 8, page 10, leave out line 31.

This amendment would mean that the resources available to the controller could not be taken into account when determining whether a request is vexatious or excessive.

Amendment 2, page 11, line 34, at end insert—

“(6A) When informing the data subject of the reasons for not taking action on the request in accordance with subsection (6), the controller must provide evidence of why the request has been treated as vexatious or excessive.”

This amendment would require the data controller to provide evidence of why a request has been considered vexatious or excessive if the controller is refusing to take action on the request.

Government amendment 17.

Amendment 223, page 15, line 22, leave out clause 10.

Amendment 224, page 18, line 7, leave out clause 12.

Amendment 236, in clause 12, page 18, line 21, at end insert—

“(c) a data subject is an identified or identifiable individual who is affected by a significant decision, irrespective of the direct presence of their personal data in the decision-making process.”

This amendment would clarify that a “data subject” includes identifiable individuals who are subject to data-based and automated decision-making, whether or not their personal data is directly present in the decision-making process.

Amendment 232, page 19, line 12, leave out “solely” and insert “predominantly”.

This amendment would mean safeguards for data subjects’ rights, freedoms and legitimate interests would have to be in place in cases where a significant decision in relation to a data subject was taken based predominantly, rather than solely, on automated processing.

Amendment 5, page 19, line 12, after “solely” insert “or partly”.

This amendment would mean that the protections provided for by the new Article 22C would apply where a decision is based either solely or partly on automated processing, not only where it is based solely on such processing.

Amendment 233, page 19, line 18, at end insert

“including the reasons for the processing.”

This amendment would require data controllers to provide the data subject with the reasons for the processing of their data in cases where a significant decision in relation to a data subject was taken based on automated processing.

Amendment 225, page 19, line 18, at end insert—

“(aa) require the controller to inform the data subject when a decision described in paragraph 1 has been taken in relation to the data subject;”.

Amendment 221, page 20, line 3, at end insert—

“7. When exercising the power to make regulations under this Article, the Secretary of State must have regard to the following statement of principles:

Digital information principles at work

1. People should have access to a fair, inclusive and trustworthy digital environment at work.

2. Algorithmic systems should be designed and used to achieve better outcomes: to make work better, not worse, and not for surveillance. Workers and their representatives should be involved in this process.

3. People should be protected from unsafe, unaccountable and ineffective algorithmic systems at work. Impacts on individuals and groups must be assessed in advance and monitored, with reasonable and proportionate steps taken.

4. Algorithmic systems should not harm workers’ mental or physical health, or integrity.

5. Workers and their representatives should always know when an algorithmic system is being used, how and why it is being used, and what impacts it may have on them or their work.

6. Workers and their representatives should be involved in meaningful consultation before and during use of an algorithmic system that may significantly impact work or people.

7. Workers should have control over their own data and digital information collected about them at work.

8. Workers and their representatives should always have an opportunity for human contact, review and redress when an algorithmic system is used at work where it may significantly impact work or people. This includes a right to a written explanation when a decision is made.

9. Workers and their representatives should be able to use their data and digital technologies for contact and association to improve work quality and conditions.

10. Workers should be supported to build the information, literacy and skills needed to fulfil their capabilities through work transitions.”

This amendment would insert into new Article 22D of the UK GDPR a requirement for the Secretary of State to have regard to the statement of digital information principles at work when making regulations about automated decision-making.

Amendment 4, in clause 15, page 25, line 4, at end insert

“(including in the cases specified in sub-paragraphs (a) to (c) of paragraph 3 of Article 35)”.

This amendment, together with Amendment 1, would provide a definition of what constitutes “high risk processing” for the purposes of applying Articles 27A, 27B and 27C, which require data controllers to designate, and specify the duties of, a “senior responsible individual” with responsibility for such processing.

Government amendments 18 to 44.

Amendment 12, page 32, line 7, leave out clause 17.

This amendment keeps the current requirement on police in the Data Protection Act 2018 to justify why they have accessed an individual’s personal data.

Amendment 1, in clause 18, page 32, line 18, leave out paragraph (c) and insert—

“(c) omit paragraph 2,

(ca) in paragraph 3—

(i) for “data protection” substitute “high risk processing”,

(ii) in sub-paragraph (a), for “natural persons” substitute “individuals”,

(iii) in sub-paragraph (a) for “natural person” substitute “individual” in both places where it occurs,

(cb) omit paragraphs 4 and 5,”.

This amendment would leave paragraph 3 of Article 35 of the UK GDPR in place (with amendments reflecting amendments made by the Bill elsewhere in the Article), thereby ensuring that there is a definition of “high risk processing” on the face of the Regulation.

Amendment 226, page 39, line 38, leave out clause 26.

Amendment 227, page 43, line 2, leave out clause 27.

Amendment 228, page 46, line 32, leave out clause 28.

Government amendment 45.

Amendment 235, page 57, line 29, leave out clause 34.

This amendment would leave in place the existing regime, which refers to “manifestly unfounded” or excessive requests to the Information Commissioner, rather than the proposed change to “vexatious” or excessive requests.

Government amendments 46 and 47.

Amendment 237, in clause 48, page 77, line 4, leave out “individual” and insert “person”.

This amendment and Amendments 238 to 240 are intended to enable the digital verification services covered by the Bill to include verification of organisations as well as individuals.

Amendment 238, page 77, line 5, leave out “individual” and insert “person”.

See explanatory statement to Amendment 237.

Amendment 239, page 77, line 6, leave out “individual” and insert “person”.

See explanatory statement to Amendment 237.

Amendment 240, page 77, line 7, leave out “individual” and insert “person”.

See explanatory statement to Amendment 237.

Amendment 241, page 77, line 8, at end insert (on new line)—

“and the facts which may be so ascertained, verified or confirmed may include the fact that an individual has a claimed connection with a legal person.”

This amendment would ensure that the verification services covered by the Bill will include verification that an individual has a claimed connection with a legal person.

Government amendments 48 to 50.

Amendment 280, in clause 49, page 77, line 13, at end insert—

“(2A) The DVS trust framework must include a description of how the provision of digital verification services is expected to uphold the Identity Assurance Principles.

(2B) Schedule (Identity Assurance Principles) describes each Identity Assurance Principle and its effect.”

Amendment 281, page 77, line 13, at end insert—

“(2A) The DVS trust framework must allow valid attributes to be protected by zero-knowledge proof and other decentralised technologies, without restriction upon how and by whom those proofs may be held or processed.”

Government amendments 51 to 66.

Amendment 248, in clause 52, page 79, line 7, at end insert—

“(1A) A determination under subsection (1) may specify an amount which is tiered to the size of the person and its role as specified in the DVS trust framework.”

This amendment would enable fees for application for registration in the DVS register to be determined on the basis of the size and role of the organisation applying to be registered.

Amendment 243, page 79, line 8, after “may”, insert “not”.

This amendment would provide that the fee for application for registration in the DVS register could not exceed the administrative costs of determining the application.

Government amendment 67.

Amendment 244, page 79, line 13, after “may”, insert “not”.

This amendment would provide that the fee for continued registration in the DVS register could not exceed the administrative costs of that registration.

Government amendment 68.

Amendment 245, page 79, line 21, at end insert—

“(10) The fees payable under this section must be reviewed every two years by the National Audit Office.”

This amendment would provide that the fees payable for DVS registration must be reviewed every two years by the NAO.

Government amendments 69 to 77.

Amendment 247, in clause 54, page 80, line 38, after “person”, insert “or by other parties”.

This amendment would enable others, for example independent experts, to make representations about a decision to remove a person from the DVS register, as well as the person themselves.

Amendment 246, page 81, line 7, at end insert—

“(11) The Secretary of State may not exercise the power granted by subsection (1) until the Secretary of State has consulted on proposals for how a decision to remove a person from the DVS register will be reached, including—

(a) how information will be collected from persons impacted by a decision to remove the person from the register, and from others;

(b) how complaints will be managed;

(c) how evidence will be reviewed;

(d) what the burden of proof will be on which a decision will be based.”

This amendment would provide that the power to remove a person from the DVS register could not be exercised until the Secretary of State had consulted on the detail of how a decision to remove would be reached.

Government amendments 78 to 80.

Amendment 249, in clause 62, page 86, line 17, at end insert—

“(3A) A notice under this section must give the recipient of the notice an opportunity to consult the Secretary of State on the content of the notice before providing the information required by the notice.”

This amendment would provide an option for consultation between the Secretary of State and the recipient of an information notice before the information required by the notice has to be provided.

Government amendment 81.

Amendment 242, in clause 63, page 87, line 21, leave out “may” and insert “must”.

This amendment would require the Secretary of State to make arrangements for a person to exercise the Secretary of State’s functions under this Part of the Bill, so that an independent regulator would perform the relevant functions and not the Secretary of State.

Amendment 250, in clause 64, page 87, line 34, at end insert—

“(1A) A report under subsection (1) must include a report on any arrangements made under section 63 for a third party to exercise functions under this Part.”

This amendment would require information about arrangements for a third party to exercise functions under this Part of the Bill to be included in the annual reports on the operation of the Part.

Government amendments 82 to 196.

Amendment 6, in clause 83, page 107, leave out from line 26 to the end of line 34 on page 108.

This amendment would leave out the proposed new regulation 6B of the PEC Regulations, which would enable consent to be given, or an objection to be made, to cookies automatically.

Amendment 217, page 109, line 20, leave out clause 86.

This amendment would leave out the clause which would enable the sending of direct marketing electronic mail on a “soft opt-in” basis.

Amendment 218, page 110, line 1, leave out clause 87.

This amendment would remove the clause which would enable direct marketing for the purposes of democratic engagement. See also Amendment 220.

Government amendments 253 to 255.

Amendment 219, page 111, line 6, leave out clause 88.

This amendment is consequential on Amendment 218.

Government amendments 256 to 265.

Amendment 7, in clause 89, page 114, line 12, at end insert—

“(2A) A provider of a public electronic communications service or network is not required to intercept or examine the content of any communication in order to comply with their duty under this regulation.”

This amendment would clarify that a public electronic communications service or network is not required to intercept or examine the content of any communication in order to comply with their duty to notify the Commissioner of unlawful direct marketing.

Amendment 8, page 117, line 3, at end insert—

“(5) In regulation 1—

(a) at the start, insert “(1)”;

(b) after “shall”, insert “save for regulation 26A”;

(c) at end, insert—

“(2) Regulation 26A comes into force six months after the Commissioner has published guidance under regulation 26C (Guidance in relation to regulation 26A).””

This amendment would provide for the new regulation 26A, Duty to notify Commissioner of unlawful direct marketing, not to come into force until six months after the Commissioner has published guidance in relation to that duty.

Government amendment 197.

Amendment 251, in clause 101, page 127, line 3, leave out “and deaths” and insert “, deaths and deed polls”.

This amendment would require deed poll information to be kept to the same standard as records of births and deaths.

Amendment 9, page 127, line 24, at end insert—

“(2A) After section 25, insert—

“25A Review of form in which registers are to be kept

(1) The Secretary of State must commission a review of the provisions of this Act and of related legislation, with a view to the creation of a single digital register of births and deaths.

(2) The review must consider and make recommendations on the effect of the creation of a single digital register on—

(a) fraud,

(b) data collection, and

(c) ease of registration.

(3) The Secretary of State must lay a report of the review before each House of Parliament within six months of this section coming into force.””

This amendment would insert a new section into the Births and Deaths Registration Act 1953 requiring a review of relevant legislation, with consideration of creating a single digital register for registered births and registered deaths and recommendations on the effects of such a change on reducing fraud, improving data collection and streamlining digital registration.

Government amendment 198.

Amendment 229, in clause 112, page 135, line 8, leave out subsections (2) and (3).

Amendment 10, in clause 113, page 136, line 35, leave out

“which allows or confirms the unique identification of that individual”.

This amendment would amend the definition of “biometric data” for the purpose of the oversight of law enforcement biometrics databases so as to extend the protections currently in place for biometric data for identification to include biometric data for the purpose of classification.

Government amendments 199 to 207.

Government new schedule 1—Power to require information for social security purposes.

Government new schedule 2—National Underground Asset Register: monetary penalties.

New schedule 3—Identity Assurance Principles

“Part 1

Definitions

1 These Principles are limited to the processing of Identity Assurance Data (IdA Data) in an Identity Assurance Service (e.g. establishing and verifying the identity of a Service User; conducting a transaction that uses a user identity; maintaining audit requirements in relation to a transaction associated with the use of a service that needs identity verification etc.). They do not cover, for example, any data used to deliver a service, or to measure its quality.

2 In the context of the application of the Identity Assurance Principles to an Identity Assurance Service, “Identity Assurance Data” (“IdA Data”) means any recorded information that is connected with a “Service User” including—

“Audit Data.” This includes any recorded information that is connected with any log or audit associated with an Identity Assurance Service.

“General Data.” This means any other recorded information which is not personal data, audit data or relationship data, but is still connected with a “Service User”.

“Personal Data.” This takes its meaning from the Data Protection Act 2018 or subsequent legislation (e.g. any recorded information that relates to a “Service User” who is also an identified or identifiable living individual).

“Relationship Data.” This means any recorded information that describes (or infers) a relationship between a “Service User”, “Identity Provider” or “Service Provider” with another “Service User”, “Identity Provider” or “Service Provider” and includes any cookie or program whose purpose is to supply a means through which relationship data are collected.

3 Other terms used in relation to the Principles are defined as follows—

“Identity Assurance Service.” This includes relevant applications of the technology (e.g. hardware, software, database, documentation) in the possession or control of any “Service User”, “Identity Provider” or “Service Provider” that is used to facilitate identity assurance activities; it also includes any IdA Data processed by that technology or by an Identity Provider or by a Service Provider in the context of the Service; and any IdA Data processed by the underlying infrastructure for the purpose of delivering the IdA service or associated billing, management, audit and fraud prevention.

“Identity Provider.” This means the certified individual or certified organisation that provides an Identity Assurance Service (e.g. establishing an identity, verification of identity); it includes any agent of a certified Identity Provider that processes IdA data in connection with that Identity Assurance Service.

“Participant.” This means any “Identity Provider”, “Service Provider” or “Service User” in an Identity Assurance Service. A “Participant” includes any agent by definition.

“Processing.” In the context of IdA data means “collecting, using, disclosing, retaining, transmitting, copying, comparing, corroborating, correlating, aggregating, accessing” the data and includes any other operation performed on IdA data.

“Provider.” Includes both “Identity Provider” and/or “Service Provider”.

“Service Provider.” This means the certified individual or certified organisation that provides a service that uses an Identity Provider in order to verify identity of the Service User; it includes any agent of the Service Provider that processes IdA data from an Identity Assurance Service.

“Service User.” This means the person (i.e. an organisation (incorporated or not)) or an individual (dead or alive) who has established (or is establishing) an identity with an Identity Provider; it includes an agent (e.g. a solicitor, family member) who acts on behalf of a Service User with proper authority (e.g. a public guardian, or a Director of a company, or someone who possesses power of attorney). The person may be living or deceased (the identity may still need to be used once its owner is dead, for example by an executor).

“Third Party.” This means any person (i.e. any organisation or individual) who is not a “Participant” (e.g. the police or a Regulator).

Part 2

The Nine Identity Assurance Principles

Any exemptions from these Principles must be specified via the “Exceptional Circumstances Principle”. (See Principle 9).

1 User Control Principle

Statement of Principle: “I can exercise control over identity assurance activities affecting me and these can only take place if I consent or approve them.”

1.1 An Identity Provider or Service Provider must ensure any collection, use or disclosure of IdA data in, or from, an Identity Assurance Service is approved by each particular Service User who is connected with the IdA data.

1.2 There should be no compulsion to use the Identity Assurance Service and Service Providers should offer alternative mechanisms to access their services. Failing to do so would undermine the consensual nature of the service.

2 Transparency Principle

Statement of Principle: “Identity assurance can only take place in ways I understand and when I am fully informed.”

2.1 Each Identity Provider or Service Provider must be able to justify to Service Users why their IdA data are processed. Ensuring transparency of activity and effective oversight through auditing and other activities inspires public trust and confidence in how their details are used.

2.2 Each Service User must be offered a clear description about the processing of IdA data in advance of any processing. Identity Providers must be transparent with users about their particular models for service provision.

2.3 The information provided includes a clear explanation of why any specific information has to be provided by the Service User (e.g. in order that a particular level of identity assurance can be obtained) and identifies any obligation on the part of the Service User (e.g. in relation to the User’s role in securing his/her own identity information).

2.4 The Service User will be able to identify which Service Provider they are using at any given time.

2.5 Any subsequent and significant change to the processing arrangements that have been previously described to a Service User requires the prior consent or approval of that Service User before it comes into effect.

2.6 All procedures, including those involved with security, should be made publicly available at the appropriate time, unless such transparency presents a security or privacy risk. For example, the standards of encryption can be identified without jeopardy to the encryption keys being used.

3 Multiplicity Principle

Statement of Principle: “I can use and choose as many different identifiers or identity providers as I want to.”

3.1 A Service User is free to use any number of identifiers that each uniquely identifies the individual or business concerned.

3.2 A Service User can use any of his identities established with an Identity Provider with any Service Provider.

3.3 A Service User shall not be obliged to use any Identity Provider or Service Provider not chosen by that Service User; however, a Service Provider can require the Service User to provide a specific level of Identity Assurance, appropriate to the Service User’s request to a Service Provider.

3.4 A Service User can choose any number of Identity Providers and where possible can choose between Service Providers in order to meet his or her diverse needs. Where a Service User chooses to register with more than one Identity Provider, Identity Providers and Service Providers must not link the Service User’s different accounts or gain information about their use of other Providers.

3.5 A Service User can terminate, suspend or change Identity Provider and where possible can choose between Service Providers at any time.

3.6 A Service Provider does not know the identity of the Identity Provider used by a Service User to verify an identity in relation to a specific service. The Service Provider knows that the Identity Provider can be trusted because the Identity Provider has been certified, as set out in GPG43 – Requirements for Secure Delivery of Online Public Services (RSDOPS).

4 Data Minimisation Principle

Statement of Principle: “My interactions only use the minimum data necessary to meet my needs.”

4.1 Identity Assurance should only be used where a need has been established and only to the appropriate minimum level of assurance.

4.2 Identity Assurance data processed by an Identity Provider or a Service Provider to facilitate a request of a Service User must be the minimum necessary in order to fulfil that request in a secure and auditable manner.

4.3 When a Service User stops using a particular Identity Provider, their data should be deleted. Data should be retained only where required for specific targeted fraud, security or other criminal investigation purposes.

5 Data Quality Principle

Statement of Principle: “I choose when to update my records.”

5.1 Service Providers should enable Service Users (or authorised persons, such as the holder of a Power of Attorney) to update their own personal data, at a time of their choosing, free of charge and in a simple and easy manner.

5.2 Identity Providers and Service Providers must take account of the appropriate level of identity assurance required before allowing any updating of personal data.

6 Service User Access and Portability Principle

Statement of Principle: “I have to be provided with copies of all of my data on request; I can move/remove my data whenever I want.”

6.1 Each Identity Provider or Service Provider must allow, promptly, on request and free of charge, each Service User access to any IdA data that relates to that Service User.

6.2 It shall be unlawful to make it a condition of doing anything in relation to a Service User to request or require that Service User to request IdA data.

6.3 The Service User must be able to require an Identity Provider to transfer his personal data, to a second Identity Provider in a standard electronic format, free of charge and without impediment or delay.

7 Certification Principle

Statement of Principle: “I can have confidence in the Identity Assurance Service because all the participants have to be certified against common governance requirements.”

7.1 As a baseline control, all Identity Providers and Service Providers will be certified against a shared standard. This is one important way of building trust and confidence in the service.

7.2 As part of the certification process, Identity Providers and Service Providers are obliged to co-operate with the independent Third Party and accept their impartial determination and to ensure that contractual arrangements—

• reinforce the application of the Identity Assurance Principles

• contain a reference to the independent Third Party as a mechanism for dispute resolution.

7.3 In the context of personal data, certification procedures include the use of Privacy Impact Assessments, Security Risk Assessments, Privacy by Design concepts and, in the context of information security, a commitment to using appropriate technical measures (e.g. encryption) and ever improving security management. Wherever possible, such certification processes and security procedures reliant on technical devices should be made publicly available at the appropriate time.

7.4 All Identity Providers and Service Providers will take all reasonable steps to ensure that a Third Party cannot capture IdA data that confirms (or infers) the existence of relationship between any Participant. No relationships between parties or records should be established without the consent of the Service User.

7.5 Certification can be revoked if there is significant non-compliance with any Identity Assurance Principle.

8 Dispute Resolution Principle

Statement of Principle: “If I have a dispute, I can go to an independent Third Party for a resolution.”

8.1 A Service User who, after a reasonable time, cannot, or is unable, to resolve a complaint or problem directly with an Identity Provider or Service Provider can call upon an independent Third Party to seek resolution of the issue. This could happen for example where there is a disagreement between the Service User and the Identity Provider about the accuracy of data.

8.2 The independent Third Party can resolve the same or similar complaints affecting a group of Service Users.

8.3 The independent Third Party can co-operate with other regulators in order to resolve problems and can raise relevant issues of importance concerning the Identity Assurance Service.

8.4 An adjudication/recommendation of the independent Third Party should be published. The independent Third Party must operate transparently, but detailed case histories should only be published subject to appropriate review and consent.

8.5 There can be more than one independent Third Party.

8.6 The independent Third Party can recommend changes to standards or certification procedures or that an Identity Provider or Service Provider should lose their certification.

9 Exceptional Circumstances Principle

Statement of Principle: “Any exception has to be approved by Parliament and is subject to independent scrutiny.”

9.1 Any exemption from the application of any of the above Principles to IdA data shall only be lawful if it is linked to a statutory framework that legitimises all Identity Assurance Services, or an Identity Assurance Service in the context of a specific service. In the absence of such a legal framework, alternative measures must be taken to ensure transparency, scrutiny and accountability for any exceptions.

9.2 Any exemption from the application of any of the above Principles that relates to the processing of personal data must also be necessary and justifiable in terms of one of the criteria in Article 8(2) of the European Convention on Human Rights: namely in the interests of national security, public safety or the economic well-being of the country, for the prevention of disorder or crime, for the protection of health or morals, or for the protection of the rights and freedoms of others.

9.3 Any subsequent processing of personal data by any Third Party who has obtained such data in exceptional circumstances (as identified by Article 8(2) above) must be the minimum necessary to achieve that (or another) exceptional circumstance.

9.4 Any exceptional circumstance involving the processing of personal data must be subject to a Privacy Impact Assessment by all relevant “data controllers” (where “data controller” takes its meaning from the Data Protection Act).

9.5 Any exemption from the application of any of the above Principles in relation to IdA data shall remain subject to the Dispute Resolution Principle.”

Amendment 220, in schedule 1, page 141, leave out from line 21 to the end of line 36 on page 144.

This amendment would remove from the new Annex 1 of the UK GDPR provisions which would enable direct marketing for the purposes of democratic engagement. See also Amendment 218.

Government amendments 266 to 277.

Government amendments 208 to 211.

Amendment 15, in schedule 5, page 154, line 2, at end insert—

“(g) the views of the Information Commission on suitability of international transfer of data to the country or organisation.”

This amendment requires the Secretary of State to seek the views of the Information Commission on whether a country or organisation has met the data protection test for international data transfer.

Amendment 14, page 154, line 25, at end insert—

“5. In relation to special category data, the Information Commissioner must assess whether the data protection test is met for data transfer to a third country or international organisation.”

This amendment requires the Information Commission to assess suitability for international transfer of special category data to a third country or international organisation.

Amendment 13, page 154, line 30, leave out “ongoing” and insert “annual”.

This amendment mandates that a country’s suitability for international transfer of data is monitored on an annual basis.

Amendment 16, in schedule 6, page 162, line 36, at end insert—

“(g) the views of the Information Commission on suitability of international transfer of data to the country or organisation.”

This amendment requires the Secretary of State to seek the views of the Information Commission on whether a country or organisation has met the data protection test for international data transfer in relation to law enforcement processing.

Government amendment 212.

Amendment 231, in schedule 13, page 202, line 33, at end insert—

“(2A) A person may not be appointed under sub-paragraph (2) unless the Science, Innovation and Technology Committee of the House of Commons has endorsed the proposed appointment.”

This amendment would ensure that non-executive members of the Information Commission may not be appointed unless the Science, Innovation and Technology Committee has endorsed the Secretary of State’s proposed appointee.

Government amendments 213 to 216.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

The current one-size-fits-all, top-down approach to data protection that we inherited from the European Union has led to public confusion, which has impeded the effective use of personal data to drive growth and competition, and to support key innovations. The Bill seizes on a post-Brexit opportunity to build on our existing foundations and create an innovative, flexible and risk-based data protection regime. This bespoke model will unlock the immense possibilities of data use to improve the lives of everyone in the UK, and help make the UK the most innovative society in the world through science and technology.

I want to make it absolutely clear that the Bill will continue to maintain the highest standards of data protection that the British people rightly expect, but it will also help those who use our data to make our lives healthier, safer and more prosperous. That is because we have convened industry leaders and experts to co-design the Bill every step of the way. We have held numerous roundtables with both industry experts in the field and campaigning groups. The outcome, I believe, is that the legislation will ensure our regulation reflects the way real people live their lives and run their businesses.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I do have a note on interface bodies, which I am happy to share for the benefit of my hon. Friend. However, he will be aware that this is a technical and complicated area. If he wants to pursue a further discussion, I would of course be happy to oblige. I can tell him that the amendments will ensure that smart data schemes can replicate and build on the open banking model by allowing the Government to require interface bodies to be set up by members of the scheme. Interface bodies will play a similar role to that of the open banking implementation entity, developing common standards on arrangements for data sharing. Learning from the lessons and successes of the open banking regime, regulations will be able to specify the responsibilities and requirements for interface bodies and ensure appropriate accountability to regulators. I hope that that goes some way to addressing the point that he makes, but I would be happy to discuss it further with him in due course.

I believe these amendments will generally improve the functioning of the Bill and address some specific concerns that I have identified. On that basis, I commend them to the House.

Baroness Winterton of Doncaster Portrait Madam Deputy Speaker (Dame Rosie Winterton)
- Hansard - -

I call the shadow Minister.

Chris Bryant Portrait Sir Chris Bryant
- Hansard - - - Excerpts

As I am feeling generous, I shall start with the nice bits where we agree with the Government. First, we completely agree with the changes to the Information Commissioner’s Office: strengthening the ICO’s enforcement powers, restructuring the ICO and providing a clearer framework of objectives. As the Minister knows, we have always been keen to strengthen the independence of the ICO, and we were concerned that the Government were taking new interventionist powers—that is quite a theme in this Bill—in clause 33. We therefore welcome Government amendment 45, which achieves a much better balance between democratic oversight and ICO independence, and we thank the Minister for that.

Labour also welcomes part 2 of the Bill, as amended in Committee, establishing a digital verification framework. My concern, however, is that the Government have underestimated the sheer technicality of such an endeavour, hence the last-minute requirement for tens of Government amendments to this part of the Bill, which I note the Minister keeps on referring to as being very technical and therefore best to be debated in another place at another time with officials present. Under Government amendment 52, for example, different rules will be established for different digital verification services, and I am not quite sure whether that will stand the test of the House of Lords.

We warmly welcome and support part 3 of the Bill, which has just been referred to by the hon. Member for Weston-super-Mare (John Penrose) and the Minister, and its provisions on smart data. Indeed, we and many industry specialists have been urging the Government to go much faster in this particular area. The potential for introducing smart data schemes is vast: empowering consumers to make financial decisions that better suit them, enabling innovation and delivering better products and services. Most notably, that has already happened in relation to financial services. Many people will not know that that is what they are using when they use software that accesses several different bank accounts, but that is what they are doing.

In the autumn statement, the Government pledged to kickstart a smart data big bang. One area where smart data has been most effective is in open finance—it is right that we expand these provisions into new areas to have a greater social impact—but, to quote the Financial Conduct Authority, it should be implemented there

“in a proportionate phased manner, ideally driven by consideration of credible consumer propositions and use-cases.”

Furthermore, the FCA does not think that a big bang approach to open finance is feasible or desirable. Nevertheless, many of the Government amendments to the suite of smart data provisions are technical, and indicate a move in the right direction. In particular, we hope that, with smart data enabling greater access by consumers to information about green options and net zero, we will be able to help the whole of the UK to move towards net zero.

I want to say a few words on part 4, on cookies and nuisance calls. We share a lot of the Government’s intentions on tackling those issues and the births and deaths register. As a former registrar, I would like to see tombstoning—the process of fraudulently adopting for oneself the name of a child who has died—brought to an end. That practice is enabled partly because the deaths register does not actually register the death of an individual named on the births register, which I hope will at some point be possible.

Despite the Government’s having sat on the Bill for almost 18 months, with extensive consultations, drafts, amendments and carry-over motions, there are still big practical holes in these measures that need to be addressed. Labour supports the Government’s ambitions to tackle nuisance calls, which are a blight on people’s lives—we all know that. However, I fear that clause 89, which establishes a duty to notify the ICO of unlawful direct marketing, will make little or no difference without the addition of Labour amendments 7 and 8, which would implement those obligations on electronic communications companies when the guidance from the ICO on their practical application has been clearly established. As the Bill stands, that is little more than wishful thinking.

Unfortunately, the story is the same on tackling cookies. We have a bunch of half-baked measures that simply do not deliver as the public will expect them to and as the Government would like them to. We all support reducing cookie fatigue; I am sure that every hon. Member happily clicks “Accept all” whenever a cookie banner comes up—[Interruption.] Well, some Members are much more assiduous than I am in that regard. But the wise Members of the House know perfectly well that the problem is that doing so undermines the whole purpose of cookies. We all support tackling cookie fatigue, because clicking through a new cookie banner every time we load a web page is a waste of everybody’s time and is deeply annoying.

However, the Government’s proposed regulation 6B gives the Secretary of State a blank cheque to make provisions as they see fit, without proper parliamentary scrutiny. That is why we are unhappy with it and have tabled amendment 6, which would remove those powers from the Bill as they are simply not ready to enter the statute book. Yet again I make the point that the Bill repeatedly and regularly gives new powers to the Secretary of State. Sure, they would be implemented by secondary legislation—but as we all know, secondary legislation is unamendable and therefore subject to much less scrutiny. These are areas in which the state is taking significant powers over the public and private individuals.

Let me deal with some of the Labour party’s amendments. First, I take subject access requests. The Government have repeatedly been in the wrong place on those, I am afraid, ever since the introduction of the first iteration of the DPDI Bill under Nadine Dorries, when they tried to charge people for access to their own data. Fortunately, that has now gone the way of Nadine Dorries. [Interruption.] I note that the Minister smiled at that point. We still have concerns about the Government’s plans to change the thresholds for refusing subject access requests from “manifestly unfounded or excessive” to “vexatious or excessive”. The Equality and Human Rights Commission, Reset, the TUC and Which? have all outlined their opposition to the change, which threatens to hollow out what the Government themselves admit is a “critical transparency mechanism”.

We have tabled two simple amendments. Amendment 2 would establish an obligation on any data controller refusing a subject access request to provide evidence of why a request has been considered vexatious or excessive. Organisations should not be allowed to just declare that a request is vexatious or excessive and so ascribe a motive to the data subject in order to refuse to provide their data, perhaps simply because of the inconvenience to the organisation.

The Government will try to tell me that safeguards are in place and that the data subject can make appropriate complaints to the organisation and the ICO if they believe that their request has been wrongly refused. But if we take the provisions set out in clause 9 to extend the time limits on subject access requests, add the advantage for companies of dither and delay when considering procedural complaints, and then add the additional burden on a data subject of having to seek out the ICO and produce evidence and an explanation of their request as well as the alleged misapplication of the vexatious or excessive standard, we see that people could easily be waiting years and years before having the right to access their own data. I cannot believe that, in the end, that is in the interests of good government or that it is really what the Government want.

Despite public opposition to the measures, the Government are also now going further by introducing at this stage amendments that further water down subject access request protections. Government new clauses 7 and 9, which the Minister did not refer to—in fact, he only mentioned, I think, a bare tenth of the amendments he wants us to agree this afternoon—limit a data subject’s entitlement to their own data to the controller’s ability to conduct a “reasonable and proportionate” search. But what is reasonable and proportionate? Who determines what has been a reasonable and proportionate search? The new clauses drive a coach and horses through the rights of people to access their own data and to know who is doing what with their information. That is why Labour does not support the changes.

I come to one of the most important issues for us: high-risk processing, which, as the term suggests, is of most concern when it comes to the rights of individuals. I was pleased but perplexed to see that the Government tabled amendments to new clause 30 that added further clarity about the changed provisions on record keeping for the purposes of high-risk processing. I was pleased because it is right that safeguards should be in place when data processing is deemed to be high risk, but I was perplexed because the Government do not define high-risk processing in the Bill—in fact, they have removed the existing standard for high-risk processing from the existing GDPR, thereby leaving a legislative lacuna for the ICO to fill. That should not be up to the ICO. I know that the ICO himself thinks that it should not be up to him, but should instead be a matter for primary legislation.

Our amendment 1 retains a statutory definition of high-risk processing as recommended by the ICO in his response to the Bill, published in May. He said:

“the detail in Article 35 (3) was a helpful and clear legislative backstop.”

That is why he supports what we are suggesting. Our amendment 4 would also clarify those individual rights even further, by again providing the necessary definition of what constitutes high risk, within the new provisions concerning the responsibilities of senior responsible individuals for data processing set out in clause 15.

I turn to automated decision making, which has the potential to deliver increasingly personalised and efficient services, to increase productivity, and to reduce administrative hurdles. While most of the world is making it harder to make decisions exclusively using ADM, clause 12 in the Bill extends the potential for automated decision making in the UK. Yet countless research projects have shown that automated decision making and machine decision making are not as impartial or blind as they sound. Algorithms can harbour and enhance inbuilt prejudices and injustices. Of course we cannot bury our heads in the sand and pretend that the technology will not be implemented or that we can legislate it out of use; we should be smart about ADM and try to unlock its potential while mitigating its potential dangers. Where people’s livelihoods are at risk or where decisions are going to have a significant impact, it is essential that extra protections are in place allowing individuals to contest decisions and secure human review as a fundamental backstop.

Our amendment 5 strikes a better balance by extending the safeguarding provisions to include significant decisions that are based either partly or solely on automated processing; I am very hopeful that the Government will accept our amendment. That means greater safeguards for anybody subject to an automated decision-making process, however that decision is made. It cannot just be a matter of “the computer says no.”

I think the Minister is slightly surprised that we are concerned about democratic engagement, but I will explain. The Bill introduces several changes to electoral practices under the guise of what the Government call “democratic engagement”, most notably through clauses 86 and 87. The former means that any political party or elected representative could engage in direct marketing relying on a soft opt-in procedure, while clause 87 allows the Secretary of State to make any future exemptions and changes to direct marketing rules for the largely unspecified purposes of “democratic engagement”.

The Ada Lovelace Institute and the Internet Advertising Bureau have raised concerns about that, and in Committee Labour asked the Minister what the Government had in mind. He rather gave the game away when he wrote to my hon. Friend the Member for Barnsley East (Stephanie Peacock), to whom I pay tribute for the way she took the Bill through the Committee:

“A future government may want to encourage democratic engagement in the run up to an election by temporarily ‘switching off’ some of the direct marketing rules.”

Switching off the rules ahead of an election—does anyone else smell a rat?

--- Later in debate ---
I say to those in the other place as well as to those on the Front Benches that we have not been able to go through this in detail, but we should think about it incredibly hard. It might seem an esoteric and arcane matter, but it is not. People might not currently be interested in the ins and outs of how AI and data work, but in future you can bet your bottom dollar that AI and data will be interested in them. I urge the Government to work with us to get this right.
Baroness Winterton of Doncaster Portrait Madam Deputy Speaker (Dame Rosie Winterton)
- View Speech - Hansard - -

I have now to announce the result of today’s deferred Division on the Draft Strikes (Minimum Service Levels: NHS Ambulance Services and the NHS Patient Transport Service) Regulations 2023. The Ayes were 297 and the Noes were 166, so the Ayes have it.

[The Division list is published at the end of today’s debates.]

Stephen Timms Portrait Sir Stephen Timms
- View Speech - Hansard - - - Excerpts

I rise to speak specifically to Government new clause 34 and connected Government amendments which, as we have been reminded, give Ministers power to inspect the bank accounts of anyone claiming a social security benefit. I think it has been confirmed that that includes child benefit and the state pension, as well as universal credit and all the others. Extremely wide powers are being given to Ministers.

The Minister told us that the measure is expected to save some half a billion pounds over the next five years. I was pleased that the Minister for Disabled People, Health and Work was present at the start of the debate, although he is not now in his place and the Department for Work and Pensions is not hearing the concerns expressed about this measure. The Minister for Data and Digital Infrastructure told us that the Minister for Disabled People, Health and Work will not be speaking in the debate, so we will not hear what the DWP thinks about these concerns.

We have also been told—I had not seen this assurance—that these powers will not be used for a few years. If that is correct, I am completely mystified by why this is being done in such a way. If we had a few years to get these powers in place, why did the Government not wait until there was some appropriate draft legislation that could be properly scrutinised, rather than bringing such measures forward now with zero Commons scrutiny and no opportunity for that to occur? There will no doubt be scrutiny in the other place, but surely a measure of this kind ought to undergo scrutiny in this House.

I chair the Work and Pensions Committee and we have received substantial concerns about this measure, including from Citizens Advice. The Child Poverty Action Group said that

“it shouldn’t be that people have fewer rights, including to privacy, than everyone else in the UK simply because they are on benefits.”

I think that sums up what a lot of people feel, although it appears to be the position that the Government are now taking. It is surprising that the Conservative party is bringing forward such a major expansion of state powers to pry into the affairs of private citizens, and particularly doing so in such a way that we are not able to scrutinise what it is planning. As we have been reminded, the state has long had powers where there were grounds for suspecting that benefit fraud had been committed. The proposal in the Bill is for surveillance where there is absolutely no suspicion at all, which is a substantial expansion of the state’s powers to intrude.

Annabel Denham, deputy comment editor at The Daily Telegraph, warned in The Spectator of such a measure handing

“authorities the power to snoop on people’s bank accounts.”

I suspect that the views expressed there are more likely to find support on the Conservative Benches than on the Labour Benches, so I am increasingly puzzled by why the Government think this is an appropriate way to act. I wonder whether the fact that there have been such warnings prompted Ministers to rush through the measure in this deeply unsatisfactory way, without an opportunity for proper scrutiny, because they thought that if there had been parliamentary scrutiny there would be substantial opposition from the Conservative Benches as well as from the Labour Benches. It is difficult to understand otherwise why it is being done in this way.

As we have been reminded, new clause 34 will give the Government the right to inspect the bank account of anyone who claims a state pension, which is all of us. It will give the Government the right to look into the bank account of every single one of us at some point during our lives, without suspecting that we have ever done anything wrong, and without telling us that they are doing it. The Minister said earlier that the powers of the state should be limited to those absolutely necessary, and I have always understood that to be a principle of the Conservative party. Yet on the power in the new clause to look into the bank account of everybody claiming a state pension, he was unable to give us any reason why the Government should do such a thing, or why they would ever need to look into the bank accounts of people—everybody—claiming a state pension. What on earth would the Government need to do that for? The entitlement to the state pension is not based on income, savings or anything like that, so why would the Government ever wish to do that?

If we cannot think of a reason why the Government would want to do that, why are they now taking the power to enable them to do so? I think that all of us would agree, whatever party we are in, that the powers of the state should be limited to those absolutely necessary. The power in the new clause is definitely not absolutely necessary. Indeed, no one has been able to come up with any reason for why it would ever be used.