Data Protection and Digital Information (No. 2) Bill (Fourth sitting) Debate

Carol Monaghan (Glasgow North West) (SNP)

I rise to speak to my amendment 120. The explanatory notes to the Bill clarify that newly permitted automated decisions will not require the existing legal safeguard of notification, stating only:

“Where appropriate, this may include notifying data subjects after such a decision has been taken”.

Clause 11 would replace article 22 of the GDPR, which regulates AI decision making, with new articles 22A to 22D. According to Connected by Data, it is built on the faulty assumption that the people who are affected by automated decision making are data subjects—identifiable individuals within the data used to make the automated decision. However, now that AI decisions can be based on information about other people, it is becoming increasingly common for algorithms created through training on one set of people to be used to reach conclusions about another set.

A decision can be based on seemingly innocuous information such as someone’s postcode or whether they liked a particular tweet. Where such a decision has an impact on viewing recommendations for an online player, we would probably not be that concerned, but personal data is being used more and more to make decisions that affect whole groups of people rather than identified individuals. We need no reminding of the controversy that ensued when Ofqual used past exam results to grade students during the pandemic.

Another example might be an electricity company getting data from its customers about home energy consumption. Based on that data, it could automatically adjust the time of day at which it offered cheaper tariffs. Everyone who used the electricity company would be affected, whether data about their energy consumption patterns were used to make the decision or not. It is whether an automated decision has a legal or similarly significant effect on an individual that should be relevant to their rights around automated decision making.

Many of the rights and interests of decision subjects are protected through the Equality Act 2010, as the Committee heard in oral evidence last week. What is not covered by other legislation, however, is how data can be used in automated decisions, and the rights of decision subjects to be informed about, to control and to seek redress for automated decisions that have a significant effect on them. According to Big Brother Watch:

“This is an unacceptable dilution of a critical safeguard that will not only create uncertainty for organisations seeking to comply, but could lead to vastly expanded ADM operating with unprecedented opacity.”

Amendment 120 would require a data controller to inform a data subject whenever a significant decision about that subject was based solely on automated processing. I am pleased that the hon. Member for Barnsley East has tabled a similar amendment, which I support.

Sir John Whittingdale

The Government absolutely share hon. Members’ view of the importance of transparency. We agree that individuals who are subject to automated decision making should be made aware of it and should have information about the available safeguards. However, we feel that those requirements are already built into the Bill via article 22C, which will ensure that individuals are provided with information as soon as is practicable after such decisions have been taken. This will need to include relevant information that an individual would require to contest such decisions and seek human review of them.

The reforms that we propose take an outcome-focused approach to ensure that data subjects receive the right information at the right time. The Information Commissioner’s Office will play an important role in developing guidance on what that will entail in different circumstances.

--- Later in debate ---
The Chair

Ms Monaghan, do you wish to move amendment 120 formally?

Carol Monaghan

I will not move it formally, Mr Hollobone, but I may bring it back on Report.

Stephanie Peacock

I beg to move amendment 76, in clause 11, page 19, line 34, at end insert—

“5A. The Secretary of State may not make regulations under paragraph 5 unless—

(a) following consultation with such persons as the Secretary of State considers appropriate, the Secretary of State has published an assessment of the impact of the change to be made by the regulations on the rights and freedoms of data and decision subjects (with particular reference to children),

(b) the Commissioner has reviewed the Secretary of State’s statement and published a statement of the Commissioner’s views on whether the change should be made, with reasons, and

(c) the Secretary of State has considered whether to proceed with the change in the light of the Commissioner’s statement.”

This amendment would make the Secretary of State’s ability to amend the safeguards for automated decision-making set out in new Articles 22A to D subject to a requirement for consultation with interested parties and with the Information Commissioner, who would be required to publish their views on any proposed change.

--- Later in debate ---
Carol Monaghan

Of course, the reports on incidents such as those at Fishmongers’ Hall and the Manchester Arena pointed to a general lack of effective collaboration between the security services and the police. It was not data that was the issue; it was collaboration.

Sir John Whittingdale

I certainly accept that greater collaboration would also have been beneficial, but there was a problem with data sharing, and that is what the clause is designed to address.

As the hon. Member for Barnsley East will know, law enforcement currently operates under part 3 of the Data Protection Act 2018 when processing data for law enforcement purposes. That means that even when they work together, law enforcement and the intelligence services must each undertake separate assessments of the same processing carried out as part of joint working.