Debates between Brendan O'Hara and Damian Collins during the 2017-2019 Parliament

Data Protection Bill [Lords]
3rd reading and Report stage: House of Commons
Wednesday 9th May 2018
Commons Chamber
Damian Collins

The right hon. Gentleman is absolutely right and that throws up two really important points.

The first point is that the Information Commissioner is also currently investigating this, which links to the right hon. Gentleman’s point about where the money comes from and who the data controllers are in these campaigns. Although Facebook is saying that it will in future change its guidelines so that people running political ads must have their identity and location verified, we know that it is very easy for bad actors to fake those things. It would be pretty easy for anyone in the House to set up a Facebook page or account using a dummy email address that is not linked to a real person: a fake account. That verification is not necessarily as robust as it seems, so we need to know who is running these ads and what their motivation is for doing so.

Secondly, the Information Commissioner is also looking at the holding of political data. It is already an offence for people to harvest and collect data about people’s political opinions or to target them using it without their consent, and it is an offence for organisations that are not registered political parties even to hold such data. If political consultancies are scraping data off social media sites such as Facebook, combining it with other data that helps them to target voters and micro-targeting them with messaging during a political campaign or at any time, there is a question as to whether that is legal now, let alone under the GDPR.

As a country and a society, we have been on a journey over the past few months, and we now understand much more readily how much data is collected about us, how that data is used and how vulnerable it can be to bad actors. Many Facebook users would not have understood that Facebook not only keeps information about what they do on Facebook, but gathers evidence about what non-Facebook users do on the internet and about what Facebook users do on other sites around the internet. It cannot even tell us what proportion of internet sites around the world it gathers such data from. Developers who create games and tools that people use on Facebook harvest data about those users; that data is then largely outside Facebook’s control, and there is little it can do to monitor what happens to it. It can end up in the hands of a discredited and disgraced company like Cambridge Analytica.

These are serious issues. The Bill goes a long way towards providing the sort of enforcement powers we need to act against the bad actors, but they will not stop and neither will we. No doubt there will be further challenges in the future that will require a response from this House.

Brendan O'Hara

I will be very brief, Madam Deputy Speaker, because we are incredibly tight for time.

There is so much in the Bill that I would like to talk about, such as effective immigration control, delegated powers and collective redress, not to mention the achievement of adequacy, but I will concentrate on amendment 5, which appears in my name and those of my hon. Friend the Member for Cumbernauld, Kilsyth and Kirkintilloch East (Stuart C. McDonald) and the hon. Member for Brighton, Pavilion (Caroline Lucas).

The amendment seeks to provide protection for individuals where automated decision making could have an adverse impact on their fundamental rights. It would require that, where human rights are or could be impacted by automated decisions, ultimately, there will always be a human decision maker at the end of the process. It would instil that vital protection of human rights in respect of the general processing of personal data. We believe strongly that automated decision making without human intervention should be subject to strict limitations to promote fairness, transparency and accountability, and to prevent discrimination. As it stands, the Bill provides insufficient safeguards.

I am talking about decisions that are made without human oversight, but that can have long-term, serious consequences for an individual’s health or financial, employment, residential or legal status. As it stands, the Bill will allow law enforcement agencies to make purely automated decisions. That is fraught with danger and we believe it to be at odds not just with the Data Protection Act 1998, but with article 22 of the GDPR, which gives individuals the right not to be subject to a purely automated decision. We understand that there is provision within the GDPR for states to opt out, but that opt-out does not apply if the data subject’s rights, freedoms or legitimate interests are undermined.

I urge the House to support amendment 5 and to make it explicit in the Bill that, where automated processing that could have long-term consequences for an individual’s health or financial, employment or legal status is carried out, a human being will have to decide whether it is reasonable and appropriate to continue. Not only will that human intervention provide transparency and accountability; it will also ensure that the state does not infringe an individual’s fundamental rights and privacy, issues that are often subjective and beyond the scope of an algorithm. We shall press the amendment to the vote this evening.