Data Protection Bill [HL]

Lord Arbuthnot of Edrom Excerpts
2nd reading (Hansard): House of Lords
Tuesday 10th October 2017

Lords Chamber
Lord Arbuthnot of Edrom (Con)

My Lords, it is a pleasure to follow the noble Lord and listen to his important comments on health data and particularly consent. I thought how brave he was with his data machine. I would worry that my pearls of wisdom would disappear somewhere into the ether, but luckily that did not happen to him.

This is a welcome and necessary Bill. It is not perfect, but I leap to its defence in at least one respect, namely the absence of the GDPR regulations themselves from the Bill. On the Government’s website there is a truly helpful document, the Keeling schedule, which sets out how the GDPR intersects with the text of this Bill. After noble Lords have read it a few times, it comes close to being comprehensible.

I will touch on one or two of the imperfections of the Bill that have been drawn to noble Lords’ attention by bodies such as ISACA, techUK, Citibank, Imperial College and others, and I am grateful to them for doing that. I declare my interest as chairman of the Information Assurance Advisory Council and my other interests as in the register. While the Bill has its flaws, I am sure that in Committee and on Report we shall be able to see whether improvements might be made.

The Commission says that the aim of the new rules is to,

“give citizens back control over their personal data, and to simplify the regulatory environment for business”.

The Commission has estimated that this would lead to savings of around €2.3 billion a year for businesses. But while the rules might make things simpler for businesses in that respect, it is possible that they will also make it easier for citizens to demand to know what information is held on them, in paper form as well as in digital form. In fact, that is one of the main purposes of the Bill. So we might find that businesses have more rather than less to do. I wonder whether that has been costed. It is a good thing that citizens should find out what information people hold on them, but we should not pretend that the exercise will be free of cost to businesses. The Federation of Small Businesses estimates an additional cost of £75,000 per year for small businesses, and obviously much more for larger ones.

The Bill contains a bespoke regime for the processing of personal data by the police, prosecutors and other criminal justice agencies for law enforcement purposes. The aim of this, which is laudable, is to,

“ensure that there is a single domestic and trans-national regime for the processing of personal data for law enforcement purposes across the whole of the law enforcement sector”,

but what is the law enforcement sector? To what extent do banks, for example, fall into the law enforcement sector? They have obligations under the anti-money laundering rules to pull suspicions together and to share those across borders—not just across European borders but globally. How are those obligations tied in with the GDPR obligations in the Bill? Businesses, especially banks, will need to understand the interplay between the GDPR regulations, the anti-money laundering regulations and all of the others. The Government would not, I know, want to create the smallest risk that by obeying one set of laws you disobey another.

That sort of legal understanding and pulling things together will take time. It will take money and training for all organisations. There is a real concern that too many organisations are simply hoping for the best and thinking that they will muddle through if they behave sensibly. But that is not behaving sensibly. They need to start now if they have not started already. The Federation of Small Businesses says that:

“For almost all smaller firms, the scope of the changes have not even registered on their radar. They simply aren’t aware of what they will need to do”.
Yet it goes on to say that,

“full guidance for businesses will not be available until next year, potentially as late as spring. The regulator cannot issue their guidance until the European Data Protection Board issue theirs”,

so there is a lot of work to be done.

I shall touch on three other issues at this stage of the Bill. The first is Clause 15, which would allow the application of the GDPR to be altered by regulations subject to the affirmative resolution procedure, which could include the amendment or repeal of any of the derogations contained in the Bill. I share the concern expressed by the noble Baroness, Lady Ludford, on that, and we will need to look at it.

Secondly, there are various issues around consent. The only one that I will mention is that the Bill provides that the age of consent for children using information society services should be 13. The right reverend Prelate the Bishop of Chelmsford mentioned the YouGov survey about that. I actually believe that the Government have this right. It recognises the reality of today’s social media and the opportunities that the digital world brings, and the Bill also protects young people to some extent by the right to have information deleted at the age of 18. TechUK agrees and so does the Information Commissioner. But if the public do not—and from the sounds of the YouGov survey they do not—there is a lot of work to be done in explaining to people why the age of 13 is the right one.

There is a technical issue that I simply do not understand. The GDPR rules state that the minimum age a Government can set for such consent is 13, and in this Bill, as we know, the Government have gone for the minimum, except in Scotland. Scotland is dealt with in Clause 187 of the Bill and there it seems that the minimum age is 12, unless I have this completely wrong. How do the Government square that with the GDPR minimum of 13?

My final point echoes one raised by the noble Lord, Lord McNally, relating to the re-identification of personal data which has been de-identified, as set out in Clause 162. The clause makes it a crime to work out to whom the data refers. The very fact that this clause exists tells us something: namely, that whatever you do online creates some sort of risk. If you think that your data has been anonymised then, according to the computational privacy group at Imperial College, you will be wrong. It says:

“We have currently no reason to believe that an efficient enough, yet general, anonymization method will ever exist for high-dimensional data, as all the evidence so far points to the contrary”.
If that is right, and I believe it is, then de-identification does not really exist. And if that is right, what exactly are we criminalising as re-identification under this clause? In a sense, it is an oxymoron which I think needs very careful consideration. The group at Imperial College goes on to suggest that making re-identification a criminal offence would make things worse, because those working to anonymise data will feel that they do not have to do a particularly good job. After all, re-identifying it would be a criminal offence, so no one will do it. Unfortunately, in my experience that is not entirely the way the world works.

We can come back to all of these issues in Committee and consider them further, and I look forward to the opportunity of doing so. This is not just a worthwhile Bill; it is an essential and timely one, and I wish it well.