Data Protection Bill [HL] Debate

Department: Home Office

Data Protection Bill [HL]

Baroness Hamwee Excerpts
2nd reading (Hansard - continued): House of Lords
Tuesday 10th October 2017

Lords Chamber
Baroness Hamwee (LD)

My Lords, I, too, thank the Minister for his careful introduction of the Bill, and the organisations and individuals who have briefed us, including the individual who wrote, “It does your head in”. I was glad to hear the assurance that the Bill may—I hope I have this right—with repeated readings come close to comprehension.

At later stages, I hope to focus on Parts 3 and 4 of the Bill, but this evening I make some points about young people and the age of consent. I have to say—I may be out of step with other noble Lords—that I am not entirely convinced that the age of 16 would provide more effective protection than 13. I was struck by the recent launch of a report by the Children’s Commissioner for England. The report contains a jargon-busting guide,

“to give kids more power in digital world”.

The commissioner’s launch paper remarked:

“For children, there is no difference between online and offline life. To them, it’s just life … You wouldn’t drop a 12-year-old in the middle of a big city and expect them to fend for themselves. The same should be true online”.


The jargon-busting guide is intended to help children and teachers negotiate and understand what they are signing up to when they use Facebook, Instagram, YouTube, Snapchat, WhatsApp and so on. It uses simplified terms and conditions—it is acknowledged that it is not a legal document but is designed to be an accessible and child-friendly tool to help children understand their digital rights and make informed choices.

Noble Lords will have received a briefing from the Carnegie UK Trust on digital skills. Among other things, it reminds us that so many young people—I think, actually, that should be “so many people”—are unaware that “delete” does not actually mean “delete”.

I do not think that reaching the age of 14, 15 or 16 would address this. The route of information and education is much more important than a diktat in legislation. I suspect that we could be in danger of being unrealistic about what life is like for children and young people these days. We should not ignore public opinion but, quite honestly, times have changed. We will debate both the age threshold and age verification, which is clearly inseparable from this, during the course of the Bill.

Like other noble Lords, I am concerned about public trust and confidence in the system. At the moment there is a need for guidance on preparation for the new regime. I visited a charity last week and asked about the availability and accessibility of advice. The immediate, almost knee-jerk response was, “It’s pretty dire”—followed by comments that most of what is available is about fundraising and that there is a particular lack of advice on how to deal with data relating to children. The comment was made, too, that the legislation is tougher on charities than on the private sector. I have not pinned down whether that is the case, but I do not disbelieve it. The Federation of Small Businesses has made similar points about support for small businesses.

On confidence and trust, my view is that the use of algorithms undermines confidence. This is not an algorithm but perhaps an analogy: we have been made aware recently—“reminded” would be a better term—of the requirement on banks to check the immigration status of account holders. I took part recently in a panel discussion on immigration. The participants’ names were Gambaccini, Siddiq, Qureshi and Hamwee. With those names, although we are all British citizens, I should think that we are pretty suspect. Algorithms will be used by the policing and intelligence communities, among others. My specific question is: have the Government considered independent oversight of this?

My confidence in the system is also not helped by the fact that the data protection principles applied to law enforcement do not include transparency. I am prepared to be told that this is because of the detail of the GDPR, but I find it difficult to understand why there is not transparency subject to some qualifications, given that transparency is within the principles applying in the case of the intelligence services.

“User notification” is another way of talking about transparency and is a significant human rights issue in the context of the right not only to privacy but to effective remedy and a fair trial. I am sure that we will question some of the exemptions and seek more specificity during the course of the Bill.

We are of course accustomed to greater restrictions—or “protections”, depending on your point of view—where national security is concerned, but that does not mean that no information can be released, even if it is broad brush. I wonder whether there is a role for the Intelligence and Security Committee here—not that I would suggest that that would be a complete answer. Again, this is something we might want to explore.

Part of our job is to ensure that the Bill is as clear as possible. I was interested that the report of the committee of the noble Lord, Lord Jay, referred to “white space” and language. It quoted the Information Commissioner, who noted trigger terms such as “high-risk”, “large scale” and “systematic”. Her evidence was that until the new European Data Protection Board and the courts start interpreting the terms,

“it is not clear what the GDPR will look like in practice”.

I found that some of the language of the Bill raised questions in my mind. For instance—I am not asking for a response now; we can do this by way of an amendment later—the term “legitimate” is used in a couple of clauses. Is that wider than “legal”? What is the difference between “necessary” and “strictly necessary”? I do not think that I have ever come across “strictly necessary” in legislation. There are also judgment calls implicit in many of the provisions, including the “appropriate” level of security and processing that is “unwarranted”. By the by, I am intrigued by the airtime given to exams—and by the use of the term “exams”. Back in the day there would certainly have been an amendment to change it to “examinations”; I am not going to table that one.

Finally, I return to the committee report, which has not had as much attention as the Bill. That is a shame, but I am sure we will come back to it as source material. I noted the observation that, post Brexit, there is a risk that, in the Information Commissioner’s words, the UK could find itself,

“outside, pressing our faces on the glass … without influence”,

and yet having,

“adopted fulsomely the GDPR”.

That image could be applied more widely.

Do the Government accept the committee’s recommendation in paragraph 166 that they should start to address retaining UK influence by,

“seeking to secure a continuing role for the Information Commissioner’s Office on the European Data Protection Board”?

My noble friend Lord McNally referred to running up the down escalator, and his alternatives to the Henry VIII clauses are well worth considering—I hope that that does not sound patronising.

This is one of those Bills that is like a forest in the points of principle that it raises. Some of us, I am afraid, will look closely at a lot of the twigs in that forest.