All 2 Lord Knight of Weymouth contributions to the Data Protection Act 2018

Read Bill Ministerial Extracts

Tue 10th Oct 2017
Data Protection Bill [HL]
Lords Chamber

2nd reading (Hansard - continued): House of Lords
Mon 6th Nov 2017
Data Protection Bill [HL]
Lords Chamber

Committee: 2nd sitting (Hansard): House of Lords

Data Protection Bill [HL] Debate

Full Debate: Read Full Debate
Department: Home Office

Data Protection Bill [HL]

Lord Knight of Weymouth Excerpts
2nd reading (Hansard - continued): House of Lords
Tuesday 10th October 2017

(6 years, 6 months ago)

Lords Chamber
Read Full debate Data Protection Act 2018 Read Hansard Text Read Debate Ministerial Extracts
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - -

My Lords, as the economy becomes more digitised, the politics of data become centrally important. As the Minister himself said, data is the fuel of the digital economy, and public policy now needs an agile framework around which to balance the forces at play. We need to power the economy and innovation with data while protecting the rights of the individual and of wider society from exploitation by those who hold our data. The recent theft of the personal details of 143 million Americans in the hack of Equifax or the unfolding story of abuse of social media in the US elections by Russian agents make the obvious case for data protection.

This Bill attempts to help us tackle some big moral and ethical dilemmas, and we as parliamentarians have a real struggle to be sufficiently informed in a rapidly changing and innovative environment. I welcome the certainty that the Bill gives us in implementing the GDPR in this country in a form that anticipates Brexit and the need to continue to comply with EU data law regardless of membership of the EU in the future.

However, we need e-privacy alongside the GDPR. For example, access to a website being conditional on accepting tracking cookies should be outlawed; we need stricter rules on wi-fi location tracking; browsers should have privacy high by default; and we need to look at extending the protections around personal data to metadata derived from personal data.

But ultimately I believe that the GDPR is an answer to the past. It is a long-overdue response to past and current data practice, but it is a long way from what the Information Commissioner’s briefing describes as,

“one of the final pieces of much needed data protection reform”.

I am grateful to Nicholas Oliver, the founder of people.io, and to Gi Fernando from Freeformers for helping my thinking on these very difficult issues.

The Bill addresses issues of consent, erasure and portability to help protect us as citizens. I shall start with consent. A tougher consent regime is important but how do we make it informed? Even if 13 is the right age for consent, how do we inform that consent with young people, with parents, with adults generally, with vulnerable people and with small businesses which have to comply with this law? Which education campaigns will cut through in a nation where 11 million of us are already digitally excluded and where digital exclusion does not exclude significant amounts of personal data being held about you? And what is the extent of that consent?

As an early adopter of Facebook 10 years ago, I would have blindly agreed to its terms and conditions that required its users to grant it,

“a non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use any IP content”.

I posted on the site. It effectively required me to give it the right to use my family photos and videos for marketing purposes and to resell them to anybody. Thanks to this Bill, it will be easier for me to ask it to delete that personal data and it will make it easier for me to take it away and put it goodness knows where else with whatever level of security I deem fit, if I can trust it. That is welcome, although I still quite like Facebook, so I will not do it just yet.

But what about the artificial intelligence generated from that data? If, in an outrageous conflagration of issues around fake news and election-fixing by a foreign power to enable a reality TV star with a narcissistic personality disorder to occupy the most powerful executive office in the free world, I take against Facebook, can I withdraw consent for my data to be used to inform artificial intelligences that Facebook can go on to use for profit and for whatever ethical use they see fit? No, I cannot.

What if, say, Google DeepMind got hold of NHS data and its algorithms were used with bias? What if Google gets away with breaking data protection as part of its innovation and maybe starts its own ethics group, marking its own ethics homework? Where is my consent and where do I get a share of the revenue generated by Google selling the intelligence derived in part from my data? And if it sells that AI to a health company which sells a resulting product back to the NHS, how do I ensure that the patients are advantaged because their data was at the source of the product?

No consent regime can anticipate future use or the generation of intelligent products by aggregating my data with that of others. The new reality is that consent in its current form is dead. Users can no longer reasonably comprehend the risk associated with data sharing, and so cannot reasonably be asked to give consent.

The individual as a data controller also becomes central. I have plenty of names, addresses, phone numbers and email addresses, and even the birthdays of my contacts in my phone. Some are even Members of your Lordships’ House. If I then, say, hire a car and connect my phone to the car over Bluetooth so that I can have hands-free driving and music from my phone, I may then end up sharing that personal contact data with the car and thereby all subsequent hirers of the car. Perhaps I should be accountable with the car owner for that breach.

Then, thanks to AI, in the future we will also have to resolve the paradox of consent. If AI determines that you have heart disease by facial recognition or by reading your pulse, it starts to make inference outside the context of consent. The AI knows something about you, but how can you give consent for it to tell you when you do not know what it knows? Here, we will probably need to find an intermediary to represent the interests of the individual, not the state or wider society. If the AI determines that you are in love with someone based on text messages, does the AI have the right to tell you or your partner? What if the AI is linked to your virtual assistant—to Siri or Google Now—and your partner asks Siri whether you are in love with someone else? What is the consent regime around that? Clause 13, which deals with a “significant decision”, may help with that, but machine learning means that some of these technologies are effectively a black box where the creators themselves do not even know the potential outcomes.

The final thing I want to say on consent concerns the sensitive area of children. Schools routinely use commercial apps for things such as recording behaviour, profiling children, cashless payments, reporting and so on. I am an advocate of the uses of these technologies. Many have seamless integration with the school management information systems that thereby expose children’s personal data to third parties based on digital contracts. Schools desperately need advice on GDPR compliance to allow them to comply with this Bill when it becomes law.

Then there is the collection of data by schools to populate the national pupil database held by the Department for Education. This database contains highly sensitive data about more than 8 million children in England and is routinely shared with academic researchers and other government departments. The justification for this data collection is not made clear by the DfE and causes a big workload problem in schools. Incidentally, this is the same data about pupils that was shared with the Home Office for it to pursue deportation investigations. I am talking about data collected by teachers for learning being used for deportation. Where is the consent in that?

I have here a letter from a Lewisham school advising parents of its privacy policy. It advises parents to go to a government website to get more information about how the DfE stores and uses the data, if they are interested. That site then advises that the Government,

“won’t share your information with any other organisations for marketing, market research or commercial purposes”.

That claim does not survive any scrutiny. For example, Tutor Hunt, a commercial tutoring company, was granted access to the postcode, date of birth and unique school reference number of all pupils. This was granted for two years up to the end of March this year to give parents advice on school choice. Similar data releases have been given to journalists and others. It may be argued that this data is still anonymous, but it is laughable to suggest that identity cannot then be re-engineered, or engineered in the first place, from birth date, postal code and school. The Government need to get their own house in order to comply with the Bill.

That leads me to erasure, which normally means removing all data that relates to an individual, such as name, address and so on. The remaining data survives with a unique numeric token as an identifier. Conflicting legislation will continue to require companies to keep data for accounting purposes. If that includes transactions, there will normally be enough data to re-engineer identity from an identity token number. There is a clause in the Bill to punish that re-engineering, which needs debating to legitimise benign attempts to test research and data security, as discussed by the noble Baroness, Lady Manningham-Buller.

The fact that the Bill acknowledges how easy it is to re-identify from anonymous data points to a problem. The examples of malign hacking from overseas are countless. How do we prevent that with UK law? What are the Government’s plans, especially post Brexit, to address this risk? How do we deal with the risk of a benign UK company collecting data with consent—perhaps Tutor Hunt, which I referred to earlier—that is then acquired by an overseas company, which then uses that data free from the constraints of this legislation?

In the context of erasure, let me come to an end by saying that the Bill also allows for the right to be forgotten for children as they become 18. This is positive, as long as the individual can choose what they want to keep for him or herself. Otherwise, it would be like suggesting you burn your photo albums to stop an employer judging you.

Could the Minister tell me how the right to be forgotten works with the blockchain? These decentralised encrypted trust networks are attractive to those who do not trust big databases for privacy reasons. By design, data is stored in a billion different tokens and synced across countless devices. That data is immutable. Blockchain is heavily used in fintech, and London is a centre for fintech. But the erasure of blockchain data is impossible. How does that work in this Bill?

There is more to be said about portability, law enforcement and the intelligence services, but thinking about this Bill makes my head hurt. Let me close on a final thought. The use of data to fuel our economy is critical. The technology and artificial intelligence it generates have a huge power to enhance us as humans and to do good. That is the utopia we must pursue. Doing nothing heralds a dystopian outcome, but the pace of change is too fast for us legislators, and too complex for most of us to fathom. We therefore need to devise a catch-all for automated or intelligent decisioning by future data systems. Ethical and moral clauses could and should, I argue, be forced into terms of use and privacy policies. That is the only feasible way to ensure that the intelligence resulting from the use of one’s data is not subsequently used against us as individuals or society as a whole. This needs urgent consideration by the Minister.

Data Protection Bill [HL]

Lord Knight of Weymouth Excerpts
Lord Knight of Weymouth Portrait Lord Knight of Weymouth (Lab)
- Hansard - -

My Lords, I support the amendments. I remind the House of my interests in relation to my work at TES, the digital education company.

The noble Baroness, Lady Kidron, and the others who have supported the amendment have given the Government a pretty neat way out of the problem that 13 as the age of consent for young people to sign up to “information society services”, as the Bill likes to call them, feels wrong. I have found that for many Members of your Lordships’ House, 16 feels like a safer and more appropriate age, for all the reasons that the noble Lord, Lord Storey, has just given in terms of defining when children are children. There is considerable discomfort about 13 in terms of where the Bill currently sits.

However, I think many noble Lords are realists and understand that to some extent the horse has bolted. Given the huge numbers of young people currently signing up to these services who are under 13, trying to pretend that we can find a way of forcing the age up to 16 from the accepted behavioural norm of 13 looks challenging. Yet we want to protect children. So the question is whether these amendments would provide that solution. That hinges on whether it is reasonable to ask the suppliers of information society services to verify age, and whether it is then reasonable to ask them to design in an age-appropriate fashion. From my experience, the answer to both is yes, it is. Currently, all you do is tick a box to self-verify that you are the age you are. If subsequently you want to have your data deleted, you may have to go through a whole rigmarole to prove that you are who you are and the age you say you are, but for some reason the service providers do not require the same standard of proof and efficacy at the point where you sign up to them. That is out of balance, and it is effectively our role to put it back into balance.

The Government themselves, through the Government Digital Service, have an exceedingly good age-verification service called, strangely, Verify. It does what it says on the tin, and it does it really well. I pay tribute to the GDS for Verify as a service that it allows third parties to use: it is not used solely by Government.

So age verification is undoubtedly available. Next, is it possible—this was explored in previous comments, so I will not go on about it—for age-appropriate design to be delivered? From our work at TES, I am familiar with how you personalise newsfeeds based on data, understanding and profiling of users. It is worth saying, incidentally, that those information society services providers will be able to work out what age their users are from the data that they start to share: they will be able to infer age extremely accurately. So there is no excuse of not knowing how old their users are. Any of us who use any social media services will know that the feeds we get are personalised, because they know who we are and they know enough about us. It is equally possible, alongside the content that is fed, to shift some aspects of design. It would be possible to filter content according to what is appropriate, or to give a slightly different homepage, landing page and subsequent pages, according to age appropriateness.

I put it to the Minister, who I know listens carefully, that this is an elegant solution to his problem, and I hope that he reflects, talks to his colleague the right honourable Matthew Hancock, who is also a reasonable Minister, and comes back with something very similar to the amendments on Report, assuming that they are not pressed at this stage.

Baroness Hollins Portrait Baroness Hollins (CB)
- Hansard - - - Excerpts

My noble friend made a very strong case. The internet was designed for adults, but I think I am right in saying that 25% of time spent online is spent by children. A child is a child, whether online or offline, and we cannot treat a 13 year-old as an adult. It is quite straightforward: the internet needs to be designed for safety. That means it must be age appropriate, and the technology companies need to do something about it. I support the amendments very strongly.

--- Later in debate ---
Lord Knight of Weymouth Portrait Lord Knight of Weymouth
- Hansard - -

I apologise to the Minister for interrupting. I am just interested in that confusion that he talks about. Perhaps I am incorrect, but I understand that images, for example, are data. There is a lot of concern about sexting and about platforms such as Snapchat and the sharing of data. Where is the confusion? Is it in the Government, or in the Chamber?

Lord Ashton of Hyde Portrait Lord Ashton of Hyde
- Hansard - - - Excerpts

I do not think I mentioned confusion. What we are talking about in the Bill is purely data protection. We are talking about the age at which children can consent to information society services handling their data. What I think the noble Baroness, and a lot of Peers in the House, are talking about is keeping children safe online, which is more than just protection of their personal data.

--- Later in debate ---
Lord Ashton of Hyde Portrait Lord Ashton of Hyde
- Hansard - - - Excerpts

I am happy to confirm those two points. On extraterritoriality, I agree with the noble Baroness that it is difficult to control. Commercial sites are easier—an example of which is gambling. We can control the payments, so if they are commercial and cannot pay people, they may well lose their attractiveness. Of course, the only way to solve this is through international agreement, and the Government are working on that. Part of my point is that, if you drive children away to sites located abroad, there is a risk in that. The big, well-known sites are by and large responsible. They may not do what we want, but they will work with the Government. That is the thrust of our argument. We are working with the well-known companies and, by and large, they act responsibly, even if they do not do exactly what we want. As I say, however, we are working on that. The noble Baroness is right to say that, if we drive children on to less responsible sites based in jurisdictions with less sensible and acceptable regimes, that is a problem.

Lord Knight of Weymouth Portrait Lord Knight of Weymouth
- Hansard - -

Could the Minister help me with any information he might have about when the GDPR was drawn up? It must have been envisaged when Article 8 was put together that some member states would go with something different—be it 13, 16, or whatever. The issue of foreign powers must have been thought about, as well as verifying age, parental consent, or the verification of parental identity to verify age. Article 8 just talks about having to have parental sign-off. These issues of verification and going off to foreign powers must have been thought about when the article was being put together in Europe. Does he have any advice on what they thought would be done about this problem?

Lord Ashton of Hyde Portrait Lord Ashton of Hyde
- Hansard - - - Excerpts

I cannot give the noble Lord chapter and verse on what the European bureaucrats were thinking when they produced the article, but age verification is not really the issue on this one, because it is extremely difficult to verify ages below 18 anyway. Although one can get a driving licence at 17, it is at the age of 18 when you can have a credit card. As I say, the issue here is not age verification—rather, it is about how, when we make things too onerous, that has the potential to drive people away on to other sites which take their responsibilities less seriously. That was the point I was trying to make.

--- Later in debate ---
We are therefore at an important time. By agreeing this amendment, we can ensure that PSHE will be the vehicle by which these issues can be taught to all children in all schools. I hope that when we come to Report the Minister will be able to report that that will be the case. I beg to move.
Lord Knight of Weymouth Portrait Lord Knight of Weymouth
- Hansard - -

My Lords, does the Minister agree with the noble Lord, Lord Storey, that PSHE would be the most appropriate way to educate young people about data rights? If so, I note that the Secretary of State, Justine Greening, has today announced that Ian Bauckham will lead the review on how relationship and sex education for the 21st century will be delivered. Can the Minister, who is clearly prepared to think about this appointment today, ask whether it is within his scope to think about how data rights education may be delivered as part of that review, and whether the review will draw on the work of the previous person who reviewed the delivery of PSHE, Sir Alasdair Macdonald, the last time Parliament thought that compulsory SRE was a good idea?

Baroness Kidron Portrait Baroness Kidron
- Hansard - - - Excerpts

I support the amendment. I was on the House of Lords Communications Committee, to which the noble Lord just referred. We recommended that digital literacy be given the same status as reading, writing and arithmetic. We set out an argument for a single cross-curricular framework of digital competencies—evidence-based, taught by trained teachers—in all schools whatever their legal status.

At Second Reading, several noble Lords referred to data as the new oil. I have been thinking about it since: I am not so certain. Oil may one day run out; data is infinite. What I think we can agree is that understanding how data is gathered, used and stored, and, most particularly, how it can be harnessed to manipulate both your behaviour and your digital identity, is a core competency for a 21st-century child. While I agree with the noble Lord that the best outcome would be a single, overarching literacy strategy, this amendment would go some small way towards that.