Queen’s Speech Debate

Department: Scotland Office
Wednesday 28th June 2017

Lords Chamber
Baroness O'Neill of Bengarve (CB)

My Lords, I want to turn to another set of knotty issues that will arise in the process of Brexit, in the area of privacy and data protection.

For 20 years and more, we have tried in the EU to deal with issues of privacy by taking a data protection approach: that is, to protect privacy by putting obligations on the data controllers of larger institutions in order to regulate the use of data by which persons can be identified. This approach has been taken in many other jurisdictions, although it is not the only approach to privacy protection and probably not the most intuitive.

At present, the UK relies on the EU directive of 1995, as implemented in the Data Protection Act 1998, but the landscape changed recently when the General Data Protection Regulation was agreed by the EU in April last year. The new regulation comes into force in May 2018: that is, before Brexit negotiations can be completed, even on the most optimistic scenarios.

The Government have stated that the UK’s decision will not affect the commencement of the general regulation, but that is not the end of the matter. They have also stated that they intend to bring forward new legislation on data protection that will, among other things, secure that rather beautiful right to be forgotten, at least for youthful indiscretions.

It is of great importance for business, for public bodies and indeed for citizens to know whether the implementation of the general regulation next May is to be followed by yet another change in the legislative framework. Data governance is complex and has to be built into institutional practice in quite detailed ways. It cannot be changed overnight; it is very easy to get things wrong.

There are reasons to think that data protection works less well as a system for protecting privacy than it may have done when the original directive was devised and implemented. As I see it, technological developments have transformed the ways in which and the scale on which data can be organised and interrogated. Twenty years ago, it was perhaps reasonable to assume that the main threat to privacy was the inadvertent or deliberate disclosure of controlled data—the sorts of cases in which some employee inadvertently sends data to the wrong person, or somebody deliberately sends data that were held as private or confidential to a newspaper—or of course to a rival firm or perhaps to a hostile Government. But that was then, and now is different.

Breaches of privacy typically arise now not by disclosure but by inference. This is not new. When we first read detective stories, one thing we enjoy is the way in which the detective infers whodunit by linking different clues and drawing inferences. Today, in the era of big data, inference is hugely powerful. It is possible to infer information about individuals using varied data sources, including datasets that are outwith the control of any data controller—for example, that are in the public domain—and datasets that contain no identifying information. Data protection, however, tries to work entirely by setting requirements on data controllers. But this approach may fail if the data used to breach others’ privacy are not controlled by any data controller.
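To make the point about inference concrete, the following is a minimal sketch in Python of the kind of linkage attack described above: a dataset released without names is joined to a publicly available register on shared quasi-identifiers, and an individual is re-identified. Every record, field and name here is hypothetical and purely illustrative.

```python
# Minimal, purely illustrative sketch of a linkage ("inference") attack.
# All records are hypothetical toy data; no real dataset is implied.

# A "de-identified" dataset released without names: it holds only
# quasi-identifiers (postcode, year of birth) plus a sensitive attribute.
deidentified_health_records = [
    {"postcode": "EH1 1AA", "birth_year": 1953, "diagnosis": "asthma"},
    {"postcode": "G12 8QQ", "birth_year": 1977, "diagnosis": "diabetes"},
]

# A separate, publicly available dataset outside any data controller's
# remit (an open register, say), which does carry names.
public_register = [
    {"name": "A. Example", "postcode": "EH1 1AA", "birth_year": 1953},
    {"name": "B. Sample", "postcode": "SW1A 0AA", "birth_year": 1990},
]

# Linking the two on the shared quasi-identifiers re-identifies a person,
# even though neither dataset was improperly disclosed on its own.
def link(records, register):
    for record in records:
        for entry in register:
            if (record["postcode"] == entry["postcode"]
                    and record["birth_year"] == entry["birth_year"]):
                yield entry["name"], record["diagnosis"]

for name, diagnosis in link(deidentified_health_records, public_register):
    print(f"Inferred: {name} has a record of {diagnosis}")
```

The point of the sketch is only that neither dataset, taken alone, need contain anything a data controller would treat as identifying personal data; the breach of privacy arises from the combination.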

The general regulation is an improvement on the old directive. It takes inference to identification by drawing on additional information by,

“means reasonably likely to be used”,

as the criterion. That may signal some added realism, but I am unsure whether it sets a feasible standard for daily institutional life. The regulation also prohibits the further use of personal data except for compatible purposes. Again, is that feasible in daily institutional life?

I suggest that a difficulty is that the regulation was devised, once again, with an eye only to data that are controlled, but in the real world people draw on information from sources that are not regulated by any data controller. For example, they may draw on data from social media, or from sources that contain no personal data at all, and yet reach conclusions that violate privacy. Although there are many groups working on these issues, I do not see a solution ready to hand.

If there is to be legislation—and I take it that there is, since the Government have committed themselves to it—can we be sure that they will take a realistic view of the means that can now be used to protect and to breach privacy, means that need to be manageable for institutions, and a wide view of the diversity of ways in which privacy may be breached? Can the Minister undertake that new legislation in this complex area will be subject to exacting scrutiny? Would she be willing to ensure pre-legislative scrutiny of something that is both vital and very complex?