2nd reading (Hansard - continued): House of Lords
Tuesday 10th October 2017

Lords Chamber
Data Protection Act 2018
Second Reading (Continued)
18:52
Lord Storey (LD)

My Lords, I am particularly interested in how the Bill enhances the lives of young people and how in Committee we could add to the opportunities that the Bill provides. The word “protection” is immensely important in this digital age, and young people probably need more protection than at any other time in our recent history. They should have control over their own data.

Like your Lordships, I have been sent a large number of briefings on the Data Protection Bill. I was particularly taken with the joint briefing from the Children’s Society and YoungMinds. As we have heard from the noble Baroness, Lady Lane-Fox, they found that almost three in four children and young people have a social media account before the age of 13. The same survey also revealed that four in 10 young people had experienced online bullying. For young people affected by this form of bullying, the right to have content removed will be very welcome. I have seen first-hand how young people’s lives can be seriously harmed, and I welcome having a longer debate on this issue in Committee.

I was very taken with the noble Baroness’s comments, although they did not quite match my personal experience. As a head teacher of a large 600-place primary school, I would find children who had been seriously bullied and were in meltdown. When we saw the children and talked to their parents, it turned out that the bullying came from social media. This raises the question: how did children as young as eight years old get signed up to Facebook? By their brothers and sisters. Why did their parents not know about this? This is a very serious problem. I do not know if it is about the long arm of the police, which the noble Baroness, Lady Lane-Fox, suggested was not the way, whether it is about young children knowing their rights, or, as I suspect, whether it is a bit of both, including parental education as well.

In the 1960s a baby named Graham Gaskin was put into care by Liverpool local authority after his mother, a local beauty queen, committed suicide by jumping into the River Mersey. Graham was passed from one institution to another; he was sent to over 20 institutions, including 14 different foster homes, over an 18-year period. He claimed that he suffered neglect, mismanagement and sexual abuse. He tried to understand what had happened to him, the family circumstances and the family connections—his back story, if you like. He was prevented from seeing his social services file but managed somehow to purloin it. In those confidential papers he found out about the secrets of his shocking life in care.

Three remarkable people stand out in the Graham Gaskin story: the local solicitor, Mr Rex Makin, who represented Graham and fought to get justice for him; a local journalist, Mr Ian Craig, who spent months checking and cross-checking the details and wrote a series of devastating articles about what had happened to Graham in the Liverpool Echo; and the chair of the social services committee, Mr Paul Clark, who struggled against the legal system to allow his officers to open up the file and had a fiat, which I am told is a type of injunction, issued against him, preventing him releasing those files. In November 1981, the noble Lord, Lord Alton, then my honourable friend and MP for the Edge Hill constituency in Liverpool, spoke in the Commons about the Graham Gaskin case. He said:

“Graham Gaskin is just another name still locked away in a filing cabinet … I hope that encouragement will be given to local authorities to humanise their services so that the tragedy of Graham Gaskin’s lost youth will never happen again”.—[Official Report, Commons, 6/11/81; col. 284.]


Had the files of Graham Gaskin and thousands of other children been allowed to be opened, they would have revealed a scandal as shocking as the revelations that have come to light about some of our residential homes, and might have prevented the abuse of children that was so prevalent at the time.

We have come a long way since those days, and of course the law allows access to files under the Data Protection Act 1998. Since the noble Lord, Lord Alton, made his comments about humanising social services, we have done that very thing. However, opening the files and making them accessible to young people is very different from the sort of legal problems that, for example, solicitors often face. It is of fundamental importance that everyone has the right to their personal data and that the legislation does not restrict or inhibit that right, but I shall talk about it from a practitioner’s point of view. This issue is rather beyond my comprehension, but I have spent some time talking to solicitors about it, so the language that I use is not language I immediately understand myself, but it gives some flavour of how we should have not only the spirit of making these files available but the practicalities as well.

If someone makes a request for data a year after making a previous request, and in the intervening period there has been further activity about the requester by the data controller, it will be argued that the substance of previous requests is being repeated. Is not the substance of any request to obtain the relevant data then held by the data controller? It will be argued that if someone has made a previous request, they will not be able to make a subsequent one. I think I understand that and I hope noble Lords do too.

Terminology needs to be clearly defined, not left open to later judicial interpretation. For example, if a right is to be denied on the basis that complying with it would involve disproportionate effort, there needs to be a definition of “proportionate”. More effort is needed to supply data to someone who has had a lot of dealings with a data controller, especially where government departments and numerous agencies are regarded as one data controller. We need to ensure that each separate agency has its own data controller. Will it be argued in the courts that it is manifestly unfounded or excessive for someone with a lot of personal data held about them to request it? The current law requires all data controllers, with some minor exceptions, to register with the ICO. If they do not, they are acting unlawfully by processing personal data, and the provisions of the criminal law apply.

When the Bill which became the Data Protection Act 1998 was introduced to Parliament, the drafting instructions to parliamentary counsel were as follows: “We regard it as essential that there be a clear sanction for failure to make a mandatory notification. The obligation to notify is itself a cornerstone of the notification regime, and we wish to place a distinct onus on controllers to take responsibility for ascertaining and discharging their obligations in this respect”. Huge numbers have not done so, with a massive loss to the public purse. The law will not be strengthened by removing the cornerstone of the current law.

The Bill is long and detailed, and the devil, as always, is in the detail. The detail needs most careful scrutiny to ensure that the fundamental rights of the citizen are paramount, not those of officialdom. In any balance concerning the rights of the individual, there should be a presumption that those acting in any official capacity should have the official records disclosed. The balancing exercise introduced in the 1998 Act following the Graham Gaskin case, effectively replicated in the Bill, has not worked in practice, and Parliament can and should give further guidance. I look forward to finding out how we may improve some of these detailed issues for people who find themselves in the same situation as the Graham Gaskins of the 1980s.

19:01
The Earl of Listowel (CB)

My Lords, it is a great privilege to follow the noble Lord, with all his experience of providing care and support for children and families. It was very troubling to hear of the case of Graham Gaskin.

I hesitated to speak, but I do so because I am very interested in child development and issues of the age of consent within the development of children. For instance, I very much oppose the lowering of the age of franchise to 16, which many have argued for, because my understanding and experience is that adolescence is hugely challenging and we should not put additional burdens on young people. Reading a survey in which 81% of adults thought that the age of consent for sharing personal information should be 16 or 18, with the majority of parents thinking that the age should be 18, I was very concerned and wanted to take part in this debate and learn more.

I was recalling our history with access to the internet and pornography. My recollection was that we did not think about those things from the perspective of children and young people. Thanks to the noble efforts of my noble friend Lady Howe, we are now getting on top of that issue, but a report yesterday pointed to a marked rise in sexual assaults by children on children in the past year. Of course it is speculative to say so, but I would not be at all surprised if access by children to the internet has helped to fuel that rise.

We really need to give these issues deep and considered thought and, looking at the briefings, my sense is that such thought has not been given to the age of consent. It seems to be the default position because that is what Facebook and the other big companies offer. Even the European Union directive did not seem to involve a deep consultation among parents, children—ensuring that children’s voices could be heard—and experts to determine that the age should be between 13 and 16. I join in my noble friend Lady Howe’s request for an urgent consultation by the Government with parents, children—in an effective way—and experts on this issue.

I will try to think through what might be the implications. Please forgive my naiveté, but this might be an opportunity for people to market products to 13 year-olds. My experience and research suggest that where children come from family backgrounds of breakdown or depression, that is reflected in the child’s relationships with other children in school. They can find it difficult to relate to others and become isolated. What do they turn to in those circumstances? The research points to the fact that they will be the children with the most expensive articles of clothing. The most expensive trainers will belong to the children who find it most difficult to make relationships with other children. I suppose that we see the same thing in the adult world: often those who are least sociable spend more money on articles of clothing to compensate for that. One concern might be that marketeers will be particularly effective at reaching out to more vulnerable children and encouraging them to pester their parents to buy more products. There will be more pressure on households to go into debt. In our debate on another Bill at the moment, we are seeing that far too many households are experiencing debt. Perhaps that is not a likely eventuality, but it needs to be explored.

Another eventuality might be political lobbying groups seeking to develop a youth wing to reach out to 13, 14 and 15 year-olds and disseminate information to prepare them to join the party later on. All around the world we see hateful political groups gaining ascendance. That is another risk that we need to take into account: how vulnerable are our young people to such groups?

I should be most grateful if the Minister would make clear what counts as a child in the Bill. Will he ensure that the Bill is clear that anyone under the age of 18 is a child? On the age of consent, what about children with developmental delays or special educational needs? Obviously, chronological age may not be appropriate, so how does one deal with those children? Finally, on verification, how do we know that a child who says he is 13 is really 13 and not several years younger?

I share the concern voiced by many Peers about the age of consent. I was to some extent reassured by my noble friend Lady Lane-Fox but, given the history and concern about access to pornography and the lack of consideration for the impact on children and young people, it is our duty to give the Bill the thorough consideration that it needs. I look forward to the Minister’s response.

19:07
Lord Black of Brentwood (Con)

My Lords, it is a privilege to follow the noble Earl, who has brought so much wisdom and passion to the issue of child protection, which is rapidly becoming the leitmotiv of this debate—and rightly so. My comments will be about something slightly different: the impact of the Bill on journalism and the right to freedom of expression. I declare my interest accordingly as executive director of the Telegraph Media Group and draw attention to my other media interests in the register.

I first had the dubious pleasure of becoming involved in the issue of data protection more than 20 years ago, when the EU data protection directive was introduced in 1996. During the passage of the Data Protection Bill which implemented it, my noble friend Lord Wakeham, then chairman of the Press Complaints Commission, set out in his customary cogent fashion why that directive was potentially so grave for press and media freedom. He identified two key issues with the directive, and it is worth repeating what he had to say, because those issues are, if anything, more relevant today than they were then:

“The first is that the directive’s definition of ‘personal data’ is extremely wide, covering virtually any information relating to an individual, including details of political opinion, trade union membership, racial or ethnic origin and philosophical beliefs. The second is that the definition of processing specifically includes, for the first time, the use of material for journalistic purposes; and in turn journalism, of course, relies on the use of all the information covered by the directive. The very real danger in the combination”,


of the two is that it,

“could be used to introduce a regime that would gravely damage the freedom of the press, undermine investigative journalism”.—[Official Report, 2/2/1998; col. 462.]

What became the Data Protection Act 1998 avoided such a dismal fate, and indeed through Section 32 struck an appropriate, clever and enduring balance at the time between the right to privacy and the right to freedom of expression. That was in so many ways down to the guiding hand of Lord Williams of Mostyn, who is still much missed in this House. He went out of his way to consult the industry and respond to its concerns. I remember with affection many meetings with him, not least as he was able to make the issue of data protection amusing, which is no small feat. To pick up on the comments of the noble Baroness, Lady Lane-Fox, I do not know whether he would have been able to make it comprehensible—that may have been a challenge too far. But at the end of the day, he succeeded in ensuring that the legislation balanced the right to privacy with the right to free expression, which he treasured so much. We have heard a bit about that in today’s debate.

This Government have been just as determined as Gareth Williams was to ensure that freedom of expression is protected and have consulted widely with all the interested parties. I am particularly grateful to the DCMS Ministers Karen Bradley and Matt Hancock for their understanding and patience in this area of protection not just for journalism but for literary, artistic and academic activities. Great credit is due to all those who were involved in the long and often deeply tortuous negotiations over the general data protection regulation, who ensured that it makes absolutely clear that member states must provide for exemptions and derogations for processing carried out not only for journalistic purposes but also for the purposes of academic, artistic and literary expression. Recital 153 of the GDPR is particularly welcome and important as it explicitly recognises how protections for freedom of expression,

“should apply in particular to the processing of personal data in the audiovisual field and in news archives and press libraries”,

and ought to be reflected in the Bill.

The Government have gone to considerable lengths to consult widely on the UK’s implementation of the exemptions and derogations in the directive and have clearly stated, as I am sure the Minister will reiterate again today, that:

“Processing of personal data by journalists for freedom of expression and to expose wrongdoing is to be safeguarded”.


That is what Part 5 of Schedule 2, relating to the exemptions for freedom of expression and information, alongside other clauses in the Bill, seeks to do.

Such protections are vital for us as citizens, who depend on a free press to hold those in positions of power to account. As importantly, particularly in a post-Brexit world—and we have heard a lot about that world today—proper implementation of the exemptions is essential to the continuation of the UK’s shining role as a world leader in the creative, cultural and communications sphere. For all those reasons, it is imperative that the existing protections in the 1998 Act are not just maintained in this legislation but enhanced, and applied consistently throughout the Bill.

I specifically use the word “enhanced” because, through no fault of the existing legislation, which was extremely well crafted, the defences inherent in Section 32 of the 1998 Act have begun to erode. That is mainly an unintended consequence of the Defamation Act 2013, with the passage of which many noble Lords here today were involved. That legislation, so carefully scrutinised in this House, has done much to stop trivial and vexatious libel claims in the courts, but regrettably some people, who are now no longer able to bring libel proceedings, have begun to stretch the boundaries of other laws to do so. Data protection is fast becoming an alternative remedy for those who wish to blunt investigative journalism or seek to launder a justly bad reputation by removing articles from the online record. That is something that we have heard a bit about today.

One issue that we should consider is whether the carefully sculpted defences set out in the Defamation Act 2013 could somehow be replicated in this legislation and applied to data protection claims. It also cannot be right for the Information Commissioner to have the power, set out in Clause 165, to fund legal claims against those pursuing literary, artistic, academic and journalistic activities; that power runs counter to the aims of the Defamation Act. No other sector of activity is singled out in that way, and there is no case for it.

Inevitably as the Bill is scrutinised, much of the devil will be in the detail, as the noble Lord, Lord Storey, said. A number of specific issues—many of them, I suspect, inadvertent or unintended—ought to be addressed if the Bill is not to have a restrictive and damaging impact on freedom of expression, and particularly on the media’s operations, all the way from the initial investigation of a story to the eventual archiving of material. For example, we need to ensure that the investigation and enforcement powers of the Information Commissioner, particularly in the area of pre-publication activities, are not extended, and that the existing checks and balances, which have worked extraordinarily well in the current regime, are rigorously maintained in this legislation, not reduced. If not, there is a danger that the commissioner could become some form of statutory press regulator, which is not what I believe the Government intend, and which most of us would believe to be abhorrent in a free society. Similarly, there needs to be explicit protection for academic, literary and media archives, including a transparent and effective regime for the assessment of “right to be forgotten” requests relating to internet search records. Those records are not just the “first draft of history”; they often now comprise the only record of significant events, which will be essential to historians and others in future, and they must be protected.

We also need to remember that, far more so even than was the case back in 1996, the media today, as with all artistic activities, are completely global. All those processing data for special purposes need to be able to receive and share certain personal data across the world. That is particularly true in relation to the protection of sources, and contact or email exchanges with them. We should never forget in this House that in some parts of the world, even partial release of sensitive information can have the most appalling repercussions, putting the lives of sources and reporters in grave, often mortal, danger. The protections and exemptions in this area need to be put in place and be absolutely watertight. Quite apart from the personal risks involved, investigative journalism such as that on the Panama papers could become quite impossible if we did not get this balance right.

I am conscious that I have been talking specifically about Article 10 rights on freedom of expression, but I absolutely understand that those have carefully to be balanced with other rights. My noble friend the Minister in his opening remarks made that point extremely well. It is important to underline that none of the points that I have raised here would in any way undermine an individual’s right to privacy, safeguarded by Article 8 of the convention. These limited changes would continue fully to protect that right, while providing much greater clarity and certainty for those processing data for the special purposes. Therein lies the effective balance which characterised the 1998 Act and which should, I believe, be the guiding principle and hallmark of what will inevitably become the Data Protection Act 2018.

I spoke earlier about the Government’s commitment to consultation on the detail of this Bill, and the constructive and open way in which they have worked with all those impacted in this area. I very much hope that the Minister will continue to undertake such work with all those who have an interest in this vital issue and that we can, during the passage of the Bill, make further amendments to protect what at the end of the day is the foundation stone of our democracy.

19:18
Baroness Hamwee (LD)

My Lords, I, too, thank the Minister for his careful introduction of the Bill, and the organisations and individuals who have briefed us, including the individual who wrote, “It does your head in”. I was glad to hear the assurance that the Bill may—I hope I have this right—with repeated readings come close to comprehension.

At later stages, I hope to focus on Parts 3 and 4 of the Bill, but this evening I make some points about young people and the age of consent. I have to say—I may be out of step with other noble Lords—that I am not entirely convinced that the age of 16 would provide more effective protection than 13. I was struck by the recent launch of a report by the Children’s Commissioner for England. The report contains a jargon-busting guide,

“to give kids more power in digital world”.

The commissioner’s launch paper remarked:

“For children, there is no difference between online and offline life. To them, it’s just life … You wouldn’t drop a 12-year-old in the middle of a big city and expect them to fend for themselves. The same should be true online”.


The jargon-busting guide is intended to help children and teachers negotiate and understand what they are signing up to when they use Facebook, Instagram, YouTube, Snapchat, WhatsApp and so on. It uses simplified terms and conditions—it is acknowledged that it is not a legal document but is designed to be an accessible and child-friendly tool to help children understand their digital rights and make informed choices.

Noble Lords will have received a briefing from the Carnegie UK Trust on digital skills. Among other things, it reminds us that so many young people—I think actually that should be “so many people”—are unaware that “delete” does not actually mean “delete”.

I do not think that setting the age at 14, 15 or 16 would address this. The route of information and education is much more important than a diktat in legislation. I suspect that we could be in danger of being unrealistic about what life is like for children and young people these days. We should not ignore public opinion but, quite honestly, times have changed. We will debate both the age threshold and age verification, which is clearly inseparable from this, during the course of the Bill.

Like other noble Lords, I am concerned about public trust and confidence in the system. At the moment there is a need for guidance on preparation for the new regime. I visited a charity last week and asked about the availability and accessibility of advice. The immediate, almost knee-jerk response was, “It’s pretty dire”—followed by comments that most of what is available is about fundraising and that there is a particular lack of advice on how to deal with data relating to children. The comment was made, too, that the legislation is tougher on charities than on the private sector. I have not pinned down whether that is the case, but I do not disbelieve it. The Federation of Small Businesses has made similar points about support for small businesses.

On confidence and trust, my view is that the use of algorithms undermines confidence. This is not an algorithm but perhaps an analogy: we have been made aware recently—“reminded” would be a better term—of the requirement on banks to check the immigration status of account holders. I took part recently in a panel discussion on immigration. The participants’ names were Gambaccini, Siddiq, Qureshi and Hamwee. With those names, although we are all British citizens, I should think that we are pretty suspect. Algorithms will be used by the policing and intelligence communities, among others. My specific question is: have the Government considered independent oversight of this?

My confidence in the system is also not helped by the fact that the data protection principles applied to law enforcement do not include transparency. I am prepared to be told that this is because of the detail of the GDPR, but I find it difficult to understand why there is not transparency subject to some qualifications, given that transparency is within the principles applying in the case of the intelligence services.

“User notification” is another way of talking about transparency and is a significant human rights issue in the context of the right not only to privacy but to effective remedy and a fair trial. I am sure that we will question some of the exemptions and seek more specificity during the course of the Bill.

We are of course accustomed to greater restrictions—or “protections”, depending on your point of view—where national security is concerned, but that does not mean that no information can be released, even if it is broad brush. I wonder whether there is a role for the Intelligence and Security Committee here—not that I would suggest that that would be a complete answer. Again, this is something we might want to explore.

Part of our job is to ensure that the Bill is as clear as possible. I was interested that the report of the committee of the noble Lord, Lord Jay, referred to “white space” and language. It quoted the Information Commissioner, who noted trigger terms such as “high-risk”, “large scale” and “systematic”. Her evidence was that until the new European Data Protection Board and the courts start interpreting the terms,

“it is not clear what the GDPR will look like in practice”.

I found that some of the language of the Bill raised questions in my mind. For instance—I am not asking for a response now; we can do this by way of an amendment later—the term “legitimate” is used in a couple of clauses. Is that wider than “legal”? What is the difference between “necessary” and “strictly necessary”? I do not think that I have ever come across “strictly necessary” in legislation. There are also judgment calls implicit in many of the provisions, including the “appropriate” level of security and processing that is “unwarranted”. By the by, I am intrigued by the airtime given to exams—and by the use of the term “exams”. Back in the day there would certainly have been an amendment to change it to “examinations”; I am not going to table that one.

Finally, I return to the committee report, which has not had as much attention as the Bill. That is a shame, but I am sure we will come back to it as source material. I noted the observation that, post Brexit, there is a risk that, in the Information Commissioner’s words, the UK could find itself,

“outside, pressing our faces on the glass … without influence”,

and yet having,

“adopted fulsomely the GDPR”.

That image could be applied more widely.

Do the Government accept the committee’s recommendation in paragraph 166 that they should start to address retaining UK influence by,

“seeking to secure a continuing role for the Information Commissioner’s Office on the European Data Protection Board”?

My noble friend Lord McNally referred to running up the down escalator, and his alternatives to the Henry VIII clauses are well worth considering—I hope that that does not sound patronising.

This is one of those Bills that is like a forest in the points of principle that it raises. Some of us, I am afraid, will look closely at a lot of the twigs in that forest.

19:29
Baroness Manningham-Buller (CB)

My Lords, I will be brief, as the late Lord Walton always said at the start of his speeches. However, I actually mean it. That is because many of the points I want to make have been made by the noble Baronesses, Lady Neville-Jones and Lady Ludford, or by my noble friend Lord Patel, who declared my interest as chair of the Wellcome Trust for me. For those noble Lords who are not familiar with the organisation, we spend about £1 billion a year on improving human health, largely through funding medical research, primarily in this country but also in 16 other countries overseas. We welcome the Bill, although we think it needs improvement. Before Committee, we look for answers to the questions laid out by my noble friend Lord Patel on the need for universities to have real clarity about how they process data.

The public interest terminology should be extended so that we can look at issues of safeguards beyond consent and make sure that it is possible to do clinical trials and interventional work. Why is that the case? It is because health data offers the most exciting opportunities to do things which we have only recently been able to do: to understand the causes of disease in detail across populations and to have a much better chance of getting to diagnosis early. We could deal with many things if we could only diagnose them far earlier and develop treatments for them—indeed, prevent some of them ever materialising. Health data also helps us to measure the efficacy of treatment. We all know of plenty of treatments that over the years have proved to be useless, or unexpected ones that have proved to be outstanding. Looking at big-scale data helps us to do that. That data helps in precision medicine, which we are all moving towards, where the drugs we receive are for us, not our neighbour, even though we apparently both have the same illness. Health data can also help with safety, as you can collect the side-effects that people suffer from particular drugs. It helps us evaluate policy and, of course, should help the NHS in planning.

I know that the Government want to support scientists to process data with confidence and safety. The industrial strategy comments that data should be “appropriately accessed by researchers”. “Appropriate” is a hopeless word; we do not know what it means, but still. The document also states that access for researchers to,

“currently available national datasets should be accelerated by streamlining legal and ethical approvals”.

We are not there yet.

I want to say a word about public support. The Wellcome Trust commissioned an Ipsos MORI poll last year before the Caldicott review to assess public support for the collection of data. In many cases, there is significant public support for that provided it is anonymised—although I know there are questions about that—but what people are fussed about is that their data is sold on for commercial purposes, that it is used for marketing or, worst of all, that it is used to affect their insurance policies and life insurance. Therefore, we need to give reassurance on that. However, it has certainly been the case in our experience, and that of many universities, that you can recruit many people for trials and studies if they believe that their data will help others with similar diseases or indeed themselves.

My noble friend Lord Patel trailed that I would mention the UK Biobank, as this will face real problems if this legislation is not amended. For noble Lords who are not aware of it, the UK Biobank is funded partly by the Wellcome Trust and partly by the Government through the Medical Research Council. Between 2006 and 2010, it recruited half a million people who gave body samples, details about their lifestyles, economic environments and genomes. Some of these details have been accessed but not all. This has produced the most fantastic amount of data, which is helping us to discover causes of cancer, heart disease—there is a long list, and I will read them all out as they are all important—stroke, diabetes, arthritis, osteoporosis, eye disorders, depression and dementia. Other subjects will be added. The conclusions of this data are open to anybody in the world because health has no frontier. There is no other biobank like this in the world. The Chinese have started one called the Kadoorie, but it is neither as extensive nor profound; it will become invaluable, but it is not yet. The UK Biobank is a unique resource for the world. It is based in Oxford and funded by a major British charity and the taxpayer. We must make that data useful and do nothing to damage the way in which it contributes to helping save lives.

19:35
Lord Lucas (Con)

My Lords, I have enjoyed the debate very much so far. I hope that the same can be said of my noble friend the Minister, who will clearly find support from all around the House for a large number of amendments. I found myself agreeing with the noble Lord, Lord Stevenson, on several points, not least on the question of adequacy, which seems to me absolutely fundamental to getting this Bill right. I hope that my noble friend will be able to be very clear on how the Government intend to tackle this key aspect.

I agreed with the noble Lord, Lord McNally, too, and his worries about standing up to the tech giants. They are not our friends. They are big, powerful companies that are not citizens of this country. They pay as little tax here as possible and several of them actively help tax evaders in order that they can make more profits out of the transactions that that involves. They control what we see on the internet through algorithms and extract vast quantities of data and know more about us than we know ourselves. In the interests of democracy we really must stand up to them and say, “No, we are the people who matter. It is great you are doing well, but we are the people who matter”. Bills like this are part of that, and it is important that we stand up for ourselves and our citizens.

I agreed very much with my noble friend Lady Neville-Jones that research is crucial. In my work as editor of the Good Schools Guide we use a fair bit of government data and do research with it. I will pick my noble friend’s brain afterwards on what her worries are about the use of data by non-standard researchers because I certainly qualify as that.

My noble friend Lord Arbuthnot referred to a Keeling schedule. It would be wonderful to receive it. For some reason I cannot pick it up by email. It is not in the documents listed on the Parliament website or in any other location, and it does not come up on Google or on GOV.UK. One way or another, I think the simplest thing to ask is: please can we put it on the parliamentary website in the list of documents related to the Bill? I know that it exists, but I just cannot find it. It would be nice if it appeared on the departmental website too.

It seems to me that bits are missing in a number of areas. Where are Articles 3, 27, 22(2)(b) and 35(4) to 35(6)? Where is Article 80(2), as the noble Baroness, Lady Lane-Fox, mentioned? That is an absolutely crucial article. Why has it gone missing? How exactly is recital 71 implemented? I cannot see how the protections for children in that recital are picked up in the Bill. There are a lot of things that Keeling schedules are important for. In a detailed Bill like this, they help us to understand how the underlying European legislation will be reflected, which will be crucial for the acceptance of this Bill by the European Union—I pick up the point made by the noble Lord, Lord Stevenson—and what bits are missing.

And what has been added? Where does paragraph 8 of Schedule 11 come from? It is a very large, loose power. Where are its edges? What is an example of that? I would be very grateful if my noble friend could drop me a note on that before we reach Committee. What is an arguable point under that provision? Where are the limits of our economic interest so far as its influence on this Bill is concerned?

Paragraph 4 of Schedule 10 is another place that worries me. We all make our personal data public, but a lot of the time we do it in a particular context. If I take a photograph with my parliamentary-supplied iPhone, on which there is an app that I have granted the power to look at my photographs for some purpose that I use that app for, I have made that photograph and all the metadata public. That is not what I intended; I made it public for a particular purpose in a particular context—that of social media. A lot of people use things like dating websites. They do not put information on there which is intended to be totally public. Therefore, the wording of paragraph 4 of Schedule 10 seems to be far too wide in the context of the way people use the internet. Principle 2 of the Data Protection Act covers this. It gives us protection against the use of information for purposes which it clearly has not been released for. There does not appear to be any equivalent in the Bill—although I have not picked up the Keeling schedule, so perhaps it is there. However, I would like to know where it is.

On other little bits and pieces, I would like to see the public policy documents under Clause 33(4) and Clause 33(5) made public; at the moment they are not. How is age verification supposed to work? Does it involve the release of data by parents to prove that the child is the necessary age to permit the child access, and if so, what happens to that data? Paragraph 23 of Schedule 2 addresses exam scripts. Why are these suddenly being made things that you cannot retrieve? What are the Government up to here? Paragraph 4 of Schedule 2, on immigration, takes away rights immigrants have at the moment under the Data Protection Act. Why? What is going on?

There are lots of bits and pieces which I hope we can pick up in Committee. I look forward to going through the Bill with a very fine-toothed comb—it is an important piece of legislation.

19:42
Lord Janvrin (CB)

My Lords, I welcome the opportunity to speak in this Second Reading debate. It is always slightly daunting to follow the noble Lord, Lord Lucas. We were colleagues on the Digital Skills Committee a few years back, and he was pretty daunting on that too, being a great fund of knowledge on this subject. I mention at the outset my interests as set out in the register, including as a trustee of the British Library and as a member of the parliamentary Intelligence and Security Committee in the last Parliament. I too welcome this important piece of legislation. I will be brief and confine myself to some general remarks.

There is no doubt that data, big data, data processing and data innovation are all absolutely essential ingredients in the digital revolution which is changing the world around us. However, as we have discussed in debates in this House, advances in technology inevitably risk outstripping our capacity to think through some of the social, ethical and regulatory challenges posed by these advances. This is probably true of questions of data protection.

The last key legislation, the Data Protection Act 1998, was ground-breaking in its time. But it was designed in a different age, when the internet was in its infancy, smartphones did not exist and the digital universe was microscopic compared to today. As the Government have said, we desperately need a regulatory framework which is comprehensive and fit for purpose for the present digital age.

As has been mentioned by other noble Lords, the Bill is also necessary to ensure that our legislation is compatible with the GDPR, which comes into force next year. It is absolutely clear that however Brexit unfolds, our ability to retain an accepted common regulatory framework for handling data is essential; the ability to move data across borders is central to our trading future. I was much struck by the lucid explanation given by the noble Lord, Lord Jay, of some of the challenges which lie ahead in achieving this goal of a common regulatory framework for the future.

The Bill before us is undoubtedly a major advance on our earlier legislation. It is inevitably complex, and as today’s debate makes absolutely clear, there are areas which this House will wish to scrutinise carefully and in depth, including issues of consent and the new rights such as the right to be forgotten and to know when personal data has been hacked, and so on. The two areas which will be of particular interest to me as a member of the board of the British Library and as a member of the Intelligence and Security Committee in the last Parliament will be, first and foremost, archiving in the public interest, and secondly, Part 4, on data processing by the intelligence services.

In order to support archiving activities, as was made clear in the British Library’s submission during the DCMS consultation earlier this year, it is essential that this legislation provide a strong and robust legal basis to support public and private organisations which are undertaking archiving in the public interest. As I understand it, this new legislation confirms the exemptions currently available in the UK Data Protection Act 1998: safeguarding data processing necessary for archiving purposes in the public interest and archiving for scientific, historical and statistical purposes. This is welcome, but there may perhaps be issues around definitions of who and what is covered by the phrase “archiving in the public interest”. I look forward to further discussion and, hopefully, further reassurances on whether the work of public archiving institutions such as our libraries and museums is adequately safeguarded in the Bill.

On Part 4, data processing by the intelligence services does not fall within scope of the GDPR, and this part of the Bill provides a regime based on the Council of Europe’s modernised—but not yet finally agreed—Convention 108. The intelligence services already comply with data-handling obligations within the regulatory structures found in a range of existing legislation. This includes the Investigatory Powers Act 2016, which, as was debated in this Chamber this time last year, creates a number of new offences if agencies wrongly disclose data using the powers in that Act.

The new Bill seeks to replicate the approach of the Data Protection Act 1998, whereby there have been well-established exemptions to safeguard national security. It is obviously vital that the intelligence services be able to continue to operate effectively at home and with our European and other partners, and I look forward to our further discussion during the passage of the Bill on whether this draft legislation gives the intelligence services the safeguards they require to operate effectively.

In sum, this is a most important piece of legislation. If, as the noble Baroness, Lady Lane-Fox, suggests, we can set the bar high, it will be a most significant step forward. First, it will redefine the crucial balance between, on the one hand, the freedom to grasp the extraordinary opportunities offered by the new data world we are in and, on the other, the need to protect sensitive personal data. Secondly, and very importantly, it will put the United Kingdom at the forefront of wider efforts to regulate sensibly and pragmatically the digital revolution which is changing the way we run our lives.

19:50
Lord Knight of Weymouth (Lab)

My Lords, as the economy becomes more digitised, the politics of data become centrally important. As the Minister himself said, data is the fuel of the digital economy, and public policy now needs an agile framework around which to balance the forces at play. We need to power the economy and innovation with data while protecting the rights of the individual and of wider society from exploitation by those who hold our data. The recent theft of the personal details of 143 million Americans in the hack of Equifax and the unfolding story of the abuse of social media in the US elections by Russian agents make the obvious case for data protection.

This Bill attempts to help us tackle some big moral and ethical dilemmas, and we as parliamentarians have a real struggle to be sufficiently informed in a rapidly changing and innovative environment. I welcome the certainty that the Bill gives us in implementing the GDPR in this country in a form that anticipates Brexit and the need to continue to comply with EU data law regardless of membership of the EU in the future.

However, we need e-privacy alongside the GDPR. For example, making access to a website conditional on accepting tracking cookies should be outlawed; we need stricter rules on wi-fi location tracking; browsers should have privacy settings set high by default; and we need to look at extending the protections around personal data to metadata derived from personal data.

But ultimately I believe that the GDPR is an answer to the past. It is a long-overdue response to past and current data practice, but it is a long way from what the Information Commissioner’s briefing describes as,

“one of the final pieces of much needed data protection reform”.

I am grateful to Nicholas Oliver, the founder of people.io, and to Gi Fernando from Freeformers for helping my thinking on these very difficult issues.

The Bill addresses issues of consent, erasure and portability to help protect us as citizens. I shall start with consent. A tougher consent regime is important, but how do we make it informed? Even if 13 is the right age for consent, how do we inform that consent with young people, with parents, with adults generally, with vulnerable people and with small businesses which have to comply with this law? Which education campaigns will cut through in a nation where 11 million of us are already digitally excluded and where digital exclusion does not prevent significant amounts of personal data being held about you? And what is the extent of that consent?

As an early adopter of Facebook 10 years ago, I would have blindly agreed to its terms and conditions that required its users to grant it,

“a non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use any IP content”.

that I posted on the site. It effectively required me to give it the right to use my family photos and videos for marketing purposes and to resell them to anybody. Thanks to this Bill, it will be easier for me to ask it to delete that personal data and easier for me to take it away and put it goodness knows where else, with whatever level of security I deem fit, if I can trust it. That is welcome, although I still quite like Facebook, so I will not do it just yet.

But what about the artificial intelligence generated from that data? If, in an outrageous conflagration of issues around fake news and election-fixing by a foreign power to enable a reality TV star with a narcissistic personality disorder to occupy the most powerful executive office in the free world, I take against Facebook, can I withdraw consent for my data to be used to inform artificial intelligences that Facebook can go on to use for profit and for whatever ethical use they see fit? No, I cannot.

What if, say, Google DeepMind got hold of NHS data and its algorithms were used with bias? What if Google gets away with breaking data protection as part of its innovation and maybe starts its own ethics group, marking its own ethics homework? Where is my consent and where do I get a share of the revenue generated by Google selling the intelligence derived in part from my data? And if it sells that AI to a health company which sells a resulting product back to the NHS, how do I ensure that the patients are advantaged because their data was at the source of the product?

No consent regime can anticipate future use or the generation of intelligent products by aggregating my data with that of others. The new reality is that consent in its current form is dead. Users can no longer reasonably comprehend the risk associated with data sharing, and so cannot reasonably be asked to give consent.

The individual as a data controller also becomes central. I have plenty of names, addresses, phone numbers and email addresses, and even the birthdays of my contacts in my phone. Some are even Members of your Lordships’ House. If I then, say, hire a car and connect my phone to the car over Bluetooth so that I can have hands-free driving and music from my phone, I may then end up sharing that personal contact data with the car and thereby all subsequent hirers of the car. Perhaps I should be accountable with the car owner for that breach.

Then, thanks to AI, in the future we will also have to resolve the paradox of consent. If AI determines that you have heart disease by facial recognition or by reading your pulse, it starts to make inferences outside the context of consent. The AI knows something about you, but how can you give consent for it to tell you when you do not know what it knows? Here, we will probably need to find an intermediary to represent the interests of the individual, not the state or wider society. If the AI determines that you are in love with someone based on text messages, does the AI have the right to tell you or your partner? What if the AI is linked to your virtual assistant—to Siri or Google Now—and your partner asks Siri whether you are in love with someone else? What is the consent regime around that? Clause 13, which deals with a “significant decision”, may help with that, but machine learning means that some of these technologies are effectively a black box where the creators themselves do not even know the potential outcomes.

The final thing I want to say on consent concerns the sensitive area of children. Schools routinely use commercial apps for things such as recording behaviour, profiling children, cashless payments, reporting and so on. I am an advocate of the uses of these technologies. Many have seamless integration with the school management information systems that thereby expose children’s personal data to third parties based on digital contracts. Schools desperately need advice on GDPR compliance to allow them to comply with this Bill when it becomes law.

Then there is the collection of data by schools to populate the national pupil database held by the Department for Education. This database contains highly sensitive data about more than 8 million children in England and is routinely shared with academic researchers and other government departments. The justification for this data collection is not made clear by the DfE and causes a big workload problem in schools. Incidentally, this is the same data about pupils that was shared with the Home Office for it to pursue deportation investigations. I am talking about data collected by teachers for learning being used for deportation. Where is the consent in that?

I have here a letter from a Lewisham school advising parents of its privacy policy. It advises parents to go to a government website to get more information about how the DfE stores and uses the data, if they are interested. That site then advises that the Government,

“won’t share your information with any other organisations for marketing, market research or commercial purposes”.

That claim does not survive any scrutiny. For example, Tutor Hunt, a commercial tutoring company, was granted access to the postcode, date of birth and unique school reference number of all pupils. This was granted for two years up to the end of March this year to give parents advice on school choice. Similar data releases have been given to journalists and others. It may be argued that this data is still anonymous, but it is laughable to suggest that identity cannot then be re-engineered, or engineered in the first place, from birth date, postal code and school. The Government need to get their own house in order to comply with the Bill.

That leads me to erasure, which normally means removing all data that relates to an individual, such as name, address and so on. The remaining data survives with a unique numeric token as an identifier. Conflicting legislation will continue to require companies to keep data for accounting purposes. If that includes transactions, there will normally be enough data to re-engineer identity from an identity token number. There is a clause in the Bill to punish that re-engineering, which needs debating to legitimise benign attempts to test research and data security, as discussed by the noble Baroness, Lady Manningham-Buller.

The fact that the Bill acknowledges how easy it is to re-identify from anonymous data points to a problem. The examples of malign hacking from overseas are countless. How do we prevent that with UK law? What are the Government’s plans, especially post Brexit, to address this risk? How do we deal with the risk of a benign UK company collecting data with consent—perhaps Tutor Hunt, which I referred to earlier—that is then acquired by an overseas company, which then uses that data free from the constraints of this legislation?

In the context of erasure, let me come to an end by saying that the Bill also allows for the right to be forgotten for children when they turn 18. This is positive, as long as the individual can choose what they want to keep for themselves. Otherwise, it would be like suggesting you burn your photo albums to stop an employer judging you.

Could the Minister tell me how the right to be forgotten works with the blockchain? These decentralised encrypted trust networks are attractive to those who do not trust big databases for privacy reasons. By design, data is stored in a billion different tokens and synced across countless devices. That data is immutable. Blockchain is heavily used in fintech, and London is a centre for fintech. But the erasure of blockchain data is impossible. How does that work in this Bill?

There is more to be said about portability, law enforcement and the intelligence services, but thinking about this Bill makes my head hurt. Let me close on a final thought. The use of data to fuel our economy is critical. The technology and artificial intelligence it generates have a huge power to enhance us as humans and to do good. That is the utopia we must pursue. Doing nothing heralds a dystopian outcome, but the pace of change is too fast for us legislators, and too complex for most of us to fathom. We therefore need to devise a catch-all for automated or intelligent decisioning by future data systems. Ethical and moral clauses could and should, I argue, be forced into terms of use and privacy policies. That is the only feasible way to ensure that the intelligence resulting from the use of one’s data is not subsequently used against us as individuals or society as a whole. This needs urgent consideration by the Minister.

20:03
Baroness Kidron (CB)

My Lords, many noble Lords will know that my particular interests, clearly stated on the register, are concerned with making the digital world fit for children and young people, and so the greater part of my comments concern that. However, I wanted to say at the outset that dealing with this Bill without having had the opportunity to scrutinise the GDPR or understand the ambition and scope of the Government’s digital charter, their internet safety strategy or even some of the details that we still await on the Digital Economy Act made my head hurt also.

I start with the age of consent. Like others, I am concerned that the age of 13 was a decision reached not on the advice of child development experts, child campaigners or parents. Perhaps most importantly of all, the decision lacks the voice of young people. They are key players in this: the early adopters of emerging technologies, the first to spot their problems and, so very often, the last to be consulted or, indeed, not consulted at all. Also, like others, I was bewildered when I saw Clause 187. Are Scottish children especially mature or are their southern counterparts universally less so? More importantly, it seems that we have to comply with the GDPR, except when we do not.

As the right reverend Prelate has outlined, the age of 13 is really an age of convenience. We have simply chosen to align UK standards with COPPA, a piece of US legislation that its own authors once described to me as a “terrible compromise”, and which dates from 2000, when the notion of every child carrying a smartphone with the processing power of “Apollo 11” and consulting it every few minutes, hundreds of times day and night, was not even in our imagination, let alone our reality.

Before considering whether 13 is the right age, we should understand what plans the Government have to require tech companies to make any provisions for those aged 13 to 17, or whether it is the considered opinion of the UK Government that in the digital environment a 13 year-old is a de facto adult. Will the Government require tech companies to publish data risk assessments setting out how children are likely to engage with their service at different ages and the steps they have taken to support them, including transparent reporting data? Are we to have minimum design standards in parts of the digital environment that children frequent, and that includes those places that they are not supposed to be? Will the ICO have powers to enforce against ISS providers which do not take steps to prevent very young children accessing services designed for people twice their age? My understanding is that age compliance will continue to be monitored and enforced by the ISS companies themselves.

As Ofcom pointed out, in 2016 in the UK, 21% of 10 year-olds, 43% of 11 year-olds and half of all 12 year-olds had a social media profile, in spite of COPPA. Are the Government planning to adequately resource and train all front-line workers who work with children, as well as teachers, parents and children themselves, in a programme of digital literacy, as the House of Lords Communications Committee called for, and in doing so inform all concerned—those under 13 and those between the ages of 13 and 18—of the impact on young people of inhabiting what is increasingly a commercial environment? Until these questions are answered positively, the argument for a hard age of consent seems weak.

In contrast, in its current code of practice on processing personal data online, the ICO recommends a nuanced approach, advising would-be data collectors that:

“Assessing understanding, rather than merely determining age, is the key to ensuring that personal data about children is collected and used fairly”.


The current system places the obligation on the data controller to consider the context of the child user, and requires them to frame and direct the request appropriately. It underpins what we know about childhood: that it is a journey from dependence to autonomy, from infancy to maturity. Different ages require different privileges and levels of support.

If being GDPR compliant requires a hard age limit, how do we intend to verify the age of the child in any meaningful way without, perversely, collecting more data from children than we do from adults? Given that the age of consent is to vary from country to country—16 in the Netherlands, Germany and Hungary; 14 in Austria—data controllers will also need to know the location of a child so that the right rules can be applied. Arguably, that creates more risk for children, but it will certainly create more data.

In all of this we must acknowledge a child’s right to access the digital world knowledgeably, creatively and fearlessly. Excluding children is not the answer, but providing a digital environment fit for them to flourish in must be. There is not enough in this Bill to fundamentally realign young people’s relationship with tech companies when it comes to their data.

Like the noble Lord, Lord Knight, I take the view that we have got this all wrong. In the future, we will each be the owner of our own data, with our preferences attached to our individual online identity. Companies and services will sign up to our bespoke terms and conditions, which will encompass our interests and tolerances, rather than the other way round. If that sounds a little far-fetched, I refer noble Lords to the IEEE, where this proposal is laid out in considerable detail. For those who do not know the IEEE, it is the pre-eminent global organisation of the electrical engineering professions.

While this rather better option is not before us today, it must inform our understanding that the Bill is effectively supporting an uncomfortable status quo. Challenging the status quo means putting children first, for example by putting the code of practice promised in the Digital Economy Act on a statutory footing so that it is enforceable; by imposing minimum design standards where the end-user is, or is likely to be, a child; by publishing guidance to the tech companies on privacy settings, tracking, GPS and so forth; by demanding that they meet the rights of young people in the digital environment; and by a much tougher, altogether more appropriate, regime for children’s data.

All that could and should be achieved by May, because it comes down to the small print and the culture of a few very powerful businesses for which our children are no match. The GDPR offers warm words on consumer rights, automated profiling and data minimisation, but with terms and conditions as long as “Hamlet”, it is disingenuous to believe that plain English or any number of tick boxes for informed or specific consent will materially protect young people from the real-life consequences of intrusive data harvesting, especially when we have left the data poachers in charge of the rules of engagement.

We could do better—a lot better. I agree wholeheartedly with other noble Lords who are looking for structures and principles that will serve us into the future. Those principles should not only serve us in terms of other EU member states but be bold enough to give us a voice in Silicon Valley. In the meantime, the Government can and should enact the derogation under article 80(2), and in the case of complainants under the age of 18 it should be not only a right but a requirement. We cannot endorse a system where we create poster children in front-line battles with tech companies. We are told that this Bill is about data protection for individuals—a Bill that favours users over business and children over the bottom line. But the omission of Article 8 of the European Charter of Fundamental Rights is inexcusable. The Bill in front of us is simply not robust enough to replace Article 8. I call on the Government to insert that crucial principle into UK legislation. It must be wrong for our post-Brexit legislation to be deliberately devoid of underlying principles. It is simply not adequate.

I had a laundry list of issues to bring to Committee, but I will set them aside for now. During the debate, a couple of noble Lords asked whether it was possible to regulate the internet. We should acknowledge that the GDPR shows that it can be done, kicking and screaming. It is in itself a victory for a legislative body—the EU. My understanding is that it will set a new benchmark for data-processing standards and will be adopted worldwide to achieve a harmonised global framework. As imperfect as it is, it proves that regulating the digital environment, which is entirely man and woman-made and entirely privately owned, is not an impossibility but a battle of societal need versus corporate will.

As I said at the beginning, my central concern is children. A child is a child until they reach maturity, not until they reach for their smartphone. Until Mark Zuckerberg, Sergey Brin and Larry Page, Tim Cook, Jack Dorsey and the rest, with all their resources and creativity, proactively design a digital environment that encompasses the needs of children and refers to the concept of childhood, I am afraid that it falls to us to insist. The Bill as it stands, even in conjunction with the GDPR, is not insistent enough, which is something I hope we can address together as we follow its passage.

20:14
Lord Marlesford (Con)

My Lords, I very much agreed with those who said that the regulation must certainly apply to the big boys in the computer and digital world. I shuddered when the noble Baroness, Lady Lane-Fox, quoted that wholly incomprehensible Brussels jargon from the regulations.

Last week, as chair of Marlesford Parish Council, I received a letter from the National Association of Local Councils. We have seven members, only 230 people live in Marlesford, and our precept is only £1,000 a year. The letter warned me that the GDPR will impose,

“a legal obligation to appoint a Data Protection Officer … this appointment may not be as straightforward as you may be assuming, as while it may be possible to appoint an existing member of staff”—

we have no staff, just a part-time parish clerk who is basically a volunteer. It continues:

“They must by requirement of regulations possess ‘expert knowledge of data protection law and practices’”.


I am afraid that will not be found in most small villages in the country, so I hope that one result of this Bill will be to introduce an element of proportionality in how it is to apply; otherwise the noble Baroness, Lady Lane-Fox, who so rightly drew our attention to the threat of incomprehensibility, will be proved right and we will all lose the plot.

The time has come to have a reliable and secure link between the state and its citizens, and the capabilities of the digital world that underlie this Bill give us that opportunity. There are good reasons for that. First, apart from the excellent national census which was founded in 1841, with the latest information having been collected in the 2011 census, Governments have an imperfect knowledge of their customers, paymasters or stakeholders—whatever you would like to call the rest of us. The various links have many defects which result in serious failures in the duties and obligations of the state. The first of those is to ensure that those who need financial help or support get it and do not go short as a result of funds going to those who do not need them or are not entitled to them. In this, the national insurance system has been incredibly difficult to organise properly. Again and again people have tried, and again and again they have failed.

Secondly, the National Health Service, which many of us believe to be a pillar of our British way of life, is chronically short of funds. Large sums are spent on free medical treatment for those who are not entitled to it. For example, under the reciprocal healthcare scheme within the EU, which is based on repayments made by each EU Government, we pay more than 10 times as much to other EU Governments for their treatment of our citizens as we collect for treating theirs. That is a gap of £500 million. In the case of the NHS treatment of non-EU citizens, the failure to collect charges now costs £1 billion a year.

Thirdly, control of our borders is inadequate, largely due to the failure of our passport system, an issue I have raised many times in your Lordships’ House.

Fourthly, there are serious defects in policing, combating digital crime and other aspects of law and order. To give just two examples, there are problems for our security services in protecting us from terrorism, and identity theft is a growing problem. My proposal involves giving every citizen a unique identification number that would be backed by centrally held biometrics to confirm the identity of the citizen. The UIN would supplement and eventually replace the plethora of other state numbers, which include those for national insurance, the registry of births and deaths, national health, HMRC, passports, driving licences, the police national computer, the national firearms register and custodial sentences. Citizens would be required to know their own UIN and to give it to those with a legitimate reason to ask for it. The UIN would be printed on passports, driving licences and so on. To assist those without such documents, it might be helpful to make available a plastic card with the person’s name and UIN. Such a card would not be mandatory and it would have no validity in and of itself. It would not of course be an identity card, any more than a credit card or business card would be. Needless to say, it would have no biometrics of any sort on it.

Access to the biometrics would be carefully restricted to those on a need-to-use basis, and those with such access would have data relevant only to their need to know. The verification process would be based on real-time use of the biometrics. The authority would take the biometrics from an individual when necessary, and such action would be limited to appropriate members of government agencies. They would include the police, immigration officers, security people and so on. The biometrics could then be compared with the central record. Important decisions to be made would include which biometrics should be used, such as facial recognition techniques, fingerprints and so forth. The introduction of the UIN would be gradual, depending on the logistics of collecting the biometrics. Existing numbers would continue to be used for a while. Proper data protection would be key to the viability, security, integrity and public acceptability of the UIN. All I am asking is for Her Majesty’s Government to set up a study of what I propose. I am afraid I am not very confident that they will.

In 1997, I tabled an amendment to the Firearms (Amendment) Bill to set up a national electronic record of all firearms, similar to the excellent one that had long been in use by the DVLA. The amendment was passed and became part of the Act, but for the next 10 years the Home Office used every technique from the “Yes Minister” book to resist implementing it. Thanks to widespread support in this House—including from, if I may say so, the noble Lord, Lord McNally, in his ministerial position—the provision was eventually implemented and it has been in useful operation for the past 10 years.

However, I am worried about whether the Government always move as fast as they should on these computer matters. Sometimes they seem rather out of their depth. I remember, in 1966, as a keen young member of the Conservative Research Department, I was sent to carry the bag and take notes for Ernest Marples, a great political figure, around the world, to America and Japan, to see how we could use new techniques—electronic techniques and all the rest—to run the Government better. When I came back, all bright-eyed and bushy-tailed, I met a very senior official, a charming Under-Secretary from the Ministry of Health. I said to him, “You know, I’ve just been in America and Conrad Hilton has this wonderful system. He tracks everything that happens in his hotels: where the money goes, what the clients do and all the rest of it. Your hospitals are really rather like hotels—couldn’t you start doing the same?” He looked at me and shook his head and said, “Mark, before we spend government money on computers, we have to be sure they are here to stay”.

20:26
The Earl of Lytton (CB)

My Lords, I start by thanking the Minister for the opportunity to meet him and officials earlier today.

I welcome the stated purpose of the Bill. To my mind, it must be sensible to unify and consolidate the law in this area, and to update its application to more recent technologies. Bringing the GDPR into UK law is unquestionably desirable. I have been impressed by the GDPR’s elegance and sense of purpose, following, as it does—or claims to do—the European Charter of Fundamental Rights in 88 pages of self-reinforcing statements of principles.

I cannot go on without welcoming the EU Select Committee’s report, so ably spoken to by the noble Lord, Lord Jay, who I see is not in his place. I think it is a pity that the report did not have its own slot. Despite acknowledging that the Bill fleshes out the regulation to make it member-state applicable, like the noble Lord, Lord Stevenson, I worry about a Bill of 218 pages and an explanatory note of 112 pages, plus a departmental pack of 247 pages to deal with it all. That all adds to the complexity. I admit that the GDPR conceals its highly challenging requirements in wording of beguiling simplicity under the flag of private rights, but it is no wonder that the European Parliament did not want its handiwork contextualised by inclusion in what we have before us. It is not a particularly encouraging start to bringing 40 years of EU legislation into domestic law.

In what I felt was an inspirational contribution, the noble Baroness, Lady Lane-Fox—I am sorry she is not in her place—referred to the tortuous use of language in parts of the Bill. I agree with her—parts of it are gobbledygook that deny transparency to ordinary mortals. She referred also to my direct ancestor, Ada Lovelace, some of whose expressions of mathematical principles, even for a non-mathematician such as me, make a good deal more sense than parts of the Bill.

The Bill sets out to replace the 1998 Act with new GDPR provisions, meaning new and enhanced rights of data subjects for access, portability and transparency, and duties on controllers on specific consent—not by default, it should be noted—procedural audit trails, a more clearly defined regulatory and supervisory framework, and potential for substantially increased fines for infractions. There is enough that is new, apart from public expectations and the revised geometry as between data subject and data controller, which will naturally give rise to a fresh view of precedent and practice.

Consistency of the Bill with the GDPR core principles, as well as the fundamental rights upon which it is based, will be our focus as the Bill proceeds. A lot of organisations will need to review the way in which they are authorised, in their logging of the origins and possible destinations of personal data they hold, as well as the protocols for responding to requests for information from data subjects. I do not doubt that there will be some pitfalls for the unwary. It may no longer be possible to rely on the continuing acceptability and lawfulness of the previous arrangements under which they have operated, nor to second guess with accuracy how regulation and enforcement will unfold henceforward.

So there may be something going well beyond the more benign narrative of updating, modernising and extending the application on its own. There seem to be some particularly uncharted waters here, with the burden of proof as to compliance and adequacy of arrangements being firmly in the lap of the controller on what looks very like a strict liability basis. That alters the geometry of what will be dealt with.

As regards international cross-jurisdictional data—I am thinking of beyond the EU—I wonder how successfully the proposed arrangements will carry forward in the longer term, bearing in mind that the world market contains numerous players who for their own purposes and advantage might not be that keen to match the standards we claim to set for ourselves. Indeed, the construct of ethical data comes to mind, with all the usual caveats previously associated with ethical foreign policy—the noble Lord, Lord Knight, referred to the ethics; I agree with him that there is a strong threat. That would follow a global principle that sits behind the GDPR.

The GDPR is predicated on the principle of individual compliance by each processor enterprise, so in a data-processing daisy chain across continents the continued tying in to the tenets of the GDPR is an obvious practical problem with some limitations, and it should give us cause for reflection, although I have some admiration for the algorithm that the GDPR sets out to create.

I question how the Government view the ongoing processing of more historical personal data, referred to by other noble Lords, when the purpose for collecting it or the basis for any implied or deemed consent either had not been met or should long since have been refreshed or treated as expired. We all know that old data is still sloshing around in the ether, some of it potentially of dubious accuracy, but I merely point to the fact that this is often an ongoing processing operation without beginning or end point or any apparent possibility of amending or deleting records, as mentioned by other noble Lords. The amount of screening needed to ensure accuracy would be vast. I am entirely unclear that this Bill or the GDPR will improve things for those data subjects for whom this sort of thing can be harmful. I am not thinking just of social media. How will legacy data be dealt with, especially as it does not seem to have been entirely successfully corralled by the 1998 Act or by all other member states under the 1995 data protection directive? I see the correction of that as one of the fundamental principles behind the GDPR—it is the trip wire which has been put there deliberately.

I have concerns about some of the “get out” provisions included in the Bill. The first is the “too difficult” excuse; businesses already use this as a blocking measure. How does one get round the argument that it is too difficult to extract the individual personal data despite knowing that it is the targeted agglomeration of such data, relating to a natural individual, that is the outcome of the processing? The second is that the request is regarded as vexatious. This of course can be concocted by the simple expedient of being evasive towards the first two requests and from the third onwards treating it as repetitive or vexatious—it already happens. I would like reassurance from the Minister that the basic individual rights promised under the GDPR cannot be so circumvented.

The third excuse is “too much data”, referred to by other noble Lords; in other words, there is a lot of personal data held on an individual data subject. Here, there is a provision that the data controller may decline to give information if the precise nature of the data sought is not specified. My impression is that failure of a data subject to specify allows the controller to become unresponsive. If that is the intention, it seems to me to fail the broader test of article 14 of the GDPR, the basic premise of which is that the data subject is entitled to accurate and intelligible information.

It cannot be assumed that the data subject already knows what the scale and nature of the data held actually are or precisely who holds it, although it is clear that the GDPR gives an entitlement to this information. It must follow that, at the very least, the controller, in making his “too much data” response, has to identify the general nature, categories and type of data held about that person. I invite the Minister to comment on what is intended. I concur very much with the point so eloquently made by the noble Baroness, Lady Lane-Fox, on the asymmetry of technical knowledge, resource and political clout as between the data subject and the controller, particularly when set against the practical challenge of extracting individual personal data in response to a formal request.

I was reminded of something only yesterday, as a result of a question as to whether a person was or was not at a certain place at a certain time, which was averred by a complainant in a harassment case who used CCTV footage they had created themselves. It was pointed out that the person against whom the complaint was made said they were somewhere else, in a retail premises covered by other CCTV footage. However, it appeared that the retail premises operator would not release the data because it also contained images of other people and there were, accordingly, privacy issues. What is the balance of rights and protections to be in such a case, where somebody faces prosecution?

That leads me to the issue of data collected by public bodies and agencies. I do not think it is generally understood what personal data is shared by police, social services, health bodies and others, some of them mentioned by the noble Lord, Lord Marlesford. Indeed, I am clear that I do not know either, but I believe that many of these agencies hold data in a number of different forms and on a variety of platforms, many of which are bespoke and do not readily talk to other systems. The data are collected for one purpose and used for other purposes, as the noble Lord, Lord Knight, rightly observed. It is on record in debates in this House that some of these bodies do not actually know how many data systems they have, even less what data—whether usable, personal, relevant or accurate, as the case may be—they actually contain. How does one enforce compliance in that situation? Some of these databases may not even be operating with the knowledge of the Information Commissioner. There will be an expectation that that is going to be tightened up.

A considerable measure of latitude is afforded to the processing of personal data in the public interest. I will be very brief on this point. I would not rest easy that we have an adequate separation of genuine public interest from administrative convenience and I looked in vain for clarification as to what public interest would amount to in this context. I have to say that I am even more confused than I was when I started. In the longer term it remains to be seen how the GDPR will work, incorporated into UK law, interpreted and enforced firstly through our domestic courts under the aegis of the EU but subsequently on a twin-track basis, when we will be dealing with it ourselves through the precedents of our own judicial system and the same GDPR will be being looked at in a European context elsewhere.

I want the Bill to work; I want to enable proper business use of data and to empower data subjects, as the GDPR promises, with a minimum of obfuscation, prevarication and deceit. Transparency has not been the hallmark of UK data businesses or government administration in this respect, but without it there is no justice, due process or citizen confidence in the rule of law and it will be corrosive if we do not get this right. However, I do not see any fundamental mismatch between this and best business practice, so I look forward to further debates on the Bill as we proceed.

20:38
Lord Mitchell (Non-Afl)

My Lords, the Data Protection Act was introduced in 1998. In those days, Facebook, Google and Uber did not exist, Amazon was barely four years old, Apple was tottering under the imminent threat of bankruptcy, search engines were rudimentary, as was the internet itself, and it would be another nine years until the iPhone would be launched. It was, indeed, a very different world. While I welcome the Bill, it remains a fact that when it becomes an Act next year it will be 20 years since its predecessor was enacted. Information and digital technology are growing exponentially. No other industry in the history of the world has even come close to this rate of growth. Legislation needs to match and anticipate the speed of these developments. Certainly, we cannot wait until 2037 for the next Data Protection Act.

Today I am going to raise three issues, which I would like the Minister to respond to. They all centre on the dominant and predatory behaviour of the American big tech giants. I will give your Lordships a striking example of such behaviour from one of them: Apple. In an ideal world, I would like every Member here who has an iPhone to take it out and turn it on, but that probably contravenes the Standing Orders of your Lordships’ House. So I will do the next best thing: I will set out five iPhone directions and, in the cool of the evening, when noble Lords have Hansard in front of them, they can replicate what I am now going to demonstrate.

Click on Settings, then Privacy, then Location Services. Then scroll all the way down until you see System Services, and then scroll halfway down and click on something called Significant Locations. If you are a little behind the times and do not have iOS 11, it is called Frequent Locations. You will probably be asked for a password. Then you will see History and a list of locations. Click on any one of them. Your Lordships will be staggered by what is revealed: every single location that you have visited in the past month—when you arrived, when you left, how long you stayed—all this very private and confidential information is starkly displayed. Who gave Apple permission to store this information about me on my iPhone? It is the default setting, but Apple never asked me. It will argue, of course, that it is private information and it has no access to it—maybe. If you think about it, the opportunities for snooping on people very close to you are endless and dangerous. Now the latest iPhone, the iPhone X, has facial recognition. It does not take much imagination to work out how somebody could gain access to the phone of a close member of your family and find out where they have been for the past month, without their permission.

I think it was the noble Baroness, Lady Kidron, who spoke about Apple and its terms and conditions. She said that they were longer than “Hamlet”. I read that the iTunes terms and conditions were longer than “Macbeth”. Well, “Macbeth” or “Hamlet”, whatever it is, it is an awful lot of words. Of course, you have no opportunity to change those terms and conditions. You either agree or disagree. If you disagree, you cannot use the phone. So what choice do you have?

I see this as typical big tech behaviour. These companies run the world according to their rules, not ours. I have long campaigned against the cavalier approach of big tech companies in all aspects of business and personal life. These include Facebook, Amazon, Microsoft, Google and, of course, Apple. I was going to make some quip about the west-coast climate and the breezes of the west coast, but I guess with the news of the past two days that is probably not a good thing to be doing. Big tech companies have become mega-libertarians, positioning themselves above Governments and other regulators. They say they are good citizens and abide by the law. They have corporate mantras which say, “Do no evil”, but they stash away hundreds of billions of stateless, untaxed dollars. They promote end-to-end encryption. They are disingenuous when foreign Governments try to influence democratic elections. Perhaps they do no evil, but neither are they the model citizens they say they are.

So full marks to EU Commissioner Margrethe Vestager for bringing Apple, Google and Amazon to task, and full marks to President Macron for his efforts to set up an EU-wide equalisation tax to ensure that corporation tax is based on revenue, not creative accounting. I know that this is a DCMS Bill and international taxation is outside the Minister’s brief, but I have heard the Prime Minister criticise these tax dodges by big tech so I ask him or his colleagues in the Treasury: will the Government support the French President in this campaign?

I now turn to another area which is giving me great concern: digital health and health information in general. One of the great treasures we have in this country is our population’s health records. The NHS has been in existence since 1948 and in those 70 years the data of tens of millions of patients have been amassed. They are called longitudinal data, and they are a treasure trove. Such data can be instrumental in developing drugs and advanced medical treatment. Few other countries have aggregated such comprehensive health data. It puts us in pole position. However, in 2016 Royal Free London NHS Foundation Trust sold its rights to its data to a company called DeepMind, a subsidiary of—yes, noble Lords have guessed it—Google. The records of 1.6 million people were handed over. In June this year, Taunton and Somerset NHS Foundation Trust signed a similar deal with DeepMind. The data are being used to create a healthcare app called Streams, an alert, diagnosis and detection system for acute kidney injury, and who can object to that? However, patients have not consented to their personal data being used in this way.

Ms Elizabeth Denham, the Information Commissioner, has said that the Royal Free should have been more transparent and that DeepMind failed to comply with the existing Data Protection Act, but the issue is much graver than not complying with the Act. I do not know this for sure, but if I had to bet on who negotiated the better deal, Google or the Royal Free, I know where my money would be. DeepMind will make a fortune. I put this to the Minister: does he agree that NHS patient data are a massive national asset that should be protected? Does he agree that this mass of patient data should not be sold outright in an uncontrolled form to third parties? I know the NHS is strapped for cash, but there are many better ways of maximising returns. One way would be for NHS records to be anonymised and then licensed rather than sold outright, as is common with much intellectual property. I also believe that the NHS should have equity participation in the profits generated by the application of this information. After all, to use the vernacular of venture capital, it, too, has skin in the game.

As today’s debate has shown, there are fundamental questions that need to be answered. I have posed three. First, what protection will we have to stop companies such as Apple storing private data without our express permission? Secondly, will the UK support the French President in his quest for an equalisation tax aimed at big tech? Finally, how can we protect key strategic data, such as digital health, from being acquired without our permission by the likes of Google?

20:48
Viscount Eccles (Con)

My Lords, I think I should introduce my wife to the noble Lord, Lord Mitchell. She has some worries about Apple and, come to think of it, she has probably been snooping on me.

I shall spend my time on the European Union Committee’s third report. I very much welcome the Motion tabled by the noble Lord, Lord Jay, and the very measured way he introduced the report. I heartily agree with the noble Lord, Lord Stevenson, and my noble friend Lady Neville-Jones that we want the committee to go on studying these matters so that we come to understand them better than we do. That seems very important because an aspect of this Bill is that it is a pre-Brexit negotiation Bill. All the things in the Bill are of massive interest, as has been illustrated, but, as I understand it, in the Government’s mind it is a preparation for the negotiations that will inevitably follow, given the timing of the introduction of the GDPR and the triggering of Article 50. Of course, the provisions of the GDPR come under the single market in the systems of the European Union, which makes it even more important that we think very carefully about where we are and how we can make the best of it.

I have to admit that I do not think the starting point is a very good one. It seems to me that we used to understand that the European Union method of negotiation was that nothing is agreed until everything is agreed, but it has thrown that out of the window and this is not the way this negotiation is going. If nothing is agreed until everything is agreed, you have to have discussed everything before you come to the conclusion, but this is not where we are. The Commission keeps saying, “You are bad boys and have not offered us enough”, so the starting point is not very good, which raises the question of where data protection will come into these negotiations.

I admire the Explanatory Notes—as I think the noble Lord, Lord Stevenson, did—which are a pretty good document compared to other Explanatory Notes that I have seen in the past. I was also interested in the August statement of intent, which was full of good intentions. But I think I rely more on the evidence that was given to the committee of the noble Lord, Lord Jay, and on that committee’s conclusions. Its central conclusion was that we should seek to achieve an “adequacy” decision. The report goes on, positively, to make recommendations on other difficulties such as the arrangements with the United States, as well as on the maintenance of adequacy, how it might be achieved and the continuance of shared policy.

I will offer just a word about “adequacy” and the use of language. The word “fairly”, which has no meaning in a court, has been used this afternoon. The word “adequacy” is pretty subjective. It has always been the Commission’s tendency to want to use words that are difficult to understand and have no clear meaning in English, such as “subsidiarity”—although that has not come into this part of our campaigning. Common sense tells us that both we and the European Union would be sensible to want to maintain data flows, with adequate protection. That is to say, although the present regime is not perfect, we would want it to continue and to improve.

However, unfortunately, our Brexit vote of no confidence in the Commission and in the project that it pursues has left us in an embarrassing and, it must be said, unfriendly negotiating atmosphere. What is more, our previous contributions following the Council of Europe’s Convention 108 have been very considerable. We not only started the ball rolling, together with many other members of the European Union—Germany, Austria, France and so on—with legislation in 1984, but we assisted a great deal in the run-up to the directive of 1995, when the European Union came into the action, somewhat after it had started; 10 years in fact. Then we had the 1998 Act, on which people have commented. With its 74 clauses and 16 schedules, it has done rather well in the circumstances of a changing world. However, that now seems not to help us with the Commission. We have been very helpful but now we have decided to walk off the pitch, and I think people do not like it if you leave in the middle of the game.

What we need from the Commission, as we have had on other occasions, is a flexibility of response, but I am afraid that is not the Commission’s strong point. Nor is it flexible in its attitude to the Council of Europe, which started the process with Convention 108. I am not convinced that it will be full of joy at the Council of Europe modernising Convention 108. The EU has made an effort to become a member of the Council of Europe, so far unsuccessful. A personal reflection: if it were to be successful, with 27 or 28 votes out of 47, I suppose it would hope to take charge.

We are the defaulters, seen as obstinate, self-interested and unable to recognise the need for ever-closer union. And so we have this Bill. It is a sensible effort to get and remain in line with EU regulation—to show and share equivalence—even if in two places, I suspect much to the parliamentary draftsman’s distress, we qualify it with the adverb “broadly”. I am also sure we are right that we should be looking for an adequacy decision but, despite the excellent report and its very clear and admirable conclusions, will the Commission reciprocate? It will always be easy to quibble with third-country adequacy. It is a very complex subject and there will never be any difficulty in disagreeing with something; your Lordships have demonstrated that very clearly this afternoon. There is no perfect answer, certainly not one that will withstand the changes that make even a very good answer not such a good one later. So I am afraid my conclusion is that, unless things change, the Commission will continue to find fault, however manfully we try to satisfy its requirements. Is there then a chance that there will be some political intervention, some repetition of the statesmanlike behaviour of European politicians in 1949, the starting year of the Council of Europe? We have about a year to find out. Maybe, but I would not bet on it. No deal on this matter by default seems increasingly likely.

20:58
Viscount Colville of Culross (CB)

My Lords, I am going to deal with my concerns about how the Bill might affect journalism and free speech. I declare my interest as a series producer at ITN Productions.

In the fast-changing world of the digital revolution, it is incumbent on noble Lords to be vigilant about the way in which our personal data is now so readily available to so many people to be processed in so many ways, more than many of us ever conceived. I am glad that the GDPR has been brought forward and that this Bill protects further the availability and use of personal information. However, I am concerned that these new privacy rights will be balanced with further limitations on the freedom of the press and the ability of journalists to carry out investigative journalism in the public interest, which I believe was one of the original aims of the Data Protection Act 1998.

At the moment, data protection legislation is being used to control unwelcome exposure of incriminating personal information by journalists. We have seen cases such as that of Prince Moulay v Elaph Publishing, in which the original case for defamation was thrown out as not libellous, only for the Prince to instigate proceedings for the incriminating information against him to be removed from the public sphere using data protection law, despite the intention of the original Act being that there should be an exemption for journalism.

I understand the sentiment behind the “right to be forgotten” clause. Of course, many people want their youthful indiscretions to be forgotten and, for most, it is important that they should be. This concept is based on the Costeja v Google Spain case, which stopped links being made to personal information in search results. However, the courts are now being tested to see whether the original information itself can be suppressed.

In the age of fake news, it has never been more important to be able to go back to source material to check original data against more recent updates and deletions. Noble Lords will have heard of click bait, where sites are specifically set up to shock with false information to attract eyeballs—as they call them in the industry—and make money from the resultant advertising. Noble Lords must not suppress the means to refute such fake news and ascertain the truth.

So I am very pleased that GDPR article 17 has an exemption for publication of data for free speech and the holding of archives in the public interest, further safeguarded in article 89. However, Clause 18, which indeed provides welcome protection for many archives held in the public interest—for instance, those for historical, scientific and statistical purposes—does not extend that protection to media archives.

My concern is that past media articles are an important source for verifying information. They might hold reports of criminal convictions of the person or information about a politician’s past which, years later, when they are trying to stand for office, might prove embarrassing but informative for voters. Surely business people, voters and many others should have full access to the information in those archives, whether it is embarrassing or not. This information helps them to shape a fuller profile of the person whose reputation they are trying to assess.

In the digital age, there are millions of opinions, but refuting falsehoods or discovering the truth has never been more difficult. The only way to do that is through source material on trusted websites or archives, where the information has been mediated and checked. I suggest that websites holding archives of trusted media organisations should be protected by and covered in the Bill. The inherent public interest in such archives should be explicitly recognised, as provided in the GDPR.

I am pleased that there is an exemption for data processing for journalism in Schedule 2, part 5, paragraph 24. However, in sub-paragraph (2), there is concern that the exemption applies only when the processing of data is used for journalism. If this information, once it has been gathered for journalism, is subsequently used by the regulators or the police, the use of the word “only” will negate that exemption. I ask the Minister to look at that again.

I am also concerned about the extension of the powers of the ICO prior to publication to examine whether information is exempt from data protection provisions because it is being processed for journalism. GDPR article 6 contains an obligation to consult the Information Commissioner, but Clause 164 goes much further. It enhances the power of the ICO to examine the application of the exemptions for journalism prior to publication and unilaterally second-guess editorial decisions made in respect of the provisions in the Bill.

This means that if a journalist is investigating, for instance, people smugglers, involving undercover filming or subterfuge which is deemed to create a high risk to data subjects, the ICO can intervene prior to publication. The commissioner has the power to apply their objective view to the claim, which might override and disregard the reasonable view of an editor. The ICO might, for example, call for the individual being investigated to be notified in advance that their data is being used, or that they should be given access to additional data being held about them as part of the journalistic investigation.

In my view, this is not even consistent with the terms of journalistic exemption. It would result in investigative journalism being delayed or even stopped until the ICO has examined it for compliance with part of the Act prior to publication. The provision could act as a form of censorship. The existing right of the editor to decide whether the story should go ahead in the public interest will therefore be eroded. I suggest that Clause 164 should be amended to ensure that investigative journalism is not chilled by the extension of powers of intervention by the ICO prior to publication.

Finally, I am concerned that there is no time limit on the right to sue in respect of information processed for special purposes, which continues to be retained or published in the media archive. Under the Defamation Act, that limitation was one year from the date of publication. Under this Bill, there is no limitation. Surely, if information is inaccurate, the complainant should sue within a specific period. The longer the case is delayed from the original publication date, the more difficult it is to refute the allegations. The journalist could have moved on, contact with the original source material might be lost, memories blurred and notes, even those held digitally, mislaid. Complainants must have the right to complain, but there must be a balance with the time period within which that can be done. A failure to have a period of limitation will surely have a chilling effect on the publication of information.

I welcome this Bill as an important advance in protecting privacy in the digital age, but I am concerned that some of its provisions do not yet strike the right balance between privacy and free speech. I ask the Minister to take my concerns seriously.

21:05
Baroness Neville-Rolfe (Con)

My Lords, I congratulate our Ministers and the Government on bringing this Bill to our House in this timely way. It is extremely technical—and herein lies a danger, because it is also very important and covers matters that can be expected to become even more important over time. We must therefore put aside the temptation to think that technical matters are somehow of lesser importance, simply because we do not fully understand them. I declare an interest as the Minister responsible when the EU parent of this Bill, the GDPR, was adopted. While I saw it as a necessary single market measure and a modernising one, there were a number of provisions that we could have done without, mostly introduced by the European Parliament, such as requiring a specific age of consent, which the Government have now proposed should be 13 in the UK, in line with the United States.

In contrast, as always, our UK approach is market opening. We want a competitive, growing Europe, and we want the digital revolution, with its subset artificial intelligence, to continue to stoke growth. But some in the EU have always been most concerned with giving citizens back control over their personal data, an issue that assumed particular importance following the release of documents involving Chancellor Merkel by WikiLeaks. To be fair, the UK has also in this case stated its wish to simplify the regulatory environment for business, and we need to make sure that that actually happens here in the UK. Committee will give us the chance to talk about the merits of the digital revolution and its darker side, which we touched on during the excellent debate led by the noble Baroness, Lady Lane-Fox. I shall not go over that ground again now, but I add one point to the story told by the noble Lord, Lord Mitchell: my Google Maps app now highlights the location of future engagements in my diary. So that is pretty challenging.

I shall touch as others have done on three concerns. According to the Federation of Small Businesses, the measures represent a significant step up in the scope of data protection obligations. High-risk undertakings could face additional costs of £75,000 a year from the GDPR. The MoJ did an impact assessment in 2012, which estimated the cost at £260 million in 2018-19 and £310 million by 2025-26; that is no doubt an underestimate, since it did not take account of the changes made by the European Parliament. I am not even sure if that covers charities or public organisations or others who have expressed concerns to me about the costs and the duties imposed. Then there are the costs of the various provisions in the Bill, many levelling up data protection measures outside the scope of the GDPR. It is less confusing, I accept, but also more costly to all concerned.

The truth is that overregulation is a plague that hits productivity. Small businesses are suffering already from a combination of measures that are justified individually—pension auto-enrolment, business rates and the living wage—but together can threaten viability at a time of Brexit uncertainty. We must do all we can to come to an honest estimate of the costs and minimise the burden of the new measures in this legislation.

Also, I know that CACI, one of our leading market analysis companies working for top brands such as John Lewis and Vodafone, thinks that the provisions in the Bill are needlessly gold-plated. Imperial College has contacted me about the criminalisation of the re-identification of anonymised data, which it thinks will needlessly make more difficult the vital security work that it and others do.

The noble Lord, Lord Patel, and the noble Baroness, Lady Manningham-Buller, were concerned about being able to contact people at risk where scientific advance made new treatments available—a provision that surely should be covered by the research exemption.

The second issue is complication. It is a long and complicated Bill. We need good guidance for business on its duties—old and new, GDPR and Data Protection Bill—in a simple new form and made available in the best modern way: online. I suggest that—unlike the current ICO site—it should be written by a journalist who is an expert in social media. The Minister might also consider the merits of online training and testing in the new rules. I should probably declare an interest: we used it in 2011 at Tesco for the Bribery Act and at the IPO for a simple explanation of compliance with intellectual property legislation.

The third issue is scrutiny. I am afraid that, as is usual with modern legislation, there are wide enabling powers in the Bill that will allow much burdensome and contentious subordinate detail to be introduced without much scrutiny. The British Medical Association is very concerned about this in relation to patient confidentiality. Clause 15, according to the excellent Library Note, would allow the amendment or repeal of derogations in the Bill by an affirmative resolution SI, thereby shifting control over the legal basis for processing personal data from Parliament to the Executive. Since the overall approach to the Bill is consensual, this is the moment to take a stand on the issue of powers and take time to provide for better scrutiny and to limit the delegated powers in the Bill. Such a model could be useful elsewhere—not least in the Brexit process.

There are two other things I must mention on which my noble friend may be able to provide some reassurance. First, I now sit on the European Union Committee. I am sorry that duties there prevented me sitting through some of this important debate; we were taking important evidence on “no deal”. As the House knows, the committee is much concerned with the detail of Brexit. Data protection comes up a lot—almost as much as the other business concern, which is securing the continued flow of international talent. I would like some reassurance from my noble friend Lady Williams about the risks of Brexit in the data area. If there is no Brexit deal, will the measures that we are taking achieve equivalence—“adequacy”, in the jargon—so that we can continue to move data around? What international agreements on data are in place to protect us in the UK and our third-country investors? Under an agreed exit, which is my preference, is there a way that our regulator could continue to be part of the European data protection supervisory structure and attend the European Data Protection Board, as proposed by the noble Lord, Lord Jay of Ewelme, the esteemed interim chairman of our European Union Committee—or is that pie in the sky?

Secondly, there is a move among NGOs to add a provision for independent organisations to bring collective redress actions for data protection breaches. I am against this proposal. In 2015 we added such a provision to competition legislation—with some hesitation on my part. This provision needs to demonstrate its value before we add parallel provisions elsewhere. It is in everyone’s interests to have a vibrant economy, but business is already facing headwinds in many areas, notably because of the uncertainty surrounding Brexit. In future it will be subject to a much fiercer data protection enforcement regime under our proposals.

I have talked about the costs, and others have mentioned the new duties; there will be maximum fines of up to 4% of turnover for data breaches, compared with £0.5 million at present. We certainly do not need yet another addition to the compensation culture. This could reduce sensible risk taking and perversely deter the good attitudes and timely actions to put things right that you see in responsible companies when they make a mistake. There is a real danger that the lawyers would get to take over in business and elsewhere and give the Bill a bad name. That would be unfortunate.

However, in conclusion, I welcome the positive aspects of this important Bill and the helpful attitude of our Ministers. I look forward to the opportunity of helping to improve it in its course through the House.

21:15
Baroness O’Neill of Bengarve (CB)

My Lords, as the last speaker before the winding speeches, I think it is my duty to be extremely brief, so I will try. We have had nearly 20 years of the Data Protection Act. We need this legislation because, if nothing else, the United Kingdom will still be in the European Union on 25 May next year, the date of implementation of the new regulation, so we have to do something.

I will make a few rather sceptical remarks about the long-term viability of data protection approaches to protecting privacy. They have, of course, worked, or at least people have made great efforts to make them work, but the context in which they did so has become more difficult and they are now less likely to work. The definition of personal data used in data protection approaches, and retained here, is data relating to a living individual who is identified, or can be identified, from the data. It is that modal idea of who can be identified that has caused persistent problems. Twenty years ago it was pretty reasonable to assume that identification could be prevented provided one could prevent either inadvertent or malicious disclosure, so the focus was on wrongful disclosure. However, today identification is much more often by inference, and it is very difficult to see how inference is to be regulated.

The first time each of us read a detective story, he or she enjoyed the business of looking at the clues and suddenly realising, “Ah, I know whodunnit”. That inference is the way in which persons can be identified from data and, let us admit it, not merely from data that are within the control of some data controller. Data protection is after all in the end a system for regulating data controllers, combined with a requirement that institutions of a certain size have a data controller, so there is a lot that is outside it. However, if we are to protect privacy, there is, of course, reason to think about what is not within the control of any data controller. Today, vast amounts of data are outwith the control of any data controller: they are open data. Open data, as has been shown—a proof of concept from several years ago—can be fully anonymised and yet a process of inference can lead to the identification of persons. This is something we will have to consider in the future in thinking about privacy.

Moreover, throughout the period of data protection, one of the central requirements for the acceptable use of otherwise personal data has been that consent should be sought, yet the concepts of consent used in this area are deeply divisive and various. In commercial contexts, consent requirements are usually interpreted in fairly trivial ways. When we all download new software, we are asked to accept terms and conditions. This is called an end-user licence agreement. You tick and you click and you have consented to 45 pages of quite complicated prose that you did not bother to read and probably would not have understood if you had maintained attention for 45 pages. It does not much matter, because we have rather good consumer protection legislation, but there is this fiction of consent. However, at the other end of the spectrum, and in particular in a medical context, we have quite serious concepts of consent. For example, to name one medical document, the Helsinki Declaration of the World Medical Association contains the delicious thought that the researcher must ensure that the research participant has understood—then there is a whole list of things they have to understand, which includes the financial arrangements for the research. This is a fiction of consent of a completely different sort.

We should be aware that, deep down in this legislation, there is no level playing field at all. There are sectoral regimes with entirely different understandings of consent. We have, in effect, a plurality of regimes for privacy protection. Could we do otherwise or do better? I will not use any time, but I note that legislation that built on the principle of confidentiality, which is a principle that relates to the transfer of data from one party to another, might be more effective in the long run. It would of course have to be a revised account of confidentiality that was not tied to particular conceptions of professional or commercial confidentiality. We have to go ahead with this legislation now, but it may not be where we can stay for the long run.

21:21
Lord Paddick (LD)

My Lords, this has been an interesting, and for me at times a rather confusing, debate on the issues associated with the Bill. The Bill is complex, but I understand that it is necessarily complex. For example, under European law the GDPR cannot be reproduced in domestic legislation. The incorporation of the GDPR into British law is happening under the repeal Bill, not under this legislation. Therefore, the elephant and the prints are in the other place rather than here.

We on these Benches welcome the Bill. It provides the technical underpinnings that will allow the GDPR to operate in the UK both before and after Brexit, together with the permitted derogations from the GDPR available to all EU member states. For that reason it is an enabling piece of legislation, together with the GDPR, which is absolutely necessary to allow the UK to continue to exchange data, whether it is done by businesses for commercial purposes or by law enforcement or for other reasons, once we are considered to be a third country rather than a member of the European Union.

We also welcome the extension of the effect of the GDPR—the rules and regulations that the GDPR provides—to other areas that are currently covered by the Data Protection Act 1998 but which are outside the scope of the GDPR, thus, as far as I understand it, providing a consistent approach to data protection across the piece. This leaves law enforcement and national security issues outside of the scope of GDPR and the “applied GDPR”, which are covered in Parts 3 and 4.

The enforcement regime and the Information Commissioner are covered in Part 5: because we will repeal the Data Protection Act 1998, we need to restate the role of the Information Commissioner as the person who will enforce the new regime. We will need to explore the concerns that we have with each part of the Bill as we go through Committee. However, generally speaking, we welcome the Bill and its provisions.

Of course, what the Government, very sensibly, are trying to do but do not want to admit is to ensure that the UK complies with EU laws and regulations—in this case in relation to data protection—so that it can continue to exchange data with the EU both before and after Brexit. All this government hype about no longer being subject to EU law after Brexit is merely the difference between having to be subject to EU law because we are a member of the EU and having to be subject to EU law because, if we do not comply, we will not be able to trade freely with the EU or exchange crime prevention and detection intelligence, and counterterrorism intelligence, with the EU. That is the only difference.

For most aspects of data exchange, compliance with the GDPR is required. The GDPR is directly applicable, so it cannot simply be transposed into this Bill. That, coupled with the derogations and with applying the GDPR to other aspects of data processing not covered by it, makes this part of the Bill complex—and, as I suggest, probably necessarily so.

For law enforcement purposes, data exchange is covered by an EU law enforcement directive, which can be, and has been, transposed to form Part 3 of the Bill as far as I understand it. A data protection regime for the processing of personal data by the intelligence services—in the case of the UK, MI5, MI6 and GCHQ —is covered by Council of Europe Convention 108. Part 4 of the Bill is based on a modernised draft of Convention 108, which has yet to be formally agreed, but this puts the UK in effect slightly ahead of the curve on that aspect of regulation.

Clearly, we need to probe and test the derogations allowed under the GDPR that are proposed in the Bill, particularly when hearing about the potential consequences, as outlined by, for example, the noble Viscount, Lord Colville of Culross. We also need to examine whether applying GDPR rules and regulations to other areas of data processing provides equivalent or enhanced safeguards compared with those provided by the Data Protection Act, and we need to ensure that the safeguards provided by the law enforcement directive and Council of Europe Convention 108 are provided by the Bill.

As regards our specific concerns, as my noble friend Lord McNally mentioned in his opening remarks and as reinforced by my noble friend Lady Ludford, if the Bill results in a refusal to allow not-for-profit bodies to exercise Articles 77 to 79 to pursue data protection infringements of their own accord, we will have to challenge that, but perhaps the Minister can clarify whether that is the case.

As my noble friend Lady Ludford also mentioned, along with the noble Baroness, Lady Jay of Paddington, the various provisions that allow Ministers to alter the application of the GDPR by regulation are something that needs much further scrutiny, albeit that Ministers’ hands are likely to be tied by the requirement to comply with changing EU law after Brexit—de facto even if not de jure. Could it be—perhaps the Minister can help us here—that the purpose of these powers, put into secondary legislation, is to enable the UK to keep pace with changes in EU law after Brexit?

Although we welcome the ability of individuals to challenge important wholly automated decisions, requiring human intervention at the request of the data subject, research shows that the application of algorithms and artificial intelligence, even in machine learning of language, can result in unfair discrimination. Even when human decision-making is informed by automated processes, safeguards still need to be in place to ensure fairness, such as transparency around what the automated processes involve. While decisions around personal finance, such as credit scoring and the assessment of insurance risk, are important, in the United States the application of algorithms in the criminal justice arena has resulted in unfair discrimination that has even more serious consequences for individuals. Even if such automated processes are yet to apply to the UK criminal justice system, the Bill must safeguard against future developments that may have unintended negative consequences.

As other noble Lords have said, we have concerns about the creation of a criminal offence of re-identification of individuals. As the noble Lord, Lord Arbuthnot of Edrom, said, criminalising re-identification could allow businesses to relax the methods that they use to try to anonymise data on the basis that people will not try to re-identify individuals because it is a criminal offence.

Despite what is contained in this Bill, we have serious concerns that there are likely to be delays to being granted data adequacy status by the European Commission when we leave the EU. That means that there would not be a seamless continuation of data exchange with the EU 27 after Brexit. We also have serious concerns, as does the Information Commissioner, that there are likely to be objections to being granted data adequacy status because of the bulk collection of data allowed for under the Investigatory Powers Act, as the noble Lord, Lord Stevenson of Balmacara, said in his opening remarks. We also intend to revisit the issue of the requirement under international human rights law, and upheld by the European Court of Human Rights in 2007, that as soon as notification can be made without prejudicing the purpose of surveillance after its termination, information should be provided to the persons concerned.

As the noble Baroness, Lady Lane-Fox, mentioned, it is essential that the Information Commissioner is provided with adequate resources. My understanding is that there has been a considerable loss of staff in recent times, not least because commercial organisations want to recruit knowledgeable staff to help them with the implementation of the GDPR, and because the 1% cap on public sector pay has diminished the number of people working for the Information Commissioner. It is absolutely essential that she has the resources she needs, bearing in mind the additional responsibilities that will be placed upon her.

The age of consent will clearly be an interesting topic for discussion. What we are talking about here is at what age young people should be allowed to sign up to Facebook or other social media. Most of us would acknowledge that children have a greater knowledge and are more computer literate than their parents and grandparents. As one of the surveys mentioned this evening showed, it would be very easy for young people to circumvent rules around the age of consent as set in legislation. For example, any teenager would know how to make the internet believe that they were in the United States when they were physically in the United Kingdom, and therefore they would have to comply only with any age of consent set in America. While I understand the burning desire for people to protect children and ensure that they are not exploited through social media, one has to live in the real world and look for solutions that are actually going to work: for example, educating young people on how to avoid being groomed online and the dangers of social media, and informing parents about how they can keep an eye on their children’s activities, rather than trying to set an unrealistic target for the age at which someone could sign up.

Finally, the noble Lord, Lord Mitchell, talked about the data privately stored on iPhones, which was informative. Last week, I was rather shocked when, in California, I went to a gym that was rather busy. I looked on Google Maps, which very helpfully informed me when the busiest times were in that particular gym on that particular day. I found that very useful, but I found it very frightening that it also told me that I had been at that gym three hours before.

21:35
Lord Kennedy of Southwark (Lab)

My Lords, we welcome the Bill generally and support the main principles, but that is not to say that we do not have issues that we intend to raise during the passage of the Bill where we believe that improvements could be made. We will certainly test the Government’s assertion that the Bill will ensure that we can be confident that our data is safe as we make the transition into a future digital world.

My noble friend Lord Knight of Weymouth highlighted some of the challenges that we face in the use of data, the consent that we give and how we can have greater control—or, in fact, any control at all—as data and the use of data grow exponentially. In his contribution, the noble Lord, Lord Marlesford, highlighted the complexity of these matters. That is the problem—the constant growth in complexity and the strain it places on our ability to understand the changes as they run away with themselves. We are aware that there will be a number of government amendments to the Bill. When we see those, we will be able to take a view on them. But the fact that we can expect such a large number at this early stage of the Bill makes one wonder how prepared the Government are for this new challenge.

The broad aim of the Bill is to update the UK’s data protection regime in accordance with the new rules, as agreed at European level. It is important as we prepare to leave the European Union that we have strong, robust laws on data protection that ensure that we have up-to-date legislation that is on a par with the best in the world to protect individuals, businesses and the UK as a whole and to play our part in ensuring that the UK remains a place where it is difficult for criminals to operate. As the noble Lord, Lord Jay, said in his contribution covering the report of the European Union Home Affairs Sub-Committee, the importance of cross-border data flows to the UK cannot be overstated, with services accounting for 44% of the UK’s total global exports and three-quarters of the UK’s cross-border data flows being with other EU countries. The UK must remain a place where people and organisations all over the world want to do business and a place that has safety and robust protection at its heart.

The noble Baroness, Lady Lane-Fox of Soho, made important points about the need for the UK to be the best and safest place in the world to trade online. Her contribution to debates in your Lordships’ House to make the Bill the best it can be will be of vital importance as the Bill makes progress. The noble Baroness is right that a lot of education is needed to prepare the public and business for the changes.

The concerns of business must be taken into account. When the noble Baroness, Lady Williams of Trafford, responds to the debate, I hope she will refer to the concerns expressed by small businesses. In particular, will she explain what plans the Government have to ensure that small businesses are aware of the changes and the action that they need to take? These are the sorts of businesses that are the backbone of the country. They are not able to employ expensive lawyers or have compliance departments to advise them on the action that needs to be taken. We need a targeted awareness campaign from the Government and the regulator and small-business-friendly support and guidance rolled out in good time so that the necessary changes can be made. I fully understand the concerns that businesses have in this regard and the Government must respond to those positively.

The Bill implements the general data protection regulation—GDPR—standards across all general data processing and the Opposition support that. As we have heard in the debate, the UK will need to satisfy the European Commission that our legislative framework ensures an adequate level of protection. The Commission will need to be satisfied on a wide variety of issues to give a positive adequacy decision, and when we leave the European Union we will still have to satisfy the high adequacy standards to ensure that we can trade with the European Union and the world. Those too are matters that we will test in Committee.

Important principles of lawfulness in obtaining data and the consent of individuals to their data being held are set out in the Bill. My noble friend Lady Jay of Paddington made important points about how to achieve a better-educated public about the use of their data, the media and online literacy, and the risks to them of the abuse of their data.

The additional GDPR rights which strengthen and add to an individual’s rights, as set out in the Data Protection Act 1998, are a positive step forward. We have all seen examples of people’s data being held unlawfully and the measures in this Bill should help in that respect. There is also the issue of data held about all of us that is confidential, such as medical and health data, and ensuring that it is processed in a confidential way is something we would all support, alongside the proper use of health data to combat disease and improve healthcare through proper research. A number of noble Lords have made reference to that, and certainly nothing should be done which would endanger research that saves lives.

The right to be forgotten is an important concept, particularly where the consent was given as a child, although we will want to probe why the right of erasure of personal data is restricted to 18 years and above, particularly when the consent may have been given when the individual was 13 years of age. Cyberbullying is a dreadful experience for anyone and it is important that we are very clear during the passage of the legislation on how people are able to protect themselves from this abuse. The Bill will formalise the age at which a child can consent to the processing of data at 13 years in the UK, which is the lowest possible age in the EU. The right reverend Prelate the Bishop of Chelmsford referred to this point in his contribution and I agree with him about the need for further consultation with parents and the public, a point also made by the noble Baroness, Lady Howe.

The noble Baroness, Lady Kidron, made an excellent contribution and she is right to say that children are no match for a number of the very powerful tech companies. I too read carefully the briefings from the Children’s Society and YoungMinds on this matter. All the major online platforms have a minimum user age of 13, although the vast majority of young people—some 73% according to the survey—have their first social media account before they are 13. This is an issue that will rightly get a lot of attention from noble Lords. On reading the briefing note I could see the point being made that setting the age at 16 could have an adverse effect in tackling grooming, sexual exploitation and abuse. If we wanted to go down the route of increasing the age when someone can consent to the use of their personal data, we must at the same time make significant changes to the grooming and sexual offences legislation, again a point made by the noble Baroness, Lady Howe, in her remarks. It would be wrong to make this change in isolation because it actually risks making the online world more dangerous for young people.

In responding to the debate, will the noble Baroness, Lady Williams of Trafford, set out how the Government decided that 13 was the appropriate age of consent for children to access social media and does she believe, as I do, that the social media companies need to do much more to protect children when they are online? What consultation did the Government undertake before deciding that 13 years was the correct age, a question put by many noble Lords in the debate?

There are also the important issues of protecting vulnerable people in general, not only children but the elderly as well. As my noble friend Lord Stevenson of Balmacara said, the Government have an opportunity to allow independent organisations acting in the public interest to bring collective redress actions or super-complaints for breaches in data protection rules. They have not done so, and this may be an error on their part as the super-complaint system works well in other fields. It would enable an effective system of redress for consumers to be put in place. It could also be contended that just having such a system in place would have a positive effect in terms of organisations making sure that they are compliant and not tempted to cut corners, and generally make for a stronger framework.

The Opposition support the approach of transposing the law enforcement directive into UK law through this Bill. It is important that we have consistent standards across specific law enforcement activities. In the briefing, the Information Commissioner raised the issue of overview and scope as detailed in Clause 41. It would be helpful, when responding to the debate, if the Minister could provide further clarification in respect of the policy intention behind the restriction on individuals being able to approach the Information Commissioner to exercise their rights.

The processing of personal data by the intelligence services is of the utmost importance. Keeping their citizens safe is the number one priority of the Government. We need to ensure that our intelligence services have the right tools and are able to work within modern international standards, including the required safeguards, so that existing, new and emerging threats to the safety and security of the country are met. These are fine lines and it is important that we get them right.

The point made by a number of noble Lords, including the noble Lord, Lord Jay, and the noble Baroness, Lady Ludford, that our position as a third country on leaving the EU may leave us subject to meeting a higher threshold is a matter for concern. I hope the noble Baroness, Lady Williams, will respond to that specific point when she replies to the debate.

Having the Information Commissioner as the independent authority responsible for regulating the GDPR—one which will also act as the supervisory authority in respect of the law enforcement provisions as set out in Part 3 of the Bill—is welcome, as is the designation of the commissioner as the authority under Convention 108. I welcome the proposal to consult the commissioner on legislation and other measures that relate to data processing. The commissioner has an important international role and I fully support her playing a role in the various EU bodies she engages with, up until the point when we leave the EU. We must also be satisfied in this House that we have sufficiently robust procedures in place so that we will work closely with our EU partners after we have left the EU. Failure to do so could have serious repercussions for the UK as a whole, our businesses and our citizens. Data flows in and out of the UK are a complex matter and the regulator needs authority when dealing with others beyond the UK. That is something we will have to test carefully as the Bill passes through your Lordships’ House.

The clauses of the Bill in respect of enforcement are generally to be welcomed. It is important that the commissioner retains the power to ensure data is properly protected. I agree very much with the noble Lord, Lord McNally, about the importance of ensuring that the Information Commissioner remains adequately funded. It is right that those powers are used proportionally in relation to the specific matters at hand, using, where appropriate, non-criminal enforcement, financial penalties and, where necessary, criminal prosecution. As I said, we need a proper programme of information to ensure that small businesses in particular are ready for the changes and new responsibilities they will take on.

One of the issues we have to address is the challenge that technology brings and how our legislation will remain fit for purpose and accepted by other competent authorities outside our jurisdiction—particularly by the European Union after we leave it.

In conclusion, this is an important Bill. As the Opposition, we can support its general direction, but we have concerns about the robustness of what is proposed. We will seek to probe, challenge and amend the Bill to ensure that it really does give us the legislation the UK needs to protect its citizens’ data and ensure its lawful use.

21:47
The Minister of State, Home Office (Baroness Williams of Trafford) (Con)

My Lords, this has been a lengthy but excellent debate. I very much welcome the broad support from across the House for the Bill’s objectives; namely, that we have a data protection framework that is fit for the digital age, supports the needs of businesses, law enforcement agencies and other public sector bodies, and—as the noble Lord, Lord Kennedy, said—safeguards the rights of individuals in the use of their personal data.

In bringing the Bill before your Lordships’ House at this time, it is fortunate that we have the benefit of two recent and very pertinent reports from the Communications Committee and the European Union Committee. Today’s debate is all the better for the insightful contributions we have heard from a number of members of those committees, namely the noble Lord, Lord Jay, the noble Viscount, Lord Colville, the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Chelmsford and my noble friend Lady Neville-Rolfe.

In its report Growing Up with the Internet, the Communications Committee noted with approval the enhanced rights that the GDPR would confer on children, including the right to be forgotten, and asked for those rights to be enshrined in UK law as a minimum standard. I am pleased to say the Bill does just that. The European Union Committee supported the Government’s objective to maintain the unhindered and uninterrupted flow of data with other member states following the UK’s exit from the EU. Understandably, the committee pressed the Government to provide further details of how that outcome will be achieved.

With the provisions in the Bill, the UK starts from an unprecedented point of alignment with the EU in terms of the legal framework underpinning the exchange and protection of personal data. In August, the Government set out options for the model for protecting and exchanging personal data. That model would allow free flows of data to continue between the EU and the UK and provide for ongoing regulatory co-operation and certainty for businesses, public authorities and individuals. Such an approach is made possible by the strong foundations laid by the provisions in the Bill.

In other contributions to this debate, we have had the benefit of a wide range of experiences, including from noble Lords who are able to draw on distinguished careers in business, education, policing or the Security Service. In doing so, noble Lords raised a number of issues. I will try to respond to as many of those as I can in the time available, but if there are specific points, as I am sure there will be, that I cannot do justice to now, both my noble friend Lord Ashton and I will of course follow up this debate with a letter. 

A number of noble Lords, including the noble Lord, Lord Kennedy, the noble Baroness, Lady Lane-Fox, and my noble friend Lady Neville-Rolfe, asked whether the Bill was too complex. It was suggested that data controllers would struggle to understand the obligations placed on them and data subjects to understand and access their rights. As the noble Lord, Lord Paddick, said, the Bill is necessarily so, because it provides a complete data protection framework for all personal data. Most data controllers will need to understand only the scheme for general data, allowing them to focus just on Part 2. As now, the Information Commissioner will continue to provide guidance tailored to data controllers and data subjects to help them understand the obligations placed on them and exercise their rights respectively. Indeed, she has already published a number of relevant guidance documents, including—the noble Lord, Lord Kennedy, will be interested to know this—a guide called Preparing for the General Data Protection Regulation (GDPR): 12 Steps to Take Now. It sounds like my type of publication.

Other noble Lords rightly questioned what they saw as unnecessary costs on businesses. My noble friends Lord Arbuthnot and Lady Neville-Rolfe and the noble Lord, Lord Kennedy, expressed concern that the Bill would impose a new layer of unnecessary regulation on businesses—for example, in requiring them to respond to subject access requests. Businesses are currently required to adhere to the Data Protection Act, which makes similar provision. The step up to the new standards should not be a disproportionate burden. Indeed, embracing good cybersecurity and data protection practices will help businesses to win new customers both in the UK and abroad.

A number of noble Lords, including the noble Lord, Lord Jay, asked how the Government would ensure that businesses and criminal justice agencies could continue, uninterrupted, to share data with other member states following the UK’s exit from the EU. The Government published a “future partnership” paper on data protection in August setting out the UK’s position on how to ensure the continued protection and exchange of personal data between the UK and the EU. That drew on the recommendations of the very helpful and timely report of the European Union Committee, to which the noble Lord referred. For example, as set out in the position paper, the Government believe that it would be in our shared interest to agree early to recognise each other’s data protection frameworks as the basis for continued flow of data between the EU and the UK from the point of exit until such time as new and more permanent arrangements came into force. While the final arrangements governing data flows are a matter for the negotiations—I regret that I cannot give a fuller update at this time—I hope that the paper goes some way towards assuring noble Lords of the importance that the Government attach to this issue.

The noble Baroness, Lady Kidron, queried the status of Article 8 of the European Charter of Fundamental Rights, which states:

“Everyone has the right to the protection of personal data concerning him or her”.

The Bill will ensure that the UK continues to provide a world-class standard of data protection both before and after we leave the European Union.

Several noble Lords, including the noble Lord, Lord Paddick, in welcoming the Bill asked whether the Information Commissioner would have the resource she needs to help businesses and others prepare for the GDPR and LED and to ensure that the new legislation is properly enforced, especially once compulsory notification has ended. The Government are committed to ensuring that the Information Commissioner is adequately resourced to fulfil both her current functions under the Data Protection Act 1998 and her new ones. Noble Lords will note that the Bill replicates relevant provisions of the Digital Economy Act 2017, which ensures that the Information Commissioner’s functions in relation to data protection continue to be funded through charges on data controllers. An initial proposal on what those charges might look like is currently being consulted upon. The resulting regulations will rightly be subject to parliamentary scrutiny in due course.

Almost every noble Lord spoke in one way or another about protecting children online, particularly the noble Baroness, Lady Kidron, and the right reverend Prelate the Bishop of Chelmsford, who referred to the Select Committee on Communications report Growing Up with the Internet. The focus of that report was on addressing concerns about the risk to children from the internet. The Government believe that Britain should be the safest place in the world to go online and we are determined to make that a reality. I am happy to confirm that the Government will publish an internet safety strategy Green Paper imminently. This will be an important step forward in tackling this crucial issue. Among other things, the Green Paper will set out plans for an online code of practice that we want to see all social media companies sign up to, and a plan to ensure that every child is taught the skills they need to be safe online.

The other point that was brought up widely, including by the noble Lord, Lord Kennedy, was whether it was appropriate for 13 year-olds to be able to hand over their personal data to social media companies without parental consent. We heard alternative perspectives from my noble friend Lord Arbuthnot and the noble Baroness, Lady Lane-Fox. Addressing the same clause, the right reverend Prelate the Bishop of Chelmsford questioned the extent to which the Government had consulted on this important issue. The noble Baroness, Lady Howe, and the noble Lord, Lord Kennedy, made a similar point. In answer to their specific questions, 170 organisations and numerous individuals responded to the Government’s call for views, published in April, which addressed this issue directly. The Government’s position reflects the responses received. Importantly, it recognises the fundamental role that the internet already plays in the lives of teenagers. While we need to educate children on the risks and to work with internet companies to keep them safe, online platforms and communities provide children and young people with an enormous educational and social resource, as the noble Baroness, Lady Lane-Fox, pointed out. It is not an easy balance to strike, but I am convinced that, in selecting 13, the Government has made the right choice and one fully compatible with the UN Convention on the Rights of the Child, to which the noble Lord, Lord Stevenson, referred.

The noble Baronesses, Lady Jay and Lady Hamwee, stressed the importance of adequate understanding of digital issues, particularly among children. Improving digital skills is a priority of the Government’s digital strategy, published earlier this year. As noble Lords will be aware, the Digital Economy Act created a new statutory entitlement to digital skills training, which is certainly an important piece of the puzzle. As I have already said, the Government will publish a comprehensive Green Paper on internet safety imminently which will explore further how to develop children’s digital literacy and provide support for parents and carers.

The noble Baroness, Lady Ludford, and the noble Lord, Lord Paddick, I think it was, asked about the Government choosing not to exercise the derogation in Article 80 of the GDPR to allow not-for-profit organisations to take action on behalf of data subjects without their consent. This is a very important point. It is important to note that not-for-profit organisations will be able to take action on behalf of data subjects where the individuals concerned have mandated them to do so. This is an important new right for data subjects and should not be underestimated.

The noble Baroness, Lady Manningham-Buller, the noble Lords, Lord Kennedy and Lord Patel, and my noble friend Lady Neville-Jones all expressed concern about the effect that safeguards provided in the Bill might have on certain types of long-term medical research, such as clinical trials and interventional research. My noble friend pointed out that such research can lead to measures or decisions being taken about individuals but it might not be possible to seek their consent in every case. The noble Lord, Lord Patel, raised a number of related issues, including the extent of Clause 7. I assure noble Lords that the Government recognise the importance of these issues. I would be very happy to meet noble Lords and noble Baronesses to discuss them further.

The noble Baroness, Lady Ludford, and the noble Lord, Lord Patel, noted that the Bill is not going to be used to place the National Data Guardian for Health and Social Care on a statutory footing. I assure them that the Government are committed to giving the National Data Guardian statutory force. A Bill to this end was introduced in the House of Commons on 5 September by my honourable friend Peter Bone MP, and the Government look forward to working with him and parliamentary colleagues over the coming months.

My noble friend Lord Arbuthnot and others questioned the breadth of delegated powers provided for in Clause 15, which allows the Secretary of State to use regulations to permit organisations to process personal data in a wider range of circumstances where needed to comply with a legal obligation, to perform a task in the public interest or in the exercise of official authority. Given how quickly technology evolves and the use of data can change, there may be occasions when it is necessary to act relatively quickly to provide organisations with a legal basis for a particular processing operation. The Government believe that the use of regulations, rightly subject to the affirmative procedure, is entirely appropriate to achieve that. But we will of course consider very carefully any recommendations made on this or any other regulation-making power in the Bill by the Delegated Powers and Regulatory Reform Committee, and I look forward to seeing its report in due course.

The noble Viscount, Lord Colville, queried the role of the Information Commissioner in relation to special purposes processing, including in relation to journalism. In keeping with the approach taken in the 1998 Act, the Bill provides for broad exemptions when data is being processed for journalism, where the controller reasonably believes that publication is in the public interest. I reassure noble Lords that the Information Commissioner’s powers, as set out in Clause 164, are tightly focused on compliance with these requirements and not on media conduct more generally. There is a right of appeal to ensure that the commissioner’s determination can be challenged. This is an established process which the Bill simply builds upon.

The noble Lord, Lord Black, questioned the power given to the Information Commissioner to assist a party or prospective party in special purposes proceedings. In this sense, “special purposes” refers to journalistic, literary, artistic or academic purposes. The clause in question, Clause 165, replicates the existing provision in Section 53 of the 1998 Act. It simply reflects the potential public importance of a misuse of the otherwise vital exemptions granted to those processing personal data for special purposes. In practice, I am not aware of the commissioner having provided such assistance but the safeguard is rightly there.

The noble Lord, Lord Janvrin, spoke eloquently about the potential impact of the Bill on museums and archives. The Government agree about the importance of this public function. It is important to note that the Data Protection Act 1998 made no express provision relating to the processing of personal data for archiving purposes. In contrast, the Bill recognises that archives may need to process sensitive personal data, and there is a specific condition to allow for this. The Bill also provides archives with specific exemptions from certain rights of data subjects, such as rights to access and rectify data, where this would prevent them fulfilling their purposes.

The noble Lord, Lord Knight, queried the safeguards in place to prevent the mining of corporate databases for other, perhaps quite distinct, purposes, and the noble Lord, Lord Mitchell, made a similar point. I can reassure them that any use of personal data must comply with the relevant legal requirements. This would include compliance with the necessary data protection principles, including purpose limitation. These principles will be backed by tough new rules on transparency and consent that will ensure that once personal data is obtained for one purpose it cannot generally be used for other purposes without the data subject’s consent.

My noble friend Lord Marlesford raised the desirability of a central system of unique identifying numbers. The Bill will ensure that personal data is collected only for a specific purpose, that it is processed only where there is a legal basis for so doing and that it is always used proportionately. It is not clear to me that setting out to identify everybody in the same way in every context, with all records held centrally, is compatible with these principles. Rather, this Government believe that identity policy is context-specific, that people should be asked to provide only what is necessary, and that only those with a specific need to access data should be able to do so. The Bill is consistent with that vision.

I look forward to exploring all the issues that we have discussed as we move to the next stage. As the Information Commissioner said in her briefing paper, it is vital that the Bill reaches the statute book, and I look forward to working with noble Lords to achieve that as expeditiously as possible. Noble Lords will rightly want to probe the detailed provisions in the Bill and subject them to proper scrutiny, as noble Lords always do, but I am pleased that we can approach this task on the basis of a shared vision; namely, that of a world-leading Data Protection Bill that is good for business, good for the law enforcement community and good for the citizen. I commend the Bill to the House.

Bill read a second time and committed to a Committee of the Whole House.