(5 years, 5 months ago)
Lords Chamber
My Lords, I too thank the Minister for repeating the Statement. This is unfortunate to say the least, and it means these AV requirements will be put in place nearly three years after the original Digital Economy Act was passed. If the Minister does the maths, he will find it has been three years since they were incorporated into the Act.
The noble Lord, Lord Stevenson, asked all the right questions and made a comment about the professionalism of our Civil Service. But I find it staggering that, if you recall, we had exactly the same situation with the Video Recordings Act when notification did not take place. We all had to come back here and re-pass aspects of that Act because that notification had not taken place. I do not understand why that experience was not engraved on every heart in the DCMS or Home Office. I think it was a Home Office requirement at the time, but I dare say the people themselves transferred to the DCMS subsequently. In those circumstances, will compensation be available to companies that have developed age-verification solutions and gone through the voluntary certification and assessment process in anticipation of the guidance going live this July? I would expect nothing less.
During the passage of the Digital Economy Act, we on these Benches agreed in principle with the concept of age verification for pornographic sites for the purposes of child protection, but we wanted greater safeguards in the Bill in terms of third-party verification and privacy. Sadly, that did not happen. My noble friend Lord Paddick and I argued in 2017 for statutory third-party age verification and queried that last year when the regulator was nominated as the BBFC.
What is the current level of voluntary operation of age-verification methods, in response to the guidance or as an independent action? Does any site operate a voluntary age-verification process? If so, are such processes now exclusively third party, which was the essence of our original amendment and why we felt that that was an important privacy aspect? Explicitly, what will be the procedure for the re-approval of the guidance? Will it be by the negative or the affirmative procedure?
My noble friend Lord Paddick argued last year for a much greater commitment to compulsory age-appropriate sex and relationship education for all children, including telling children what they should do if they encounter online pornography. That is an important other side of the coin. What resource is devoted to this increasingly important aspect of sex education? What difference will the new DNS over HTTPS protocol make to the eventual ability of the BBFC to enforce these requirements or to force internet service providers to comply?
The Secretary of State refers in the Statement to the implementation of the online harms White Paper, which is strongly related to the age-verification agenda. The Minister knows that we have reservations about over-hasty legislation; we believe that pre-legislative scrutiny would be wise and would iron out some of the scope and definitional problems. There are conflicting views about the width of the duty of care and, on the other hand, the dangers of being over-prescriptive. There are many voices still to be heard before we can be sure that the legislation will be sound. Is not a draft Bill the way forward?
There is no reason, however, why Ofcom should not be designated early after the end of the consultation—after all, it has the clout, the technological understanding, and experience in regulating content where it converges with technology, in using enforcement and information-gathering powers and in co-operating with other regulators. It could draw up the first code of practice on online safety, mentioned in the Statement.
There is some concern that current policies are driving us into a world where age verification will be required for all kinds of access other than to pornography. That seems to be the implication of the Secretary of State’s remarks about technical challenges associated with identifying the specific age of companies’ users. Is that the intention? We need to be extremely wary of the consequences of that. That must be fully debated before we go further on age-verification requirements.
My Lords, I thank both noble Lords for their sensible comments and repeat our apology. The noble Lord, Lord Stevenson, commented on the memorial service for Lord Heywood and the quality of the Civil Service, so I agree that it is unfortunate that we are today bringing forward this Statement. I want to make it clear that Ministers, not civil servants, are responsible for the department. Both the Secretary of State and I take our responsibilities seriously. I take this opportunity to pay tribute to the civil servants—nearly all the time, though not in this case—and to say that they work extremely hard to protect children. They are absolutely committed and work flat out—I shall come to the online harms White Paper—so the responsibility lies fairly and squarely with Ministers.
The noble Lord, Lord Stevenson, asked how we learned and when. We were informed early last week—on 11 June, I think. A letter from the BBFC was written on 11 June; the Secretary of State was informed on Friday 14 June. Earlier this week he asked civil servants to tell him what the implications were and whether we could do anything to get age verification in place earlier. He then came before the other House today, as soon as possible, to apologise and explain what had happened.
The noble Lord rightly said that we have had experience of the technical standards and regulations directive. The department notified the Act but not the regulations that fell under it. Again, it was a mistake and we are making sure that it will not happen again.
In connection with making sure that it does not happen again, the noble Lord asked about “external elements”. By that, we mean that the review will include people from outside the department to make sure that there is an independent view. I cannot confirm whether it will be published—I will have to go back to the department to ask that.
As for technology, there have been delays. We need to make sure that the technology is effective and that privacy is taken into account. Obviously, the third-party age verifiers are subject to the new privacy law under the GDPR. One reason for the delay was to make sure that the additional voluntary certification scheme is up and running. I say in answer to the noble Lord, Lord Clement-Jones, that sites were expecting to have to be ready to comply with the requirement on 15 July. There has been no voluntary compliance before that; I am not surprised by that. With sites having been prepared to do it on 15 July, we would expect them to bring it in within the timescale of roughly six months—to which I shall come in a minute. We do not anticipate any compensation being paid, because sites were expected to do it on 15 July. They may have a little more time, but our intention is that they should do it as soon as possible. We will bring back the same regulations, because we have to bear in mind that this is about protecting children who accidentally stumble on pornography that they would not be able to stumble on in the offline world. We are concerned to get this in place as soon as possible, which is why we are very disappointed with our mistake.
The broader point made by both noble Lords was that this is a limited measure. We have always acknowledged that; it is for commercial sites. There are other areas in which children can come across pornography, such as social media sites, even though that is not their primary business. That is where online harms will come in. The noble Lord, Lord Clement-Jones, asked about pre-legislative scrutiny. We are very much in a cleft stick here. We of course understand the benefits of pre-legislative scrutiny, but we have to move as quickly as we can to correct some of the problems, particularly in respect of things that are already illegal such as child sexual exploitation and terrorism. However, the noble Lord would not expect me to make a commitment on that when the consultation has not even finished; no doubt, he will respond to the consultation to make his point.
The noble Lord, Lord Clement-Jones, mentioned the Video Recordings Act, where it is true that notification did not take place under the old technical standards directive. That was 25 years ago, in 1984, and it was corrected in 2010. So the noble Lord is right that there was a similar mistake 25 years ago. We will take measures to ensure that, whether in 25 years, two years, or one year, it will not happen again. I acknowledge that this has happened before, albeit some time ago.
The procedure on the guidance now is that it has to be laid before the EU for three months in draft form. If the EU makes some comments on it, it may have to stay for another month. After that period, it will have to be laid before the House, under the negative procedure, as the House has already agreed. That means we have to allow 40 days for any noble Lord to pray against it. It will take roughly six months to get through both Houses at the end of the up-to-four-month period.
There are several technical issues about the enforceability of the policy—not the policy itself. We also have to take this into account for the online harms White Paper. A suite of enforcement options is available. For example, the regulator can use payment providers and ancillary service providers to enforce the regulations, but these have to come in first and that is what we have had to delay.
(5 years, 6 months ago)
My Lords, I thank the noble Baroness for discussing this with me beforehand, which was very welcome. I agree that there may be serious consequences from DoH. The DoH protocol has been defined by the Internet Engineering Task Force. Where I do not agree with the noble Baroness is that this is not an obscure organisation; it has been the dominant internet technical standards organisation for 30-plus years and has attendees from civil society, academia and the UK Government as well as the industry. The proceedings are available online and are not restricted. It is important to know that DoH has not been rolled out yet and the picture is complex—there are pros to DoH as well as cons. We will continue to be part of these discussions; indeed, there was a meeting last week, convened by the NCSC, with DCMS and industry stakeholders present.
My Lords, the noble Baroness has raised a very important issue, and it sounds from the Minister’s Answer as though the Government are somewhat behind the curve on this. When did Ministers actually get to hear about the new encrypted DoH protocol? Does it not risk blowing a very large hole in the Government’s online safety strategy set out in the White Paper?
As I said to the noble Baroness, the Government attend the IETF. The protocol was discussed from October 2017 to October 2018, so it was during that process. As far as the online harms White Paper is concerned, the technology will potentially cause changes in enforcement by online companies, but of course it does not change the duty of care in any way. We will have to look at alternatives to the most dramatic form of enforcement, which is DNS blocking.
(5 years, 7 months ago)
My Lords, we, too, on these Benches welcome the fact that the Government’s proposals have come forward today, and we support the placing of a statutory duty of care on social media companies. We agree that the new arrangements should apply to any sites,
“that allow users to share or discover user-generated content, or interact with each other online”.
We think that is a fair definition.
We are all aware of the benefits of social media networks and the positive role they can play. There is, however, far too much illegal content and harmful activity on social media that goes undealt with by social media platforms and creates social harm. The self-harming material on Instagram and the footage of the Christchurch killings are perhaps the most recent examples.
Proper enforcement of existing laws is, of course, vital to protect users from harm, but, as the White Paper proposes, social media companies should have a statutory duty of care to their users—above all, to children and young people—and, as I say, we fully support the proposed duty of care. It follows that, through the proposed codes, Parliament and Government have an important role to play in defining that duty clearly. We cannot leave it to big private tech firms, such as Facebook and Twitter, to decide the acceptable bounds of conduct and free speech on a purely voluntary basis, as they have been doing to date.
It is good that the Government recognise the dangers that exist online and the inadequacy of current protections. However, regulation and enforcement must be based on clear evidence of well-defined harm, and must respect the rights to privacy and free expression of those who use social media legally and responsibly. I welcome the Government’s stated commitment to these two aspects.
We also very much welcome the Government’s adherence to the principle of regulating on a basis of risk and proportionality when enforcing the duty of care and drawing up the codes. Will the codes, as the Lords Communications Committee called for, when exercising powers of oversight, set out clearly the distinction between criminal, harmful and antisocial content? By the same token, upholding the right to freedom of expression does not mean a laissez-faire approach. Does the Minister agree that bullying and abuse prevent people expressing themselves freely and must be stamped out? Will there be a requirement that users must be able to report harmful or illegal content to platforms and have their reports dealt with appropriately, including being kept informed of the progress and outcome of any complaint?
Similarly, there must be transparency about the reasons for decisions and any enforcement action, whether by social media companies or regulators. Users must have the ability to challenge a platform’s decision to ban them or remove their content. We welcome the proposed three-month consultation period; indeed, I welcome the Government’s intention to achieve cross-party consensus on the crucial issue of regulating online harms. I agree that with a national consensus we could indeed play an international leadership role in this area.
Then we come to the question of the appropriate regulator to enforce this code and duty. Many of us assumed that this would naturally fall to Ofcom, with its experience and expertise, particularly in upholding freedom of speech. If it is not to be Ofcom, with all its experience, what criteria will be used in determining what new or existing body will be designated? The same appears to me to apply to the question of whether the ICO is the right regulator for the algorithms used by social media. I see that the Home Office will be drawing up certain codes. Who will be responsible for the non-criminal codes? Have the Government considered the proposals by Doteveryone and the Lords Communications Select Committee for a new “Office for Internet Safety” as an advisory body to analyse online harms, identify gaps in regulation and enforcement and recommend new regulations and powers to Parliament?
At the end of the day, regulation alone cannot address all these harms. As the noble Baroness, Lady Kidron, has said, children have the right to a childhood. Schools need to educate children about how to use social media responsibly and be safe online, as advocated by the PSHE Association and strongly supported by my party. Parents must be empowered to protect their children through digital literacy, advice and support. I very much hope that that is what is proposed by the online media literacy strategy.
At the end of the day, we all need to recognise that this kind of regulation can only do so much. We need a change of culture among the social media companies. They should be proactively seeking to prevent harm. The Government refer to a culture of continuous improvement being a desired goal. We on these Benches thoroughly agree that that is vital.
My Lords, I am very grateful for the welcome by both noble Lords for this White Paper. Nevertheless, I am not complacent; I have worked with noble Lords opposite on several big Bills on digital matters and I know there is a lot of detail that will need to be included in the legislation. However, it is good that the proposal is generally welcome in principle and that its main bones—namely, the duty of care and the independent regulator—are welcome. We have made a point of saying that we want to work on a cross-party, consensual basis and one of the reasons for having an extensive consultation is to achieve that. In some ways, this is an old-fashioned way of making legislation, to the extent that we have had a Green Paper and a consultation, then a White Paper and a consultation: we hope that a lot of the issues can be ironed out, and some of the detail. The way we worked on the Digital Economy Act and the Data Protection Act shows that we can bring in some fairly big and complicated Bills in a consensual way.
The noble Lord, Lord Griffiths, talked about children. They are very important to our thinking. We have not written a specific chapter on the subject because we want it hard-wired throughout the whole White Paper. From the day the regulator is formed, any company in scope will have to show that it is thinking about the customers and users of its products in the design of its website and products; that means it will have to, as part of its duty of care, think about the age, vulnerability and sort of people who will use it. That is built into the system.
We thought a lot about the international aspects of regulating the internet, because there is no point having a regulator or enforcement system that cannot cope with the way the internet works, which is, by definition, international. We will therefore think and consult on some of the further sanctions we could put on internet companies, such as individual liability. We might require representatives in the country in the same way as the GDPR does. Ultimately, we are consulting on whether we should take powers to block websites completely. These are, in the main, money-making organisations—Google’s second-largest advertising market is in this country, for example. The internet giants have significant economic stakes in this country, and they could be faced with a very serious penalty.
Above all, we are not expecting the internet companies, large or small, to do anything unreasonable. Some appalling things go on the internet, and the regulator will look at the duty of care—as said in the Statement—as a risk-based and proportionate approach. The big internet giants will be held to a different standard from the small start-ups.
Both noble Lords talked about the regulator. There is a possibility that an existing regulator could either take on this job or create the regulator which may be divested later. We are consulting on that, and would be interested in the views of noble Lords and other stakeholders. It is important to bear in mind that time is of the essence. We want to get on with this. We want to get it right—but we want to get a move on.
The noble Lord, Lord Clement-Jones, talked about some of the harms that are not just illegal. We absolutely agree. In some ways, the harms that are illegal are easy to deal with—they are illegal, and should be so offline as well as online—but things that are not specifically illegal, such as cyberbullying, can have a tremendous effect on people’s lives. We certainly take those into account. The internet companies will have to take a reasonable and balanced approach; they need to show that they are taking seriously harms that can really affect people’s lives, and that they are building their approach to them into the way they operate their companies. Terms and conditions should be met and abided by; there should be a proper complaints procedure, which we will demand be taken seriously, and there will be an appeals process.
The consultation actually started today. We have so far got eight responses. It will go on for three months, after which we will look at it. As I say, noble Lords are very welcome to contribute.
Finally, the noble Lord, Lord Clement-Jones, talked about a change of culture. I think the noble Lord, Lord Griffiths, implied the same thing. The point about this White Paper is that we are moving to a proactive system of regulation where we expect every company, be it large or small, to think in a proportionate way about the harms it could do and to take sensible measures not only to deal with them but to explain to the regulator what it is doing and to have transparent reporting. The regulator will be given powers to inquire of the internet companies what they are doing about these matters.
(5 years, 8 months ago)
To ask Her Majesty’s Government what consideration they have given to the standards and certifications required for the algorithms used in decision-taking by public authorities and agencies.
My Lords, last year the Government published the Data Ethics Framework, which sets out clear principles and standards for how data is used in the public sector—an important tool guiding the ethical use of algorithms and AI technologies. The Government have also recently set up the Centre for Data Ethics and Innovation, which will provide independent, expert advice on the governance of data and AI technology. The centre’s first two projects will study the use of data in shaping people’s online experiences and the potential for bias in decisions made using algorithms. This work and the centre’s future work will play a leading role in ensuring transparency and accountability in the ethical use and design of algorithms.
My Lords, some 53 local authorities and about a quarter of police authorities are now using algorithms for prediction, risk assessment and assistance in decision-making. The Centre for Data Ethics and Innovation, for all its virtues, is not a regulator. The Data Ethics Framework does not cover all aspects of algorithms. As the Minister will know, it was quite difficult finding a Minister to respond to this Question. Is it not high time that we appointed a Minister—as recommended by the Commons Science and Technology Committee—who is responsible for making sure that standards are set for algorithm use in local authorities and the public sector and that those standards enforce certain principles such as transparency, fairness, audit and explainability and set up a kitemark so that our citizens are protected?
My Lords, there was no difficulty in finding a Minister in this House: answering the noble Lord’s very sensible Question was pinned on me at a very early stage. The point about the Centre for Data Ethics and Innovation, which will publish its interim report on algorithms in the summer—relatively soon—is that it will look across the whole area and highlight what should be done in regulation terms. It will be one of the things that we expect the centre to look at, so the genuine concerns raised by the noble Lord can be considered by this forward-looking body.
(5 years, 9 months ago)
One of the things we are considering is a duty of care. That might include holding directors personally responsible. We have not decided that yet, but it is certainly an idea worth considering. As it is a White Paper that is coming out this winter, there will be a consultation on it, so we welcome views from my noble friend.
My Lords, the Law Commission, in its scoping report last November into abusive and offensive online communications, said that one of the key barriers to the pursuit of online offenders was,
“tracing and proving the identity of perpetrators, and the cost of doing so”.
I heard what the Minister said about the White Paper’s contents, but will the Government include a provision allowing the stripping of anonymity in circumstances of online crime? Have the Government had any discussions with the police or other enforcement agencies to understand the issues they face in tracking these perpetrators and bringing them to justice?
It is certainly something worth considering in the White Paper, but as far as dealing with the police is concerned, the Home Office is working with policing to identify ways to tackle this when it goes over the threshold into criminality. These are relatively new crimes; the police will have to evolve methods to deal with them. We have also worked with the office of the Director of Public Prosecutions. There is a digital intelligence investigation programme, aiming to ensure policing has the ability to investigate the digital elements of all crime types. Also, the Home Office is working with the College of Policing to drive improvements in overall police capability to investigate and prosecute online offences.
(6 years ago)
My Lords, I too thank the Minister for repeating the Statement. He was missed in the debate on Monday. I have had the benefit of reading the Government’s response to the consultation on the Centre for Data Ethics and Innovation. I share the enthusiasm for the centre’s creation, as did the Select Committee, and, now, for the clarification of the centre’s role, which will be very important in ensuring public trust in artificial intelligence. I am also enthusiastic about the appointments—described, as the noble Lord, Lord Stevenson, said, as “stellar” in the Government’s own press release. In particular, I congratulate Members of this House and especially the noble Baroness, Lady Rock, and the right reverend Prelate the Bishop of Oxford, who contributed so much to our AI Select Committee. I am sure that both will keep the flame of our conclusions alive. I am delighted that we will also see a full strategy for the centre emerging early next year.
I too have a few questions for the Minister and I suspect that, in view of the number asked by me and by the noble Lord, Lord Stevenson, he will much prefer to write. Essentially, many of them relate to the relations between the very crowded landscape of regulatory bodies and the government departments involved.
Of course, the centre is an interim body. It will eventually be statutory but, as an independent body, where will the accountability lie? To which government department or body will it be accountable? Will it produce its own ethics framework for adoption across a wide range of sectors? Will it advocate such a framework internationally, and through what channels and institutions? Who will advise the Department of Health and Social Care and the NHS on the use of health data in AI applications? Will it be the centre or the ICO, or indeed both? Will the study of bias, which has been announced by the centre, explore the development of audit mechanisms to identify and minimise bias in algorithms?
How will the centre carry out its function of advising the private sector on best practice, such as ethics codes and advisory boards? What links will there be with the Competition and Markets Authority over the question of data monopolies, which I know the Government and the CMA are both conscious of? In their consideration of data trusts, will the government Office for Artificial Intelligence, which I see will be the responsible body, also look at the benefits of and incentives for hubs of all things? These are beginning to emerge as a very important way of protecting private data.
What links will there be with other government departments in giving advice on the application of AI and the use of datasets? The noble Lord, Lord Stevenson, referred to lethal autonomous weapons, which emerged as a major issue in our debate on Monday. What kind of regular contact will there be with government departments—in particular, with the Ministry of Defence? One of the big concerns of the Select Committee was: what formal mechanisms for co-ordinating policy and action between the Office for Artificial Intelligence, the AI Council, the Centre for Data Ethics and Innovation and the ICO will there be? That needs to be resolved.
Finally, the centre will have a major role in all the above in its new studies of bias and micro-targeting, and therefore the big question is: will it be adequately resourced? What will its budget be? In the debate on Monday, I said that we need to ensure that we maintain the momentum in developing our national strategy, and this requires government to will the means.
I am tempted to say that I will write, but I will try to answer some of the questions, and I will write regarding some that I do not get around to. I was in at the beginning of the debate on AI and I listened to the noble Lord’s speech.
Not everyone would agree with that, but I did indeed listen to it. I have read that AI is a joint responsibility with BEIS, and my noble friend Lord Henley coped more than adequately, so I do not think that I really was missed.
There was a great deal of support for this innovation—the centre—both in the response to the consultation and, as the noble Lord, Lord Stevenson, said, in proceedings on the then Data Protection Bill, so I am grateful for that today, but I accept the very reasonable questions. On the centre’s independence as it stands now and its statutory establishment, I say that we have deliberately set this up as an advisory body so that it can consider some of the difficult issues that noble Lords have raised. Policy is the Government’s responsibility, so there should not be any confusion about who is held accountable for policy—and it is not the Centre for Data Ethics and Innovation. When this has been established, when we have seen how it has worked and when we have addressed the questions of the crowded space that both noble Lords mentioned, it is our intention to put this on a statutory basis. Then we will see how it has worked in practice. When it comes to putting it on a statutory basis, I have no doubt that there will be lots of back and forth in Committee and things like that on the exact definitions and its exact role.
There are some differences from the Human Fertilisation and Embryology Authority, although of course that was a particularly successful body. One of the main differences was that a lot of those things were considered in advance of the science, if you like, and before the science was put into place. With AI, it is here and now and operating, so we do not have a chance to sit back, think about it in theory and then come up with legislation or regulation. We are dealing with a moving target, so we want to get things going.
As far as I am aware—I will check and write to the noble Lord, Lord Stevenson—the centre has no specific powers to demand information. That is, of course, something that we can look at when it comes to being on a statutory basis.
I am sorry that the application for membership by the noble Lord, Lord Stevenson, was not accepted. There can be only one reason: he spends so much time on the Front Bench that he would not have time, because we expect the directors to spend two to three days a month attending this, so it is a very large work commitment.
As noble Lords will know, the work plan includes two initial projects, which were announced in last year’s Budget: micro-targeting and algorithm bias. We expect the centre, in discussion with the Secretary of State, to come up with a work plan by spring 2019. As the noble Lord, Lord Stevenson, mentioned, there is a tension, if you like, between ethics and innovation, but we are very keen that it consider both because we have to be aware of the potential for innovation, which is constrained in some cases. We would not want a situation where the opportunities for AI for this country are avoided. As the report by the noble Lord, Lord Clement-Jones, made clear, there are tremendous opportunities in this sector. We are aware of the tension, but it is a good tension for the centre to consider.
Both noble Lords talked about the crowded space in this area. We expect the centre to produce memorandums of understanding to outline how it relates to bodies such as the AI Council, which has a slightly different focus and is more about implementation of the AI sector deal than considering the ethics of artificial intelligence. We understand that they need to work together and expect the centre to come back on that.
The noble Lord, Lord Clement-Jones, asked about accountability. The centre will be accountable to the Secretary of State for the DCMS. That is clear. He will agree its work plan. Of course, in terms of independence, once he has established that work plan, what the centre says will not be up to him, so there is independence there. We included in our response that the Government will be expected to reply within six months, so there is a time limit on that. It will apply to all government departments, not just the DCMS. The Ministry of Defence and the Department of Health have obvious issues, and the centre can provide advice to them as well.
The noble Lord, Lord Clement-Jones, asked whether the centre, when it considers bias, would include audit mechanisms. It absolutely might. It is not really for us to say exactly what the centre will consider. In fact, that would be contrary to its independence, having been given the subject to think about. In our response we said some of the things that might be considered, such as audit mechanisms.
There is an obvious issue about competition, which the House of Lords Select Committee mentioned. Work is going on. The Chancellor commissioned the Furman review to look at that and we expect the centre to come up with a discussion on how it will work with the Competition and Markets Authority, but obviously competition is mainly to do with the Competition and Markets Authority.
At the moment, the body is resourced by the DCMS. In the 2017 Budget, it was provided with £9 million in funding over three years. We expect that to be sufficient but, clearly, we will have to provide adequate resources to do an adequate job.
(6 years, 1 month ago)
Lords Chamber
I have outlined that things are moving fast. The consultation finishes on 5 October. Ofcom has said it will report at the beginning of 2019. Then, as the noble Lord, Lord Griffiths, alluded to, if Ofcom decides that legislation is necessary, it is up to the business managers; we will have to look at the report. This is a complex area. The new technologies do not make it simple. It is not just like an old, linear EPG. But we understand the urgency and we know that commercial interests do make it difficult for public service broadcasters. The key is that we support public service broadcasting.
My Lords, we have heard from my noble friend and other noble Lords about the urgent need to change the EPG regulations, but is there not another aspect? The chief executive of Channel 4 has pointed out that there is no regulation at all of so-called smart voice search controls, which are increasingly being introduced by the major television manufacturers. That aspect is barely covered by the Ofcom report. Will the Minister guarantee that it will be covered in any new regulations?
I accept, as I said before, that this is a complex area. We are talking not only about linear, satellite and aggregator services, but also about TV and video delivered purely over the internet. As noble Lords will know, as well as looking at the prominence regime, we are looking at online harms generally. We expect to publish a White Paper on that in the winter.
(6 years, 4 months ago)
Lords Chamber
The noble Lord is absolutely right. That is a very good example of where this distributed technology could be used, and there are other, similar areas. One of the benefits of this technology, and the fact that it is distributed and everyone has the same copy of the database, is that it builds trust in data, and this is an important area across many departments. I do not know specifically what proofs of concept the Home Office is doing at the moment, but I will certainly take that back to my noble friend the Minister. As I said in my previous answer, there is a cross-governmental officials group and we are currently looking at how best to co-ordinate across government.
My Lords, to take the question from the noble Lord, Lord Harris, a stage further and add to the convivial atmosphere, has not the Government Digital Service fallen behind the times with the development of its Verify digital identity system? It is not regarded as fit for purpose by HMRC, for example. Should we not be creating a single online identity for citizens through distributed ledger technology?
The first question is whether we should be creating a single digital identity, and I defer to the Home Office on that. If that decision was made, whether distributed ledger technology is the right technology for it is, I think, a secondary question.
(6 years, 4 months ago)
Lords Chamber
I am very pleased to move seamlessly from the digital part of my brief to sport, and of course I agree with everything my noble friend said.
My Lords, the Minister has put a brave face on it but is it not a fact that, once the Prime Minister had ruled out membership of the digital single market in her Mansion House speech, the chances of reaching an agreement on country of origin principle with a single UK regulator were nil? Does that not mean that it is a question of when—not if—these broadcasters will move their licences, particularly as the Government can give absolutely no certainty, which is what they need?
It is a good thing that the noble Lord is not in charge of our negotiations if he goes in with that attitude. As I tried to point out, there are good reasons for us to continue with a bespoke deal that is to our mutual advantage. I pointed out the fact that our regulation is widely supported around the EU. He asked for certainty; of course there is not 100% certainty, but you never go into a negotiation with that. As we have said, we are preparing a contingency position, just in case the country of origin principle or equivalent is not negotiated.
(6 years, 6 months ago)
Lords Chamber
To ask Her Majesty's Government what assessment they have made of the United Kingdom’s ability to take advantage of the Digital Single Market and of country of origin principles for e-commerce once the United Kingdom leaves the European Union.
My Lords, I am delighted to see that, by including the phrase,
“once the United Kingdom leaves the European Union”,
in his carefully prepared Question, the noble Lord has confirmed from the Liberal Democrat Front Bench that we will be leaving the EU. The UK will not be part of the digital single market once we leave the EU. We are undertaking a comprehensive programme of analytical work looking at the implications of the UK’s exit from the EU. We are seeking input from a wide range of businesses, civil society groups and consumer bodies to inform our future trading agreement negotiations with the EU. This includes e-commerce.
My Lords, recent CEBR estimates put the value of our digital exports in the creative industries alone at £21 billion, yet as the Minister has confirmed and the Prime Minister stated at the Mansion House on 2 March—indeed, the noble Lord, Lord Callanan, repeated it last week—
“the UK will not be part of the EU’s Digital Single Market”.
The Prime Minister went on to say:
“This is a fast evolving, innovative sector, in which the UK is a world leader. So it will be particularly important to have domestic flexibility, to ensure the regulatory environment can always respond nimbly and ambitiously to new developments”.
How on earth will that protect those digital exports? Or is this just another example of the Government whistling in the dark?
My Lords, I completely agree with the noble Lord that the creative industries and digital are a very important part of our economy. We are the leaders in Europe—7.9% of our GDP is digital, with the next biggest, I think, being France, at 3.9%. We acknowledge that this has to be part of the wider negotiations on the single market. We are undertaking a great deal of analysis to make sure that we understand the implications of those negotiations.
(6 years, 7 months ago)
Lords ChamberMy Lords, having immersed myself in the subject of AI for the past year, I am absolutely clear that there is complete cross-party consensus on the potential for AI in the UK. I welcome today’s sector deal, particularly the evidence of cross-departmental working, which underlies quite a lot of the work that is beginning to take place. I very much hope that today’s sector deal is simply the tip of the iceberg of the Government’s AI policy and ambition. I note that the Minister used the word “ambition”, and I very much hope that this is but the first in a number of steps that need to be taken.
I hope we will have a much more extensive debate when the Government’s response to our Select Committee report is issued in due course, because it covers so many aspects. As I see it, today’s sector deal is essentially a nailing down of the commitments made in the industrial strategy, the proposals in the Hall-Pesenti review and the commitments made in the last Budget. I should be very interested if the Minister could unpack how much actual new money is involved in today’s sector deal, because I see it essentially as a packaging up for the sector rather than a new, dramatic development.
There are many aspects of the sector deal to welcome, not least the role of the British Business Bank in helping finance AI developers, growth companies, and so on. I hope it will be given an even more important role in the future, and I hope it will not go the way of the Green Investment Bank, which is an absolute object lesson for the Government in this respect.
The Select Committee thought that the fundamentals of government policy were right but it was a question of scale, ambition, co-ordination and drive behind the policies of the new bodies involved. There are many examples of this. The noble Lord, Lord Stevenson, rightly mentioned infrastructure investment. When only 3% of the country is covered by ultra-fast broadband, a £1 billion investment is neither here nor there. It is a bit of encouragement but it will not move us very fast up the curve compared to our international competitors. Then again, the scale of the skills gap is absolutely huge. I know that there was some negotiation as part of the Hall-Pesenti review, but 200 new PhDs in AI, as mentioned by the noble Lord, Lord Stevenson—off-the-shelf or not—being initially financed is the absolute bare minimum required.
Then again, we are heavily dependent on skilled EU workers. A Brexit brain drain is already threatening the UK tech sector, which relies heavily on foreign talent from the EU. DeepMind is already setting up a laboratory in Paris because of that. We need overseas students to stay. Will the Government reinstate post-study work visas for graduates in STEM subjects who find suitable employment within six months of graduating? The noble Lord, Lord Stevenson, mentioned a doubling of tier 1 visas. That is very welcome but why do not the Government declare, as the Select Committee suggested, a shortage occupation in tier 2 for machine learning and computer skills? That might make a huge difference. Collaborative research with EU countries is at risk as well. How will we fill the gap post 2020?
As virtually every Select Committee witness told us, creative skills will be crucial in the mix as well. What are the Government doing to emphasise not just STEM but STEAM in our schools? There is a dangerous dropping off of arts and creative subjects already. But, of course, it is not simply about the opportunities, of which there are many, but mitigating the risks as well, and making sure that we retain and build public trust in the new technologies involved. Inclusion is of crucial importance in this context. A strong inclusion and diversity agenda ran through our Select Committee report, which has been welcomed. In particular, we need more women in digital roles to help fill the skills gap. What are the Government doing to develop a culture that is inclusive, respectful and encourages women to pursue careers in AI?
Ethics must likewise be moved forward. I hope that the Government move forward quickly with this via the Centre for Data Ethics and Innovation by convening an international conference and other forms of international collaboration. I include the EU in this. Yesterday it published its report, Artificial Intelligence for Europe. In that, the role of the Charter of Fundamental Rights is highlighted as being the instrument by which one could incorporate a code of ethics. This makes the vote on Monday doubly valuable and I hope the Government will take due note. That is a very helpful way of making sure that we have an ethical framework that could cover most European countries.
I could raise many issues, not least data, which the noble Lord, Lord Stevenson, mentioned. I hope the Government will be talking to the Competition and Markets Authority about issues such as data monopolies. I hope that, as the Data Protection Bill goes through the Commons, they will look at whether we have real strength, and whether Article 22 of the GDPR really gives us sufficient rights of explainability for autonomous decision-making, as I raised in this House.
Finally, it is about ambition. If the UK wants to be seen as a world leader in any aspect of AI development, it needs to move as quickly as other countries, such as Canada and France. It must set its ambitions high to be a global player. It must welcome talent in growing its AI industry from start-ups to the next level.
My Lords, I am grateful for the many questions that I have to answer from the two noble Lords. I obviously should start by paying tribute to the committee of the noble Lord, Lord Clement-Jones. There was no reference to it in today’s Statement, and I take it as a compliment that the noble Lord, Lord Stevenson, thinks that DCMS works so quickly that we should include it in the sector deal a mere two or three weeks after it was published. I can say that we very much welcome the report. We thought it was a good piece of work and, in due course, we will provide a response. The report will help to inform actions going forward. It is important to understand that the sector deal today is only the beginning. When the noble Lord talks about the tip of the iceberg, that is very true. There are some things we intend to do, with facilities to make sure that they are monitored properly in the office of AI within the Government. I pay tribute to the noble Lord and his committee for that, and we will certainly look at that carefully.
Both noble Lords spoke of the skills gap. The noble Lord talked about Korea when referring to the 200 new PhDs, but we are not talking about North Korea; we are not just going to create 200 PhDs a year. They are proper PhDs that the Government will fund: 200 have already been financed, and there will be 450 by 2021 and 1,000 government-funded extra PhDs by 2025, with the numbers per year phased in and accelerating. They are critical for the future, but they are not the only area of skills.
Talking of skills and education, I accept, and have said before, that creativity is important. The Digital Catapult has identified the creative industries as one of the two high-profile potential areas for AI business growth in the UK. We understand that it is not simply a question of computer science, mathematics and such areas. To use the benefit of AI, we need creative minds. The businesses that already exist, where we have a leading role in the world, have absolutely accepted that. One of the points of having the AI council is that it will bring together the Government, academia and the sectors to make sure that these points are raised at the highest level.
The noble Lord, Lord Stevenson, talked in particular about digital infrastructure and the commitment to fibre to the premises. We absolutely understand that we are behind many countries in fibre-optic connectivity. What he did not say is that we are ahead of Europe in superfast broadband by a long way, but we absolutely understand that we cannot be complacent. We are moving towards fibre to the premises. That is our goal and we absolutely accept that it needs to be done.
On visas, both noble Lords said that they welcomed the doubling of exceptional talent visas. They are for exceptionally talented people. We need to come to an understanding about the need for the new rules for immigration—luckily my noble friend from the Home Office is sitting here and will be very interested in this. The noble Lord, Lord Stevenson, talked about cross-government work on this, and the noble Lord, Lord Clement-Jones, mentioned evidence. Our job is to make sure that the Home Office understands that when we come up with future Immigration Rules—we absolutely understand this is international business—we will need to have the best minds from around the world here. They will be attracted by our leading universities and the opportunities that will exist, and which this sector deal is trying to encourage.
The noble Lord, Lord Clement-Jones, talked about funding. When some of these things are mentioned, how much is actually new funding is a valid point. We have talked about just under £1 billion for this sector deal. Of this, about £600 million is new spending, and £342 million is existing spending that has either been repositioned or is in place already. Of that £600 million of new spending, about £300 million comes from the Government and, very encouragingly, £303 million from industry and the sector. For example, £35 million is from a Japanese venture capital company opening its first European HQ in the UK, £10 million is from Cambridge for the supercomputer, and there are others. About two-thirds is new money.
We absolutely accept that diversity is important, not only because it is the right thing to do, which it is, but because of all the talent we need to go forward. We have introduced the tech talent charter specifically to address that. Three weeks ago, I was at the G7 in Montreal talking about this and it resonated. In fact, we were held up in lights for it. We have 180 firms signed up and aim to have 500 by the end of the year. It is meaningful, and not just motherhood and apple pie about what we wish to do, because one of the things that firms sign up to is providing data centrally on the diversity aspects of their business so that we can compare and see that there is actual and meaningful progress. The charter will give organisations tangible actions and principles that they can adopt to become more gender-diverse.
I think that answers most of the questions. I am grateful for the broad welcome that both noble Lords have given.
(6 years, 7 months ago)
Lords Chamber
My Lords, as I mentioned in my Answer, legislation is coming. The combination of the GDPR, which comes into effect on 25 May, and the Data Protection Bill, which should be in place by then, will make a real difference. Other things need to be done. One of the biggest changes in the last few months has been the acceptance that these social platforms have some responsibility for their content. That does not mean to say that they are publishers as such but Mr Zuckerberg accepted responsibility for content on Facebook. The Prime Minister, in her Davos speech, made much the same point.
My Lords, I wonder if the Minister was as concerned as many of us by the inability of the Information Commissioner to gain access to the premises of Cambridge Analytica for five whole days. It is quite ridiculous that the commissioner should have her hands tied in this way. Will the Government pledge to give the ICO powers of entry similar to those of the competition authorities by an amendment to the Data Protection Bill?
The noble Lord makes a very valid point. We have been talking to the Information Commissioner on exactly the subject of her powers. Report stage of the Data Protection Bill comes up in the other place soon. I believe that there is widespread sympathy for her point of view, and we are looking at that. If that is the case, and if the House of Commons decides to amend the Bill, I hope that this House will give it a favourable wind when it comes back at ping-pong.
(6 years, 8 months ago)
Lords Chamber
There are mountainous parts of this country that have high-speed broadband. It is a question of getting the infrastructure in place. Broadband availability has gone up from 45% to 95% in seven years because the Government and local authorities, together with private industry, have invested a substantial amount of money.
My Lords, the Minister mentioned full-fibre networks, which could of course deliver ultra-fast broadband but only 3% of consumers have access to them. Eighteen months ago, the Chancellor promised £400 million towards full-fibre networks. How much of that has been spent and how much is expected to be spent in the coming months?
My Lords, the Chancellor announced in November that the local full-fibre network challenge fund was in place, which is part of the Government’s £740 million national productivity investment fund. As I said, the Chancellor announced in the Spring Statement that £95 million has been allocated for 13 different areas. We plan to open the next wave of the challenge fund during this summer.
(6 years, 9 months ago)
Lords Chamber
My Lords, I completely agree with my noble friend. That is why we are establishing the Centre for Data Ethics and Innovation, which will advise on the measures we need to enable and support safe, ethical and ground-breaking innovation in artificial intelligence and other data-related technologies. I remind noble Lords of this House’s Select Committee on Artificial Intelligence, chaired by the noble Lord, Lord Clement-Jones. As for where we are with the centre, the process of appointing a chair for the interim centre is under way and expressions of interest for the role are currently live. More information is available on GOV.UK.
My Lords, I thank the Minister for the earlier namecheck. Thanks to the noble Baroness, Lady Kidron, there will now be a statutory code of practice on age-appropriate website design, which will set standards required of websites on privacy for children. Will the Government make sure that young people and their parents are clearly and effectively told what these standards are at an early date? That is especially important given that the ICO’s draft children and the GDPR guidance has already been overtaken by this major amendment to the Data Protection Bill.
The noble Lord is right to mention the Kidron amendment—I think it is called that now, by universal approval—which the Government are pleased to support. It is early days, to the extent that the Data Protection Bill has not even had its Second Reading in the other place. However, the ICO is aware of what it will be required to do if this amendment remains in place and is working on that. In the meantime, it is concentrating on the GDPR coming into effect on 25 May, and the work that has to be done to get people up to speed before that date.
(6 years, 9 months ago)
Lords Chamber
My Lords, I am grateful to the noble Lords, Lord Stevenson and Lord Clement-Jones. There is a sense of déjà vu from the Digital Economy Act; we are continuing some of the discussions that we had then, and I am happy to do so. However, it is important to bear in mind what we are doing today, which is designating the BBFC. I hope we will come to other issues in the coming weeks. I will get into the definition of “soon” later.
I apologise for interrupting the Minister. Perhaps he can explain why we are not doing this all in one fell swoop. It seems rather bitty. The draft guidance seems to be on the web, and certainly it seems to be all there, so why are we not trying to deal with this in a holistic way?
The answer is that until the regulator is designated, it cannot issue guidance.
We have the government guidance that the Secretary of State has issued. The important issue, which I was going to come to in answering the noble Lord’s question, is that this is a series of steps that involves consultation and then issuing guidance. Until the regulator is designated, it cannot begin to consult or issue guidance. It is a sequential process. There is no question that we want to get on with this; we are not trying to delay it. We are conscious that this needs to be done as soon as possible, and I will come to the steps that might explain that further.
The noble Lord, Lord Clement-Jones, was asking about how the system is going to operate and the level of detail. As I said, the Secretary of State’s guidance to the regulator is there for as and when it is designated, but then the regulator is required to publish its guidance on the age-verification arrangements that it will treat as compliant. So, as I was saying, once the BBFC has been designated, that draft guidance will be laid before Parliament. The noble Lord will be able to raise his objections or queries then, when he has seen the guidance that the regulator itself has made. Until that happens, it cannot either consult or lay the guidance. Parliament can then scrutinise it. That will involve the affirmative procedure in both Houses, so that will be an appropriate point to debate the issues.
We have absolutely understood the need for things like privacy. We understand that it is important to outline those issues and priorities in the Secretary of State’s guidance to the regulator, as and when it is designated. It is then up to the regulator to get into the detail of what it will consider compliant. There is no question of it choosing one particular method; it will set criteria. There will not just be one system, for example; it will make sure that its criteria are clear in the guidance. As I say, we will have a chance to debate that.
The noble Earl, Lord Erroll, talked about when the powers are going to come into force. As I said, we want to do that as quickly as possible. In fact the current Secretary of State said it was his ambition to complete it within a year, although that is going to be difficult. We want to get it right; we want the process of consultation and guidance to be done properly. Of course, there was the small matter of purdah and an election in the way. Now, however, if this House approves the regulator today, we will be well on the way to doing that, and we are definitely trying to do it as quickly as possible.
We take data protection and privacy very seriously. The age-verification arrangements should be concerned with verifying only age, not identity; we absolutely agree with that. Providers of age-verification controls will be subject to data protection laws—the GDPR—from 25 May, and the BBFC will work with the Information Commissioner’s Office to ensure that its standards are met by age-verification providers, particularly with regard to security, data minimisation and privacy by design. So the ICO is there to uphold and enforce data protection law and the GDPR. To go further on that point, the noble Lord, Lord Clement-Jones, mentioned the relationship. The BBFC and the ICO are going to agree a memorandum of understanding to ensure and clarify how they are going to work together and separate their various responsibilities.
I know the noble Lord, Lord Stevenson, is not entirely happy with some of the arrangements; we debated some of them on the Digital Economy Bill. He also mentioned definitions and said one of the things that the regulator—that is, the BBFC if it is designated—will have to do is regulate the definition of extreme pornography that is unlawful even if it has age verification in place. That is not really the subject of debate today. Noble Lords will have an opportunity to discuss that when the regulations come—
(6 years, 9 months ago)
Lords Chamber
My Lords, just to follow on briefly, I am very pleased to see that, as in the Commons, there is a strong Welsh perspective being displayed on these matters today.
We all have a strong interest in sports betting integrity, and we had quite a debate on the issue during our discussion of the Data Protection Bill. I am pleased, therefore, to see the inclusion of UKAD in Part 3 of Schedule 6. In the Commons discussion of this order, there were some interesting debates about the inclusion of international bodies. Perhaps the Minister could slightly unpack the reason for those international bodies being included.
The last thing I want to say is that there is a distinction between Parts 2 and 3 of Schedule 6, and I wonder whether the Minister could explain why UKAD is included in Part 3 but not in Part 2. I know that the Explanatory Memorandum goes into that to some extent, but not entirely. UKAD is an enforcement body, and it seems slightly strange that it is not going to be on the face of the statutory instrument.
My Lords, I am grateful to noble Lords for those questions. I will start with an easy one, that of the noble Baroness, Lady Finlay. The reason we have not talked about football or horseracing today is that they are already on the old schedule, which includes the British Horseracing Authority, the Football Association, the Scottish, Welsh and Irish associations, and FIFA.
The noble Lords, Lord Clement-Jones and Lord Griffiths, asked why an Irish body is included. We are pleased that the UK is home to some international sports bodies and that some of the world’s greatest sports events have been held, and will continue to be held, here. Therefore, it is only right that all relevant international sports bodies, such as the Tennis Integrity Unit, the International Olympic Committee, the International Paralympic Committee and the Commonwealth Games Federation, are listed under Schedule 6. Tackling corruption and protecting the integrity of sport requires a co-ordinated approach at the domestic and international level. We must remember also that the threat faced is often cross-border in nature.
The noble Lord, Lord Clement-Jones, asked about the differences in Parts 2 and 3 of Schedule 6. To be honest, I am not sure what the answer to that is. If it is okay with him, it will be better if I write to him afterwards and get it right.
The Gambling Commission’s statutory objectives include keeping gambling fair, open and free of crime. Millions of bets are placed on sport each day and a great deal of work goes on behind the scenes to ensure that the integrity of betting on sport is maintained. Information sharing plays a central part in preventing corruption, and the order will help promote that. To support this excellent work and maintain the UK’s international standing as a leader in this field, I commend the update to Schedule 6 to the Gambling Act to the House. I am grateful for the support of noble Lords, and I hope that the House feels able to approve it.
(6 years, 9 months ago)
Lords Chamber
To ask Her Majesty’s Government what assessment they have made of the ability of United Kingdom audiovisual services to take advantage of the European Union country of origin rules after Brexit.
My Lords, the broadcasting industry has continuously emphasised the significance of maintaining the country of origin principle. We are committed to working with the sector to ensure that those points are explored and considered as the UK develops its stance on exit negotiations as part of the overall effort to secure the best deal for the UK as a whole. The effect of leaving the EU will depend on the exit negotiations.
My Lords, there are hundreds of channels based here which are broadcast to the EU and get the benefit of a single regulator in the form of Ofcom. The Creative Industries Federation states, in its report today on global trade and Brexit:
“To ensure the UK remains a leading hub for international broadcasters, the continued mutual recognition of broadcasting licences between the UK and EU Member States is imperative”.
Does the Minister agree with that statement, and will the Government treat this as a priority in trade negotiations? Is this not another example of where the straightforward solution would be to stay in the single market?
My Lords, I am very pleased to confirm to the noble Lord that we will treat this as a priority. Of course he is right that the broadcasting industries are one of the UK’s success stories. In fact, 55% of the TV channels based in the UK mainly targeted the European market in 2016, and 53% of the video-on-demand services primarily targeted the EU. It is definitely one of the top priorities of my department, and we communicate regularly with the Department for Exiting the European Union to ensure that it is one of theirs.
(6 years, 10 months ago)
Lords ChamberMy Lords, in moving that the Bill do now pass, I shall say a few words about it. The Bill has been central to my life and the lives of a number of noble Lords for many weeks now. It was accepted right from the word go as a necessary Bill, and there was almost unanimity about the importance and necessity of getting it in place by next May, taking into account that it still has to go through the other place. I am very relieved to have got to this stage. Despite that unanimity, we have managed to deal with 692 amendments during the passage of the Bill, which is a very good indication of unanimity as far as I am concerned. I have to admit that of those 692, 255 were government amendments, but that is not necessarily a bad thing. The GDPR takes effect in May and many of the things that would have been put into secondary legislation have been dealt with in the Bill. I think most noble Lords would agree that that is a good precedent. Data protection is so pervasive that the previous Data Protection Act, passed 20 years ago in 1998, is referred to around 1,000 times in other legislation, so a lot of the amendments were to make sure that when we repeal that Act and this Bill becomes law it will be consistent with other legislation.
I am very appreciative of what we achieved and the way that we did it. One thing we managed to achieve was to accept a number of recommendations from your Lordships’ House, so we changed the way that universities, schools and colleges can process personal data in respect of alumni relations; we ensured that medical researchers can process necessary personal data they need without any chilling effect; we agreed that patient support groups can process health data; we ensured a fair balance between privacy and the right to freedom of expression when journalists process personal data; and we have talked about insurers today. The noble Baroness, Lady Kidron, one of the heroes of the Bill, helped us protect children online, which we all agreed with—in the end. We amended the way that some of the delegated powers in the Bill are effective and subject to the right parliamentary oversight.
I thank the Front Benches for their co-operation. This is meant to be the last Bill for the noble Lord, Lord Stevenson. I doubt that. Every time he says that, he comes back. He had a good team to help him: the noble Lords, Lord Kennedy and Lord Griffiths of Burry Port. It was the first Bill for the noble Lord, Lord Griffiths; if he can survive this, he can survive anything. I am sure we will see a lot of him in future. I thank the noble Lords, Lord Clement-Jones and Lord Paddick. I should have mentioned the noble Baroness, Lady Hamwee, and acknowledged her position on the privilege amendment. I must say that the way she withdrew her amendments one after the other on Report is a very good precedent for other legislation that might be coming before your Lordships’ House soon.
The Bill team has been mentioned several times, not only today but all through the passage of the Bill. The members of the team have been outstanding. They have worked incredibly hard. I should like to mention Andrew Elliot, the Bill manager, Harry Burt, who worked with him, Jagdeep Sidhu and, from the Home Office, Charles Goldie. They have all done a tremendous job and been great to work with.
Lastly, I have had a galaxy of talent to help me with large parts of the Bill. My noble friends Lady Williams, Lady Chisholm and Lord Young of Cookham and my noble and learned friend Lord Keen have made my life very easy and I am very grateful to them. I beg to move.
My Lords, I will just slip in for a couple of minutes in the light of the Minister’s very shrewd appraisal of the progress on the Bill. I had not quite realised that the Bill team were treating the Digital Economy Bill as a dress rehearsal for the Data Protection Bill, but that is really why this has gone so smoothly, with very much the same cast on the Front Benches.
We on these Benches welcomed many aspects of the Bill on its introduction last October and continue to do so. Indeed, it has improved on the way through, as the Minister pointed out. I thank my noble friends Lord Paddick, Lady Hamwee, Lord McNally, Lady Ludford and Lord Storey for helping to kick the tyres on this Bill so effectively over the last four months. I also thank the noble Lord, Lord Stevenson, and all his colleagues for a generally harmonious collaboration in so many areas of common interest.
I very much thank the Minister and all his colleagues on the Front Bench and the excellent Bill team for all their responses over time to our particular issues. The Minister mentioned a number of areas that have been significant additions to the Bill. I thank the Minister for his good humour throughout, even at late hours and on many complicated areas. We are hugely pleased with the outcome obtained by the campaign of the noble Baroness, Lady Kidron, for age-appropriate design, which many of us on these Benches think is a real game-changer.
There is just a slight sting in the tail. We are less happy with a number of aspects of the Bill, such as, first, the continuing presence of exemptions in paragraph 4 of Schedule 2 for immigration control. Solicitors need the facts to be able to represent their clients, and I am afraid these immigration exceptions will deny access to justice.
Secondly, the Minister made a pretty good fist of explaining the way the new framework for government use of personal data will operate, but I am afraid, in the light of examples given, for instance by the noble Earl, Lord Clancarty, in relation to the Department for Education’s approach to the national pupil database, and now concerns over Public Health England’s release of data on 180,000 patients to a tobacco firm, that there will be continuing concerns about that framework.
Finally, one of the triumphs of debate in this House was the passing of the amendment from the noble Baroness, Lady Hollins, calling for, in effect, Leveson 2. The response of the Secretary of State, whose appointment I very much welcomed at the time, was rather churlish:
“This vote will undermine high quality journalism, fail to resolve challenges the media face and is a hammer blow to local press”.
On Sunday he did even better, saying it could be the “death knell” of democracy, which is pretty strong and unnecessary language. I very much hope that a sensible agreement to proceed is reached before we start having to play ping-pong. I am sorry to have to end on that slightly sour note, but it is an important amendment and I very much hope that it stands.
(6 years, 10 months ago)
Lords ChamberMy Lords, I turn to the new offence of reidentifying de-identified personal data. As a new clause, with no corresponding parallel in the 1998 Act, it has been a hot topic throughout the passage of the Bill and the Government welcome the insightful debates on it that took place in Committee. Those debates have influenced our thinking on aspects of the clause and I will elaborate on the amendments we have tabled in response to concerns raised by noble Lords.
By way of background, Clause 162(3) and (4) provide a number of defences for circumstances where reidentification may be lawful, including where it was necessary for the prevention or detection of crime, to comply with a legal obligation, or was otherwise justified as being in the public interest. Further defences are available where the controller responsible for de-identifying the personal data, or the data subjects themselves, consented to its reidentification.
As noble Lords will recall, concerns were raised in Committee that researchers who acted in good faith to test the robustness of an organisation’s de-identification mechanisms may not be adequately protected by the defences in the current clause. Although we continue to believe that the public interest defence would be broad enough to cover this type of activity, we recognise that the perception of a gap in the law may itself be capable of creating harm. We therefore tabled Amendments 151A, 156A and 161A to fix this. These amendments introduce a new, bespoke defence for those for whom reidentification is a product of their testing of the effectiveness of the de-identification systems used by other controllers.
A number of safeguards are included to prevent abuse. I particularly draw noble Lords’ attention to the requirement to notify either the original controller or the Information Commissioner. In addition, the researcher cannot intend to cause, or threaten to cause, damage or distress to a legal person. That means, for example, that those self-styled researchers who attempt to use their discovery to extort money from either the data controller or the data subjects they have reidentified are not protected by this new defence.
We fully appreciate the importance of the work undertaken by legitimate security researchers. I assured noble Lords in Committee that it was in no way our intention to put a halt on this activity where it is done in good faith, and the amendments I am moving today make good on that commitment. On that basis, I beg to move.
My Lords, I thank the Minister. We on these Benches had considerable activity from the academic community, security researchers and so on. I am delighted that the Minister has reflected those concerns with the new amendments.
My Lords, I echo the noble Lord’s words. We also welcome these amendments. As has been said, this issue was raised by the academic community, whose primary concern was that the way the Bill had originally been phrased would make important security research illegal and weaken data protection for everyone by that process. It would also mean that good and valid research going on in our high-quality institutions might be at risk.
I do not in any sense want to question the amendments’ approach, but I have been in further correspondence with academics who have asked us to make a few points. I am looking for a sense that the issues raised are being dealt with. Either a letter or a confirmation that these will be picked up later in the process of the Bill is all that is necessary.
First, it is fairly common-sense to say that companies probably would not be very happy if a researcher picks up that they are not doing what they say on the tin—in other words, if their claim that their data has been anonymised turns out not to be the case. Therefore, proposed new subsection (2)(b) may well be used against researchers to threaten or shut down their work. The wording refers to “distress” that might be caused, but,
“without intending to cause, or threaten to cause, damage or distress to a person”,
seems a particularly weak formulation. If it is only a question of distress, I could be distressed by something quite different from what might distress the noble Lord, who may be more robust about such matters. I think that is a point to take away.
Secondly, we still do not have, despite the way the Minister introduced the amendment, definitions in the Bill that will work in law. “Re-identification”, which is used in the description and is part of the argument around it, is still not defined. Therefore, in proposed new Clause 161A(3), as mentioned by the noble Lord who introduced the amendment, the person who,
“notified the Commissioner or the controller responsible for de-identifying the personal data about the re-identification”,
has to do this,
“without undue delay, and … where feasible, not later than 72 hours after becoming aware of it”.
That is a very tight timetable. Again, I wonder if there might be a bit more elasticity around that. It does say “where feasible”, but it puts a rather tight cordon around that.
We are trying to make it safe for researchers and data scientists to report improperly de-identified data, but in the present arrangements the responsibility for doing all this lies with the researcher. We are asking a researcher to go to court, perhaps, and defend themselves, including arguing that they have satisfied Clause 162(2)(a) and (b) and Clause 162(3)(a), (b) and (c), which is a fairly high burden. All in all, we just wonder whether how this has been framed does the trick satisfactorily. I would be grateful for further correspondence with the Minister on this point.
Finally, there is nothing in this amendment about industry. It may not be necessary but it raises a question that has been picked up by a couple of people who have corresponded with us. The burden, again, is on the researcher. Is there not also a need to try to inculcate a culture of transparency in the anonymisation processes which are being carried out in industry? In other words, if there is a duty on researchers to behave properly and do certain things at a certain time, should there not also be a parallel responsibility, for example, on companies to properly and transparently anonymise the data? If there is no duty for them to do it properly, what is in it for them? It may well be that that is just a natural aspect of the work they are doing, but maybe the Government should reflect on whether they are leaving this a little one-sided. I put that to the Minister and hope to get a response in due course.
It absolutely will not and cannot languish, because we are going to put in the Bill—so on a statutory basis—that this has to be reviewed in two years. It will not languish. As I said, if we were just going to kick it into the long grass, I would not have said what I just said, which everyone can read. We would not have put it in the Bill and made the commitments we have made tonight.
My Lords, I thank the Minister for his response and am only sorry that I, rather than the noble Lord, Lord Stevenson, have the privilege of responding. The Minister came back, I thought, very helpfully. The noble Baroness, Lady Kidron, made a superb case for these rights to be implemented earlier rather than later. If we are creating all those new rights for children under the Bill, as she says, we must have a mechanism to enforce them. I believe the Minister said that the review would be two years after the Bill comes into effect. I hope that that is an absolute—
Let us hope that that is treated as an important timetable. I was interested that the Minister expressed his sympathy—I know that that was genuine—but then went on to talk about risks and pitfalls, and very significant developments, which all sounded a bit timid. I understand that we are in relatively novel territory, but it sounded rather timid in the circumstances, especially where the rights of children are concerned.
One point the Minister did come back on was group litigation orders. Class actions are very different from the kinds of representative action that we are talking about under these amendments. For example, they would be anonymous and the consent of the data subject would not have had to be acquired, unlike with a class action. They are very different, which is worth pointing out. There are some egregious issues in terms of the use of people’s data—the Equifax case, Uber, and so on. We need to remind ourselves that these are really important data breaches and there need to be remedies available. We, on this side of the House, and those on the Benches of the noble Baroness, Lady Kidron, will be vigilant on this aspect.
The one area of clarification that I did not receive from the Minister was whether this would apply to processing of personal data that was not under the GDPR. Will it be under the applied GDPR, and would that apply?
I think it applies to the whole thing, but if I am wrong, I will certainly write to everyone who is here.
My Lords, I am grateful to all those who have participated. I take on board what the noble Lord, Lord Clement-Jones, said about our brief debate on the final day in Committee, so we can do a bit tonight. I hope that by the end I will be able to convince noble Lords that this is not quite as sinister as has been made out. I am going to duck, if I may, the argument about the affirmative procedure and whether it should be amendable, particularly given other Bills that are coming before this House soon. After all, I was only reappointed yesterday.
It is helpful to have this opportunity to further set out the purpose and operation of Clauses 175 to 178 and, in doing so, explain why the amendments in this group are unnecessary—except, of course, the government amendments. As noble Lords will now be aware, the Bill creates a comprehensive and modern scheme for data protection in the UK. No one is above the law, including the Government. That partly answers the point made by the noble Lord, Lord Clement-Jones. The Secretary of State cannot do whatever she or he wants because they are subject to the GDPR and the Bill, like everyone else. When I go further and explain the relationship between this framework and the ICO’s guidance, if it is issued, I hope that will further reassure noble Lords.
While we are on this subject, the reason the Bill uses the term “framework” is that it uses the term “code of practice” to refer to a number of documents produced by the Information Commissioner. As this document will be produced by the Government, we felt that it would be clearer not to use that term in this case. It is purely a question of naming conventions—nothing significant at all.
Inherent in the execution of the Government’s functions is a requirement to process significant volumes of personal data, whether in issuing a passport or providing information on vulnerable persons to the social services departments of local authorities. The Government recognise the strong public interest in understanding better how they process that data. The framework is therefore intended to set out the principles and processes that the Government must have regard to when processing personal data. Government departments will be required to have regard to the framework when processing personal data. This is not a novel concept. Across the country, organisations and businesses produce guidance on data processing that addresses the specific circumstances relevant to them or the sector in which they operate. This sector, or organisation-specific guidance, coexists with the overarching guidance provided by the Information Commissioner.
This framework adopts a similar approach; it is the Government producing guidance on their own processing of data. The Information Commissioner was consulted during the preparation of these clauses and will be consulted during the preparation of the framework itself to ensure that the framework complements the commissioner’s high-level national guidance when setting out more detailed provision for government.
My Lords, the Minister said that the Information Commissioner was consulted, but what was her view? Can the Minister put on record what the Information Commissioner’s view about the final architecture was? She has made it fairly clear to us that this is not satisfactory, as far as she is concerned.
When I said that she was consulted, I said what I meant. This is one of the few areas in the whole Bill, I think, where we do not have complete agreement with the Information Commissioner. I think that she is worried about complications regarding independence and the extent of her authority in this. I am not pretending that she is completely happy with this, but I hope that I will address how the two interlink and we can come back to this if the noble Lord wants. I acknowledge his point that she is not completely happy with this but, as I said before, it is one of the few areas in the whole Bill where that is the case. Certainly, we have a very good relationship with the Information Commissioner, as evidenced earlier this evening by her agreement on pay and flexibility. Importantly though, whatever she thinks of it, she will be consulted during the preparation of the framework itself to ensure that it complements the commissioner’s high-level national guidance when setting out more detailed provision for the Government.
As I explained in Committee, the Government’s view is that the framework will serve to further improve the transparency and clarity of existing government data processing. The Government can and should lead by example on data protection. Amendment 176 is designed to address concerns about the potential for confusion if the framework is produced by the Government. I respectfully suggest that these concerns are misplaced. The Secretary of State’s framework will set out principles for the specific context of data processing by government. It will, as I have set out, complement rather than supplant the commissioner’s statutory codes of practice and guidance, which will, by necessity, be high level and general as they will apply to any number of sectors and organisations.
Requiring the commissioner to dedicate time and resources to producing guidance specifically for the Government, as the noble Lord’s amendment would require, would hardly seem to be the best use of her resources. Just like a sectoral representative body, it is the Government who have the experience and knowledge to devise a framework that speaks to their own context in more specific terms.
I am sorry to keep interrupting the Minister, but is he therefore saying that the frameworks cover government and that the ICO’s codes of practice cover government as well?
Absolutely. The framework exists like other sectoral guidance that is produced, under the overarching guidance produced by the Information Commissioner. In a minute I will provide further reassurance on how the two interlink.
As I have already set out, the Government will consult the commissioner in preparing the framework. Importantly, she is free to disregard the Government’s framework wherever she considers it irrelevant or to disagree with its contents.
My Lords, we are at the last knockings on most of the Bill. It is rather ironic that one of the most important concepts that we need to establish is a new data ethics body—a new stewardship body—called for by the Government in their manifesto, by the Royal Society, by the British Academy and by many others. Many of those who gave evidence to our Select Committee want to see an overarching body of the kind that is set out, and with a code of ethics to go with it. We all heard what the Minister had to say last time; we hope that he can perhaps give us more of an update on the work being carried out in this area.
This should not be and I do not think it will be a matter of party contention; I think there will be a great deal of consensus on the need to have this kind of body, not just for the narrow field of data protection and the use of data but generally, for the wider application in the whole field, whether it is the internet of things or artificial intelligence, and so on. There is therefore a desire to see progress in fairly short order in this kind of area. One of the reasons for that is precisely because of the power of the tech majors. We want to see a much more muscular approach to the use of data by those tech majors. It is coming down the track in all sorts of different varieties. We have seen it in debates in this House; no doubt there will be a discussion tomorrow about social media platforms and their use of news and content and so on. This is therefore a live issue, and I very much hope that the Minister will be able to tell us that the new Secretary of State is dynamically taking this forward as one of the top items on his agenda.
My Lords, I can certainly confirm that the new Secretary of State is dynamic. In this group we are in danger of violently agreeing with each other. There is a definite consensus on the need for this; whether there will be consensus on the results is another matter. I agree with the analysis given by the noble Lord, Lord Stevenson, that the trouble is that to get this into the Bill, we have to concentrate on data. As the noble Lord, Lord Clement-Jones, outlined, many other things need to be included in this grouping, not least artificial intelligence.
I will briefly outline what we would like to do. For the record, we understand that the use of data and the data-enabled technologies is transforming our society at unprecedented speed. We should expect artificial intelligence and machine learning to inform ever more aspects of our life in increasingly important ways. These new advances have the potential to deliver enormous benefits to society and the economy but, as we are made aware on a daily basis—like the noble Lord, Lord Clement-Jones, I am sure that this will be raised tomorrow in the debate that we are all looking forward to on social media—they are also raising a host of new and profoundly important challenges that we need to consider. One of those challenges, and the focus of this Bill, is protecting people’s personal data—ensuring that it is collected, retained and used appropriately. However, the other challenges and opportunities raised by these technologies go far beyond that, and there are many examples that I could give.
Therefore, in the Autumn Budget the Government announced their intention to create a centre for data ethics and innovation to maximise the benefits of AI and data technologies to society and the economy, and to help identify and address the ethical challenges that they pose. The centre will advise the Government and regulators on how they can strengthen and improve the way that data and artificial intelligence are governed. It will also support the effective, innovative and ethical use of data and artificial intelligence so that we maximise the positive impact that these technologies can have on our economy and society.
We are in the process of working up the centre’s terms of reference in more detail and will consult on this soon. The issues it will consider are pressing, and we intend to set it up in an interim form as soon as possible, in parallel to this consultation. However, I fully share the noble Lord’s view that the centre, whatever its precise form, should be placed on a statutory footing, and I can commit that we will bring forward appropriate legislation to do so at the earliest opportunity. I accept the reasoning from the noble Lord, Lord Stevenson, on why this is not the appropriate place due to the limitations of this Bill, and I therefore hope that he will be able to withdraw his amendment.
(6 years, 10 months ago)
Lords ChamberWe were going to have a debate on that—I gather that the Liberal Democrats did not want to bring it forward—but the basic answer is that schools have responsibilities under the GDPR. They particularly have responsibility for personal data relating to children; they already have extensive responsibilities under the current Data Protection Act. So it is very much an issue for schools. In this case, to help them, the Department for Education is going to provide guidance—and I am assured that it will be out very soon. So they have particular responsibilities. The kind of personal data that they handle on a regular basis is very important; I believe that the noble Lord, Lord Clement-Jones, mentioned an example of some of the personal data that they hold in relation to free school meals, which has to be protected and looked after carefully. One benefit for the school system, as far as other organisations are concerned, is that they will have central guidance from the Department for Education—and I repeat that that is due to come out very soon.
I turn to Amendment 125, also proposed by my noble friend. It seeks to introduce a requirement on the Secretary of State, when making regulations under Clause 132, to consider making provision for a discounted charge—or no charge at all—to be payable by small businesses, small charities and parish councils to the Information Commissioner. Clause 132(3) already allows the Secretary of State to make provision for cases in which a discounted charge or no charge is payable. The new charge structure will take account of the need not to impose additional burdens on small businesses. This may include a provision in relation to small organisations.
I am happy to confirm that the Government have given very serious consideration to the appropriate charges for smaller businesses as part of the broader process for setting the Information Commissioner’s 2018 charges. The new charge structure will take account of the need to not impose additional burdens on small businesses. It is important to note, however, that small and medium organisations form a significant proportion of the data controllers currently registered with the ICO—approximately 99%, in fact. The process of determining a new charge structure is nearly complete and we will bring forward the resulting statutory instrument shortly. I would, however, like to put one thing on the record: in putting together that charging regime, we have been mindful of the need to ensure that the Information Commissioner is adequately resourced during this crucial transitional period, but I want to be clear that the Government do not consider the 2018 charges to be the end of the story. There may well be more we can do further down the line to modernise a regime that has not been touched for the best part of a decade.
Amendment 127 would place an obligation on the commissioner, in her annual report to Parliament, to include an economic assessment of the actions that the commissioner has taken on small businesses, charities and parish councils. I agree with my noble friend about the importance of the commissioner being aware of the impact of her approach to regulation during this crucial period. As I said to the commissioner when we met, we must nevertheless also be mindful of maintaining her independence in selecting an approach. Even if we did not think that having an independent regulator was important—I want to be clear: we do—articles 51 to 59 of the GDPR impose a series of particular requirements in that regard. But, all of the above notwithstanding, I agree with a lot of what my noble friend has said this afternoon.
Turning to Amendment 107A, in the name of the noble Lord, Lord Clement-Jones, concerning the registration of data controllers, I remember the Committee debate where the noble Lord tabled a similar amendment. I hope that I can use this opportunity to provide further reassurance that it is unnecessary. The Government replaced the existing notification system with a new system of charges payable by data controllers in the Digital Economy Act. We did this for two reasons. First, the new GDPR has done away with the need for notification. Secondly, and consequentially, we needed a replacement system to fund the important work of the Information Commissioner. All this Bill does is re-enact what was done and agreed in the Digital Economy Act last year. We legislated on this a year earlier than the GDPR would come into force because changes to fees and charges need more of a lead time to take effect. As I have already said, these new charges must be in place by the time the GDPR takes effect in May and we will shortly be laying regulations before Parliament which set those fees.
Returning to the subject matter of the amendment, under the current data protection law, notification, accompanied by a charge, is the first step to compliance. Similarly, under the new law, a charge will also need to be paid and, as under the previous law, failure to pay the charge is enforceable. We have replaced the unwieldy criminal sanction with a new penalty scheme—found in Clause 151 of the Bill.
My Lords, can the Minister explain what the trigger is for the payment of the fees?
That is not what I meant. That is not a trigger; it is notification by the data controller.
If you process and control data, you will need to make a notification to the data commissioner. I do not understand why that is not a trigger.
Exactly, so my point, which I was coming to but which the noble Lord has very carefully made for me, is that, in doing this, the Information Commissioner will obviously keep a list of the names and addresses of those people who have paid the charge. The noble Lord may even want to call that a register. The difference is that, unlike the previous register, it will not include all the details the previous one did. That level of detail was fine in 1998, and had some benefit, but the Information Commissioner found it extremely time-consuming to maintain. In addition, the information required in the existing register must, under the GDPR, now be notified to the data subjects anyway. Therefore, if the noble Lord wants to think of this list of people who have paid the charge as a register, he may feel happier.
I have talked about the penalty sanction. When the noble Lord interrupted me, I was just about to say—I will repeat it—that the commissioner will maintain a database of those who have paid the new charge, and will use the charge income to fund her operation. So what has changed? The main change is that the same benefits of the old scheme are achieved with less burden on business and less unnecessary administration for the commissioner. The current scheme is cumbersome: it demands a great deal of information from data controllers and processors, and regular updates, all of which the commissioner must then administer. It had a place in 1998, when it was introduced to support the proper implementation of data protection law in the UK. However, in the past two decades, the use of data in our society has changed dramatically. In our digital age, in which an ever-increasing amount of data is being processed, data controllers find this process unwieldy. It takes longer and longer to complete the forms, updates are needed more and more often, and the commissioner herself tells us that she has limited use for this information.
I suspect that Amendment 107A is born out of a feeling shared by many, which is to a certain extent one of confusion. I hope that with this explanation the situation is now clearer. When we lay the charges regulations shortly, it will, I hope, become clearer still. The amendment would simply create unnecessary red tape and may even be incompatible with the GDPR, as it would institute a register which the GDPR does not require. I am sure that cannot be the noble Lord’s intention. For all those reasons, I hope he will withdraw the amendment.
(6 years, 11 months ago)
Lords ChamberMy Lords, I cannot think of a better way to end our debate than with a discussion on recitals, which we have talked about a lot during the course of this Bill. I point out to both noble Lords that it was not only me who referred to recitals; they have both done so ad nauseam.
Sorry, I should have said “ad infinitum”—that is perfectly correct.
The Government do not dispute that recitals form an important part of the GDPR. As I said, we have all referred to one recital or another many times. There is nothing embarrassing or awkward about that. It is a fact of EU law that courts often require assistance in properly interpreting the articles of a directly applicable regulation—and we, as parliamentarians, need to follow that logic, too.
I would remind noble Lords that the Government have been clear that the European Union (Withdrawal) Bill will be used to deliver two things which are very important in this context. First, under Clause 3 of the withdrawal Bill, recitals of directly applicable regulations will be transferred into UK law at the same time as the articles are transferred. There is no risk of them somehow being cast adrift. Where legislation is converted under this clause, it is the text of the legislation itself which will form part of domestic legislation. This will include the full text of any EU instrument, including its recitals.
Secondly, Clause 6 of the withdrawal Bill ensures that recitals will continue to be interpreted as they were prior to the UK’s exit from the EU. They will, as before, be capable of casting light on the interpretation to be given to a legal rule, but they will not themselves have the status of a substantive legal rule. Clause 20(5) of this Bill ensures that whatever is true for the interpretation of the GDPR proper is also true for the applied GDPR.
More than 10,000 regulations are currently in force in the European Union. Some are more important than others but, however you look at it, there must be more than 100,000 recitals across the piece. The European Union (Withdrawal) Bill provides a consistent solution for every single one of them. It seems odd that we would want to use this Bill to highlight the status of 0.1% of them. Nor, as I say, is there a need to: Clause 20 already ensures that the applied GDPR will be interpreted consistently with the GDPR, which means that it will be interpreted in accordance with the GDPR’s recitals wherever relevant, both before and after exit.
There is one further risk that I must draw to the House’s attention. Recitals are not the only interpretive aid available to the courts. Other sources, such as case law or definitions of terms in other EU legislation, may also be valid depending on the circumstances. Clause 20(5) as drafted provides for all interpretive aids to the GDPR to apply to the applied GDPR. By singling out recitals the amendment could uniquely elevate their status in the context of the applied GDPR above any other similar aids. This, in turn, may cause the GDPR and applied GDPR to diverge.
The drafting of the noble Lord’s amendment is also rather perplexing. It seeks to affect only the interpretation of the applied GDPR. The applied GDPR is an important part of the Bill but it is relatively narrow in its application. I am not sure it has the importance that the noble Lord’s amendment seeks to attach to it. It is, at most, a template for what will follow post exit.
I will not stand here and say that the noble Lord’s amendment would be the end of the world. That would be disingenuous. However, it is unnecessary, it risks unintended consequences and it does not achieve what the noble Lord is, I think, attempting. For those reasons, I am afraid I am unable to support his amendment this evening and I ask him to withdraw it.
(6 years, 11 months ago)
Lords ChamberI may have to add later to what I have said, which I think the Minister will find totally unpalatable. I will try to move on.
The Minister also said:
“You are concerned that, if consent is not a genuine option in these situations and there are no specific processing conditions in the Bill to cover this on grounds of substantial public interest, processing in these circumstances would be unlawful. To make their consent GDPR compliant, an employer or school must provide a reasonable alternative that achieves the same ends, for example, offering ‘manual’ entry by way of a reception desk”.
Consent is rarely valid in an employment context. If an employer believes that certain premises require higher levels of security, and that biometric access controls are a necessary and proportionate solution, those controls cannot be optional, with less secure alternative mechanisms on offer, as that undermines the very security reasons for which the higher levels of security were needed in the first place. Examples include an employer securing a specific office where the staff work on highly sensitive or confidential matters, or securing a specific room in an office, such as a server room, to which only a small number of people can have access and where access needs to be more secure.
Biometrics are unique to each person. A pass card can easily be lost or passed to someone else. It is not feasible or practical to insist that organisations employ extra staff for each secure office or secure room to act as security guards to manually let people in.
The Minister further stated:
“You also queried whether researchers involved in improving the reliability of ID verification mechanisms would be permitted to carry on their work under the GDPR and the Bill. Article 89(1) of the GDPR provides that processing of special categories of data is permitted for scientific research purposes, providing that appropriate technical and organisational safeguards are put in place to keep the data safe. Article 89(1) is supplemented by the safeguards of clause 18 of the Bill. For the purposes of GDPR, ‘scientific research’ has a broad meaning. When taken together with the obvious possibility of consent-based research, we are confident that the Bill allows for the general type of testing you have described”.
It is good to hear that the Government interpret the research provisions as being broad enough to accommodate the research and development described. However, for organisations to use these provisions with confidence, they need to know whether the ICO and courts will take the same broad view.
There are other amendments which would broaden the understanding of the research definition, which no doubt the Minister will speak to and which the Government could support to leave no room for doubt for organisations. However, it is inaccurate to assume that all R&D will be consent based; in fact, very little of it will be. Given the need for consent to be a genuine choice to be valid, organisations can rarely rely on this as they need a minimum amount of reliable data for R&D that presents a representative sample for whatever they are doing. That is undermined by allowing individuals to opt in and out whenever they choose. In particular, for machine learning and AI, there is a danger of discrimination and bias if R&D has incomplete datasets and data that does not accurately represent the population. There have already been cases of poor facial recognition programmes in other parts of the world that do not recognise certain races because the input data did not contain sufficient samples of that particular ethnicity with which to train the model.
This is even more the case where the biometric data for research and development is for the purpose of improving systems to improve security. Those employing security and fraud prevention measures have constantly to evaluate and improve their systems to stay one step ahead of those with malicious intent. The data required for this needs to be guaranteed and not left to chance by allowing individuals to choose. The research and development to improve the system is an integral aspect of providing the system in the first place.
I hope that the Minister recognises some of those statements that he made in his letter and will be able, at least to some degree, to respond to the points that I have made. There has been some toing and froing, so I think that he is pretty well aware of the points being raised. Even if he cannot accept these amendments, I hope that he can at least indicate that biometrics is the subject of live attention within his department and that work will be ongoing to find a solution to some of the issues that I have raised. I beg to move.
My Lords, I wonder whether I might use this opportunity to ask a very short question regarding the definition of biometric data and, in doing so, support my noble friend. The definition in Clause 188 is the same as in the GDPR and includes reference to “behavioural characteristics”. It states that,
“‘biometric data’ means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of an individual, which allows or confirms the unique identification of that individual, such as facial images or dactyloscopic data”.
Well:
“There’s no art
To find the mind’s construction in the face”.
How do behavioural characteristics work in this context? The Minister may not want to reply to that now, but I would be grateful for an answer at some point.
My Lords, I thank the noble Lord, Lord Clement-Jones, for engaging constructively on this subject since we discussed it in Committee. I know that he is keen for data controllers to have clarity on the circumstances in which the processing of biometric data would be lawful. I recognise that the points he makes are of the moment: my department is aware of these issues and will keep an eye on them, even though we do not want to accept his amendments today.
To reiterate some of the points I made in my letter so generously quoted by the noble Lord, the GDPR regards biometric data as a “special category” of data due to its sensitivity. In order to process such data, a data controller must satisfy a processing condition in Article 9 of the GDPR. The most straightforward route to ensure that processing of such data is lawful is to seek the explicit consent of the data subject. However, the GDPR acknowledges that there might be occasions where consent is not possible. Schedule 1 to the Bill makes provision for a range of issues of substantial public interest: for example, paragraph 8, which permits processing necessary for purposes such as the prevention or detection of an unlawful act. My letter to noble Lords following day two in Committee went into more detail on this point.
The noble Lord covered much of what I am going to say about businesses such as banks making use of biometric identification verification mechanisms. Generally speaking, such mechanisms are offered as an alternative to more conventional forms of access, such as use of passwords, and service providers should have no difficulty in seeking the data subject’s free and informed consent, but I take the point that obtaining proper, GDPR-compliant consent is more difficult when, for example, the controller is the data subject’s employer. I have considered this issue carefully following our discussion in Committee, but I remain of the view that there is not yet a compelling case to add new exemptions for controllers who wish to process sensitive biometric data without the consent of data subjects. The Bill and the GDPR make consent pre-eminent wherever possible. If that means employers who wish to install biometric systems have to ensure that they also offer a reasonable alternative to those who do not want their biometric data to be held on file, then so be it.
There is legislative precedent for this principle. Section 26 of the Protection of Freedoms Act 2012 requires state schools to seek parental consent before processing biometric data and to provide a reasonable alternative mechanism if consent is not given or is withdrawn. I might refer the noble Lord to any number of speeches given by members of his own party—the noble Baroness, Lady Hamwee, for example—on the importance of those provisions. After all, imposing a legislative requirement for consent was a 2010 Liberal Democrat manifesto commitment. The GDPR merely extends that principle to bodies other than schools. The noble Lord might respond that his amendment’s proposed subsection (1) is intended to permit processing only in a tight set of circumstances where processing of biometric data is undertaken out of necessity. To which I would ask: when is it genuinely necessary to secure premises or authenticate individuals using biometrics, rather than just cheaper or more convenient?
We also have very significant concerns with the noble Lord’s subsections (4) and (5), which seek to drive a coach and horses through fundamental provisions of the GDPR—purpose limitation and storage limitation, in particular. The GDPR does not in fact allow member states to derogate from article 5(1)(e), so subsection (5) would represent a clear breach of European law.
For completeness, I should also mention concerns raised about whether researchers involved in improving the reliability of ID verification mechanisms would be permitted to carry on their work under the GDPR and the Bill. I reassure noble Lords, as I did in Committee, that article 89(1) of the GDPR provides that processing of special categories of data is permitted for scientific research purposes, providing appropriate technical and organisational safeguards are put in place to keep the data safe. Article 89(1) is supplemented by the safeguards in Clause 18 of the Bill. Whatever your opinion of recitals and their ultimate resting place, recital 159 is clear that the term “scientific research” should be interpreted,
“in a broad manner including for example technological development and demonstration”.
This is a fast-moving area where the use of such technology is likely to increase over the next few years, so I take the point of the noble Lord, Lord Clement-Jones, that this is an area that needs to be watched. That is partly why Clause 9(6) provides a delegated power to add further processing conditions in the substantial public interest if new technologies, or applications of existing technologies, emerge. That would allow us to make any changes that are needed in the future, following further consultation with the parties that are likely to be affected by the proposals, both data controllers and, importantly, data subjects whose sensitive personal data is at stake. For those reasons, I hope the noble Lord is persuaded that there are good reasons for not proceeding with his amendment at the moment.
The noble Baroness, Lady Hamwee, asked about behavioural issues. I had hoped that I might get some inspiration, but I fear I have not, so I will get back to her and explain all about behavioural characteristics.
My Lords, I realise that, ahead of the dinner break business, the House is agog at details of the Data Protection Bill, so I will not prolong the matter. The Minister said that things are fast-moving, but on this issue the Government seem to be moving at the pace of the slowest in the convoy. We are already here. The Minister says it is right that we should have alternatives, but for a lab that wants to use facial recognition techniques, having alternatives is just not practical. The Government are going to have to rethink this, particularly in the employment area. As more and more banks require it as part of their identification techniques, it will become of great importance.
We are just around the corner from these things, so I urge the Minister, during the passage of the Bill, to look again at whether there are at least some obvious issues that could be dealt with. I accept that some areas may be equivocal at this point, but we are not really talking about the future; we are talking about the present. I understand what the Minister says and I will read his remarks very carefully, as no doubt will the industry that increasingly uses and wants to use biometrics. In the meantime, I beg leave to withdraw the amendment.
(6 years, 11 months ago)
Grand CommitteeMy Lords, I intend to be brief. Noble Lords will recall that the Digital Economy Act 2017 received Royal Assent in April this year. That Act included reforms to the Electronic Communications Code, which provides the statutory framework for agreements between site providers and digital communications network operators.
The purpose of the reforms is to make it easier and cheaper for digital communications infrastructure to be installed and maintained, ensuring that this statutory framework supports the wider benefits of the UK’s world-leading digital communications services. The reformed code is subject to commencement by a separate statutory instrument, which will not require parliamentary scrutiny. We expect to bring the code into force by the end of December. However, before taking this step, we need to ensure that a number of sets of supporting regulations are in place.
In addition to the regulations before the Committee today, the supporting measures include two sets of regulations that were laid on 19 October 2017 under the negative procedure, which amend secondary legislation and make specific transitional provisions. Together, the purpose of all these regulations is to ensure a smooth transition from the existing legislation to the new code. They will therefore take effect only when the new code commences, which, as I mentioned, we expect to be by the end of December.
The draft Communications Act 2003 and the Digital Economy Act 2017 (Consequential Amendments to Primary Legislation) Regulations 2017 amend references in other primary legislation to the existing code and to provisions in the existing code, replacing them with terminology and cross-referencing aligned to the new code.
The draft Electronic Communications Code (Jurisdiction) Regulations 2017 bring into effect one of the code’s key reforms: transferring the jurisdiction for code disputes from the county courts to the Lands Tribunal in England and Wales, and from the sheriff court to the Lands Tribunal in Scotland. This reform was strongly recommended by the Law Commission following its consultation on the code, and is expected to ensure that code disputes can be dealt with more quickly and efficiently. The DCMS has worked closely with colleagues in the Ministry of Justice, and their counterparts in Scotland, to prepare for this change. I beg to move.
My Lords, the Minister has reminded us of our happy days during the passage of the Digital Economy Bill—now the Digital Economy Act. Of course, we all like to be reminded of our days in the salt mines. These regulations are straightforward and we welcome them. I certainly do not intend to raise again any issues relating to the Electronic Communications Code. Certainly, I would not want to provoke another speech from the noble Lord, Lord Grantchester; that would be very unwise.
However, I will make a couple of comments relating to the implementation of the code. As I understand it, Ofcom is issuing a code of practice on top of that. There is some concern that although the direction of travel of the ECC was very clear, the code of practice is in a sense bringing back a slight bias in favour of the landowners. That is a concern of some commentators. One says:
“While the consultation around the code of practice is to be welcomed, if implemented in its current form, the code of practice is in danger of swinging the pendulum back too far in favour of landowners who will be able to challenge operators at every stage”.
I know that the Government were very keen to get the balance right. It will be interesting to hear what the Minister has to say about that.
The Minister may want to write to me about this, but this is a useful opportunity to ask about the direction of government policy in terms of EU regulatory reforms—if we can bear it. It looks like there are plans from Brussels for a new Electronic Communications Code which includes e-privacy regulation. Obviously, before we exit—if we exit—it will continue to be important to keep the digital single market and the single telecoms market in place. The question arises: will there be time? Will the new Electronic Communications Code, however it is brought in—whether by directive or regulation, I am not quite sure—happen? Will it fall outside? Will it be after 29 March? Will it fall during a transition period? I suspect there are many in the telecoms field and the general area of technology infrastructure who will be extremely interested in the answer to that.
Those are the two areas on which I would very much like to have an answer from the Minister, either now or at some stage in the future.
(7 years ago)
Lords ChamberMy Lords, I am grateful to all noble Lords who have spoken on this very important clause. I agree very much with the noble Lords, Lord Clement-Jones and Lord Stevenson, that these are important issues which we need to consider. The amendments seek to amend Clause 162, which introduces the offence of re-identifying data that has been de-identified. I will start by giving some background to this clause because, as noble Lords have mentioned, this is new to data protection legislation.
Pseudonymisation of datasets is increasingly commonplace in many organisations, both large and small. This is a very welcome development: where sensitive personal data is being processed in computerised files, it is important that people know that data controllers are taking cybersecurity seriously and that their records are kept confidential. Article 32 of the GDPR actively encourages controllers to implement technical and organisational measures to ensure an appropriate level of security, including, for example, through the pseudonymisation and encryption of personal data.
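The kind of pseudonymisation that article 32 encourages can be sketched in a few lines of code. This is an illustration only, not a description of any particular controller's system; the key value, field names and record contents are all invented:

```python
import hmac
import hashlib

# Hypothetical secret key, held by the controller separately from the
# pseudonymised dataset. Without it, the tokens below cannot feasibly
# be recomputed or reversed.
SECRET_KEY = b"held-separately-by-the-data-controller"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed-hash token (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# An invented record: the stored copy keeps the sensitive attribute,
# but the direct identifiers are replaced by a token.
record = {"name": "Jane Doe", "patient_number": "943 476 5919", "diagnosis": "asthma"}
pseudonymised_record = {
    "subject_token": pseudonymise(record["patient_number"]),
    "diagnosis": record["diagnosis"],
}
```

The point of the keyed hash, as opposed to a plain one, is that the same person maps to the same token across the dataset (so records can still be analysed longitudinally), while anyone without the key cannot regenerate tokens from known identifiers to confirm who is in the data.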
As noble Lords will be aware, the rapid advancement of technology has opened many doors for innovation in these sectors. However, as we continue to be able to do more with technology, the risk of its misuse also grows. Online data hackers and scammers are a much more prominent and substantial threat than was posed in 1998, when the original Data Protection Act was passed. It is appropriate, therefore, that the Bill addresses the contemporary challenges posed by today’s digital world. This clause responds to concerns raised by the National Data Guardian for Health and Care and other stakeholders regarding the security of data kept in online files, and is particularly timely following the well-documented cyberattacks on public and private businesses over the last few years.
It is important to note that the Bill recognises that there might be legitimate reasons for re-identifying data without the consent of the controller who de-identified it. The clause includes a number of defences, as my noble friend Lady Neville-Rolfe mentioned. These can be relied on in certain circumstances, such as where re-identification is necessary for the purpose of preventing or detecting crime, to comply with a legal obligation or is otherwise necessary in the public interest. I am aware that some academic circles, including Imperial College London, have raised concerns that this clause will prohibit researchers testing the robustness of data security systems. However, I can confidently reassure noble Lords that, if such research is carried out with the consent of the controller or the data subjects, no offence is committed. Even if the research is for legitimate purposes but carried out without the consent of the controller who de-identified the data in the first place, as long as researchers act quickly and responsibly to notify the controller, or the Information Commissioner, of the breach, they will be able to rely on the public interest defences in Clause 162. Finally, it is only an offence to knowingly or recklessly re-identify data, not to accidentally re-identify it. Clause 162(1) is clear on that point.
I turn to the specific amendments that have been tabled in this group. Amendment 170CA seeks to change the wording in line 3 from “de-identified” to “anonymised”. The current clause provides a definition of de-identification which draws upon the definition of “pseudonymisation” in article 4 of the GDPR. We see no obvious benefit in switching to “anonymised”. Indeed, it may actually be more confusing, given that recital 26 appears to use the term “anonymous information” to refer to information that cannot be re-identified, whereas here we are talking about data that can be—and, indeed, has been—re-identified.
Amendment 170CB seeks to provide an exemption for re-identification for the purpose of demonstrating how the personal data can be re-identified or is vulnerable to attacks. The Bill currently provides a defence for re-identification where the activity was consented to, was necessary for the purpose of preventing or detecting crime, was justified as being in the public interest, or where the person charged reasonably believes the activity was, or would have been, consented to. So long as those re-identifying datasets can prove that their actions satisfy any of these conditions, they will not be guilty of an offence. In addition, we need to be careful here not to create defences so wide that they become open to abuse.
Amendment 170CC seeks to add to the definition of what constitutes re-identification. I agree with the noble Lord that current techniques for re-identification involve linking datasets. However, we risk making the offence too prescriptive if we start defining exactly how re-identification will occur. As noble Lords, including the noble Lord, Lord Clement-Jones, mentioned, as technology evolves and offenders find new ways to re-identify personal data, we want the offence to keep pace.
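The linkage technique referred to here can be illustrated with a minimal sketch. Both datasets, their field names and their contents are invented for the example: neither dataset identifies anyone on its own, but joining them on shared quasi-identifiers attaches a name to the sensitive attribute.

```python
# Invented de-identified dataset: no names, but quasi-identifiers remain.
deidentified = [
    {"postcode": "SW1A 1AA", "birth_year": 1956, "diagnosis": "diabetes"},
    {"postcode": "EC1A 1BB", "birth_year": 1982, "diagnosis": "asthma"},
]

# Invented public dataset (e.g. an open register) sharing those quasi-identifiers.
public_register = [
    {"name": "A. Smith", "postcode": "SW1A 1AA", "birth_year": 1956},
    {"name": "B. Jones", "postcode": "EC1A 1BB", "birth_year": 1982},
]

def link(left, right, keys):
    """Naive inner join: pair up records that agree on every quasi-identifier key."""
    return [
        {**l, **r}
        for l in left
        for r in right
        if all(l[k] == r[k] for k in keys)
    ]

# Each de-identified record now carries a name again.
reidentified = link(deidentified, public_register, ["postcode", "birth_year"])
```

This is why the clause avoids defining re-identification by any one mechanism: the join above is only the simplest case, and richer auxiliary datasets make the same attack possible with ever fewer shared attributes.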
Amendment 170E seeks to add an extra defence for persons who achieve approval for re-identifying de-identified personal data after the re-identification has taken place. The current clause provides a defence where a person acted in the reasonable belief that they would have had the consent of the controller or the data subject had they known about the circumstances of the re-identification. Retroactive approval for the re-identification could be relied on as evidence in support of that defence, so we believe that there is no need to provide a separate defence for retroactive consent.
My Lords, I think that the noble Lord is misreading the amendment. As I read my own amendment, I thought it was substitutional.
If we are talking about Amendment 170E, I am certainly prepared to look at that and address it.
That may have been the original intention, but perhaps it was never put properly into effect.
In which case, I will read Hansard, the noble Lord can do so and I am sure we will come to an arrangement. We can talk about that, if necessary.
Amendment 170F seeks to require the commissioner to produce a code of practice for the re-identification offence three months after Royal Assent. We can certainly explore with the commissioner what guidance is planned for this area and I would be happy to provide noble Lords with an update on that in due course. However, I would not like to tie the commissioner to providing guidance by a specific date on the face of the Bill. It is also worth mentioning here that, as we discussed on a previous day in Committee, the Secretary of State may by regulation require the commissioner to prepare additional codes of practice for the processing of personal data under Clause 124 and, given the issues that have been raised, we can certainly bear those powers in mind.
Finally, Amendments 170G and 170H would oblige the commissioner to set standards by which the controller is required to anonymise personal data and criminalise organisations which do not comply. I reassure noble Lords that much of this work is under way already and that the Information Commissioner’s Office has been working closely with government, data controllers and the National Cyber Security Centre to raise awareness about improving cybersecurity, including through the use of pseudonymisation of personal data.
It is important to point out that there is no weakening of the provisions contained in article 5 of the GDPR, which require organisations to ensure appropriate security of personal data. Failure to do so can, and will, be addressed by the Information Commissioner, including through the use of administrative penalties. Some have said that criminalising malicious re-identification would create complacency among data controllers. However, they still have every incentive to maintain the security of their data. Theft is a criminal offence but I still lock my door at night. In addition, I am not convinced by the mechanism the noble Lord has chosen. In particular, criminalising failure to comply with guidance would risk uncertainty and unfairness, particularly if the guidance was wrong in law in any respect.
I accept that the issues noble Lords have raised are important but I hope that, in view of these reassurances, the amendment will be withdrawn, and that the House will accept that Clause 162 should stand part of the Bill. There are reasons for wanting to bring in this measure, and I can summarise them. These were recommendations in the review of data security, consent and opt-outs by the National Data Guardian, who called for the Government to introduce stronger sanctions to protect de-identified patient data. People are generally more willing to participate in medical research projects if they know that their data will be pseudonymised and held securely, and the Wellcome Trust, for example, is supportive of the clause. I hope that those reassurances will allow the noble Lord to withdraw his amendment and enable the clause to stand part of the Bill.
My Lords, I am grateful to all noble Lords who have contributed—in particular my noble friend Lord Lucas, who was even briefer than the noble Lord, Lord Clement-Jones. He made his point very succinctly and well.
With the greatest respect to the noble Lords, Lord Stevenson and Lord Clement-Jones—and I do mean that sincerely—during our consideration in Committee of the 443 amendments, the end of which we are rapidly approaching, we have listened carefully to each other, but in this case I am afraid that we reject Amendments 184 and 185 as being unnecessary. We believe that they are not required because the Bill already provides sufficient recourse for data subjects by allowing them to give consent to a non-profit organisation to represent their interests.
Clause 173, in conjunction with article 80(1) of the GDPR, provides data subjects with the right to authorise a non-profit organisation which has statutory objectives in the public interest and which is active in the field of data protection to exercise the rights described in Clauses 156 to 160 of the Bill. Taken together with existing provision for collective redress, and the ability of individuals and organisations to independently complain to the Information Commissioner where they have concerns, groups of like-minded data subjects will have a variety of redress mechanisms from which to choose. It is not true that large numbers of data subjects are unable, or too ignorant of their rights, to combine. For example, it is worth noting that more than 5,000 data subjects have brought one such action which is currently proceeding through the courts.
Furthermore, we would argue that the amendment is premature. If we were to make provision for article 80(2), it would be imperative to analyse the effectiveness not only of Clause 173 and article 80(1) of the GDPR but of other similar provisions in UK law to ensure that they are operating in the interests of data subjects and not third parties. We would also need to assess, for example, how effective the existing law has been in dealing with issues such as aggregate damages, which cases brought under article 80(2) might be subject to.
More generally, the Bill seeks to empower data subjects and ensure that they receive the information they need to enforce their own rights, with assistance from non-profit organisations if they wish. The solution to a perceived lack of data subject engagement cannot be to cut them out of the enforcement process as well. Indeed, there is a real irony here. Let us consider briefly a claim against a controller who should have sought, but failed to get, proper consent for their processing. Are noble Lords really suggesting that an unrelated third party should be able to enforce a claim for not having sought consent without first seeking that same consent?
We should also remember that these not-for-profit organisations are active in the field of data subjects’ rights; indeed, the GDPR states that they have to be. While many—the noble Lord, Lord Clement-Jones, mentioned Which?—will no doubt have data subjects’ true interests at heart and will be acting in those best interests, others will have a professional interest in achieving a different outcome: raising their own profile, for example.
I know that these amendments are well intentioned and I do have some sympathy with the ambition of facilitating greater private enforcement to complement the work of the Information Commissioner. But, for the reasons I have set out, I am not convinced that they are the right solution to the problems identified by noble Lords, and I therefore urge the noble Lord to withdraw his amendment.
My Lords, I am baffled by the Minister’s response. The Government have taken on board huge swathes of the GDPR; in fact, they extol the virtues of the GDPR, which is coming into effect, as are many of its articles. Yet they are baulking at a very clear statement in article 80(2), which could not be clearer. Their prevarication is extravagant.
The noble Lord will admit that the GDPR allows member states to do that; otherwise, it would have been made compulsory in the GDPR. The derogations are there to allow member states to decide whether or not to do it.
To summarise, we have chosen not to adopt article 80(2) because the Bill is based on the premise of getting consent—but these amendments are saying that, regardless of what the data subject wants or whether they have given consent, other organisations should be able to act on their behalf without their consent. That is the Government’s position and I hope that noble Lords will feel able not to press their amendments.
My Lords, government Amendments 185A, 185B, 185C and 185D add four fairly substantial new clauses to the Bill on the last day of Committee. I can see the point made by the Minister when he moved the amendments, but it is disappointing that they were not included right at the start. Have the Government just thought about them as a good thing?
The Delegated Powers and Regulatory Reform Committee has not had time to look at these matters. I note that in Amendment 185A, the Government suggest that regulations be approved by Parliament under the negative procedure. I will look very carefully at anything that the committee wants to bring to the attention of the House when we look at these matters again on Report. I am sure the committee will have reported by then.
I will not oppose the amendments today, but that is not to say that I will not move some amendments on Report—particularly if the committee draws these matters to the House’s attention.
My Lords, I want to echo that point. There is time for reflection on this set of amendments and I sympathise with what the noble Lord, Lord Kennedy, said.
My Lords, I am grateful for those comments. We understand that the DPRRC will have to look at the powers under the clause. As usual, as we have done already, we take great note of what the committee says; no doubt it will opine soon. We will pay attention to that.
(7 years ago)
Lords Chamber
My Lords, I am very grateful to the noble Lord, Lord Stevenson, for tabling this amendment, which allows us to return to our discussions on data ethics, which were unfortunately curtailed on the last occasion. The noble Lord invited me to give him a few choice words to summarise his amendments. I can think of a few choice words for some of his other amendments, but today I agree with a lot of the sentiment behind this one. It is useful to discuss this very important issue, and I am sure we will return to it. The noble Lord, Lord Puttnam, brought the 1931 Highway Code into the discussion, which was apposite, as I think the present Highway Code is about to have a rewrite due to autonomous vehicles—it is absolutely right, as he mentioned, that these codes have to be future-proofed. If there is one thing we are certain of, it is that these issues are changing almost by the day and the week.
The noble Lord, Lord Stevenson, has rightly highlighted a number of times during our consideration of the Bill that the key issue is the need for trust between individuals and data controllers. If there is no trust in what is set up under the Bill, then there will not be any buy-in from the general public. The noble Lord is absolutely right on that. That is why the Government are committed to setting up an expert advisory body on data ethics. The noble Lord mentioned the HFEA and the Committee on Climate Change, which are interesting prior examples that we are considering. I mentioned during our last discussion that the Secretary of State was personally leading on this important matter. He is committed to ensuring that just such a body is set up, and in a timely manner.
However, although I agree with and share the intentions that the noble Lord has expressed through this amendment, which other noble Lords have agreed with, I cannot agree with the mechanism through which he has chosen to express them. When we previously debated this topic, I was clear that we needed to draw the line between the function of an advisory ethics body and the Information Commissioner. The proposed ethics code in this amendment is again straddling this boundary.
Our new data protection law as found in this Bill and the GDPR will already require data controllers to do many of the things found in this amendment. Securing personal data, transparency of processing, clear consent, and lawful sharing and use are all matters set out in the new law. The commissioner will produce guidance, for that is already one of her statutory functions and, where the law is broken, the commissioner will be well equipped with enforcement powers. The law will be clear in this area, so all this amendment will do is add a layer of complexity.
The Information Commissioner’s remit is to provide expert advice on applying data protection law. She is not a moral philosopher. It is not her role to consider whether data processing is addressing inequalities in society or whether there are public benefits in data processing. Her role is to help us comply with the law, to regulate its operation, which involves fairly handling complaints from data subjects about the processing of their personal data by controllers and processors, and to penalise those found to be in breach. The amendment that the noble Lord has tabled would extend the commissioner’s remit far beyond what is required of her as a UK supervisory authority for data protection and, given the breadth of the code set out in his amendment, would essentially require the commissioner to become a regulator on a much more significant scale than at present.
This amendment would stretch the commissioner’s resources and divert from her core functions. We need to examine the ethics of how data is used, not just personal data. However, the priority for the commissioner is helping us to implement the new law to ensure that the UK has in place the comprehensive data protection regime that we need and to help to prepare the UK for our exit from the EU. These are massive tasks and we must not distract the commissioner from them.
There is of course a future role for the commissioner to work in partnership with the new expert group on ethics that we are creating. We will explore that further once we set out our plans shortly. It is also worth noting that the Bill is equipped to future-proof the commissioner to take on this role: under Clause 124, the Secretary of State may by regulation require the commissioner to produce appropriate codes of practice. While the amendment has an arbitrary shopping list, much of which the commissioner is tasked with already, the Bill allows for a targeted code to be developed as and when the need arises.
The Government recognise the need for further credible and expert advice on the broader issues of the ethical use of data. As I mentioned last week, it is important that the new advisory body has a clearly defined role focused on the ethics of data use and gaps in the regulatory landscape. The body will as a matter of necessity have strong relationships with the Information Commissioner and other bodies that have a role in this space. For the moment, with that in mind, I would be grateful if the noble Lord withdrew his amendment. As I say, we absolutely understand the reasons behind it and we have taken on board the views of all noble Lords in this debate.
My Lords, do the Minister or the Government yet have a clear idea of whether the power in the Bill to draw up a code will be invoked, or whether there will be some other mechanism?
At the moment, I do not think there is any anticipation of using that power in the near future, but it is there if necessary in the light of the broader discussions on data ethics.
So the Minister believes it is going to be the specially set-up data ethics body, not the powers under the Bill, that would actually do that?
I do not want to be prescriptive on this because the data ethics body has not been set up. We know where we think it is going, but it is still to be announced and the Secretary of State is working on this. The legal powers are in the Bill, and the data ethics body is more likely to be an advisory body.
(7 years ago)
Lords Chamber
This was included in the letter I was sent today. I am afraid the noble Lord has not got it. The noble Lord, Lord Kennedy, helpfully withdrew his amendment before I was able to say anything the other night but the EU withdrawal Bill will convert the full text of direct EU instruments into UK law. This includes recitals, which will retain their status as an interpretive aid.
My Lords, we will see if the EU withdrawal Bill gets passed, but that is a matter for another day.
I thank the Minister for his remarks. There are many aspects of his reply which Members around the House will wish to unpick.
Perhaps I may pursue this for a second. It is late in the evening and I am not moving fast enough in my brain, but the recitals have been discussed time and again and it is great that we are now getting a narrow understanding of where they go. I thought we were transposing the GDPR, after 20 May and after Brexit, through Schedule 6. However, Schedule 6 does not mention the recitals, so if the Minister can explain how this magic translation will happen I will be very grateful.
I knew I was slow. We are moving to applied GDPR; that is correct. The applied GDPR, as I read it in the book—that great wonderful dossier that I have forgotten to table; I am sure the box can supply it when we need it—does not contain the recitals.
My Lords, just to heap Pelion on Ossa, I assume that until 29 March the recitals are not part of UK law.
They will be part of UK law, because the withdrawal Bill will convert the full text into UK law. There will of course be a difference between the recitals and the articles; it will be like a statutory instrument, where the Explanatory Memorandum is part of the text of the instrument.
May I add to this fascinating debate? Does this not illustrate one of the problems of the withdrawal Bill—that in many areas, of which this is one, there will be two potentially conflicting sources of English law? There will be this Act, on data protection, and the direct implementation through the EU withdrawal Bill on the same subject. The two may conflict because this Act will not contain the recitals.
My Lords, all I can say is that I do not know how the legal profession will cope in the circumstances.
One thing we can all be certain of is that the legal profession will cope.
(7 years ago)
Lords Chamber
My Lords, the noble Lord, Lord Stevenson, has raised some important points, which refer back to our labour over the Digital Economy Bill. One particular point occurs to me in relation to the questions that he asked: have we made any progress towards anonymisation in age verification, as we debated at some length during the passage of that Bill? As I recall, the Government’s point was that they did not think it necessary to include anything in the Bill because anonymisation would happen. The Minister should engage with that important issue. The other point that could be made is about whether the Government believe that the amendment of the noble Lord, Lord Lucas, would help us towards that goal.
My Lords, as we have heard, Part 3 of the Digital Economy Act 2017 requires online providers of pornographic material on a commercial basis to institute appropriate age verification controls. My noble friend’s Amendment 71ZA seeks to allow the age verification regulator to publish regulations relating to the protection of personal data processed for that purpose. The amendment aims to provide protection, choice and trust in respect of personal data processed for the purpose of compliance with Part 3 of the 2017 Act.
I think that I understand my noble friend’s aim. It is a concern I remember well from this House’s extensive deliberations on what became the Digital Economy Act, as referred to earlier. We now have before us a Bill for a new legal framework which is designed to ensure that protection, choice and trust are embedded in all data-processing practices, with stronger sanctions for malpractice. This partly answers my noble friend Lord Elton, who asked what we would produce to deal with this problem.
Personal data, particularly those concerning a data subject’s sex life or sexual orientation, as may be the case here, will be subject to rigorous new protections. For the reasons I have just mentioned, the Government do not consider it necessary to provide for separate standards relating exclusively and narrowly to age verification in the context of accessing online pornography. That is not to say that there will be a lack of guidance to firms subject to Part 3 of the 2017 Act on how best to implement their obligations. In particular, the age verification regulator is required to publish guidance about the types of arrangements for making pornographic material available that the regulator will treat as compliant.
As noble Lords will be aware, the British Board of Film Classification is the intended age verification regulator. I reassure noble Lords that in its preparations for taking on the role of age verification regulator, the BBFC has indicated that it will ensure that the guidance it issues promotes the highest data protection standards. As part of this, it has held regular discussions with the Information Commissioner’s Office and it will flag up any potential data protection concerns to that office. It will then be for the Information Commissioner to determine whether action or further investigation is needed, as is her role.
The noble Lord, Lord Clement-Jones, talked about anonymisation and the noble Lord, Lord Stevenson, asked for an update of where we actually were. I remember the discussions on anonymisation, which is an important issue. I do not have the details of exactly where we have got to on that subject—so, if it is okay, I will write to the noble Lord on that.
I can update the noble Lord, Lord Stevenson, to a certain extent. As I just said, the BBFC is in discussion with the Information Commissioner’s Office to ensure that best practice is observed. Age verification controls are already in place in other areas of internet content access; for example, licensed gambling sites are required to have them in place. They are also in place for UK-based video-on-demand services. The BBFC will be able to learn from how these operate, to ensure that effective systems are created—but the age verification regulator will not be endorsing a list of age verification technology providers. Rather, the regulator will be responsible for setting guidance and standards on robust age verification checks.
We continue to work with the BBFC in its engagement with the industry to establish the best technological solutions, which must be compliant with data protection law. We are aware that such solutions exist, focusing rightly on verification rather than identification—which I think was the point made by the noble Lord, Lord Clement-Jones. If I can provide any more detail in the follow-up letter that I send after each day of Committee, I will do so—but that is the general background.
Online age verification is a rapidly growing area and there will be much innovation and development in this field. Industry is rightly putting data privacy and security at the forefront of its design, and this will be underscored by the new requirements under the GDPR. In view of that explanation, I hope that my noble friend will be able to withdraw his amendment.
My Lords, I thank the noble Lord, Lord Clement-Jones, who introduced this interesting debate; of course, I recognise his authority and his newfound expertise in artificial intelligence from being chairman of the Select Committee on Artificial Intelligence. I am sure that he is an expert anyway, but it will only increase his expertise. I thank other noble Lords for their contributions, which raise important issues about the increasing use of automated decision-making, particularly in the online world. It is a broad category, including everything from personalised music playlists to quotes for home insurance and far beyond that.
The noble Lord, Lord Stevenson, before speaking to his amendments, warned about some of the things that we need to think about. He contrasted the position on human embryology and fertility research and the HFEA, which is not exactly parallel because, of course, the genie is out of the bottle in that respect, and things were prevented from happening at least until the matter was debated. But I take what the noble Lord said and agree with the issues that he raised. I think that we will discuss in a later group some of the ideas about how we debate those broader issues.
The noble Baroness, Lady Jones, talked about how she hoped that the repressive bits would be removed from the Bill. I did not completely understand her point, as this Bill is actually about giving data subjects increased rights, both in the GDPR and the law enforcement directive. That will take direct effect, but we are also applying those GDPR rights to other areas not subject to EU jurisdiction. I shall come on to her amendment on the Human Rights Act in a minute—but we agree with her that human beings should be involved in significant decisions. That is exactly what the Bill tries to do. We realise that data subjects should have rights when they are confronted by significant decisions made about them by machines.
The Bill recognises the need to ensure that such processing is correctly regulated. That is why it includes safeguards, such as the right to be informed of automated processing as soon as reasonably practicable and the right to challenge an automated decision made by the controller. The noble Lord, Lord Clement-Jones, alluded to some of these things. We believe that Clauses 13, 47, 48, 94 and 95 provide adequate and proportionate safeguards to protect data subjects of all ages, adults as well as children. I can give some more examples, because it is important to recognise data rights. For example, Clause 47 is clear that individuals should not be subject to a decision based solely on automated processing if that decision significantly and adversely impacts on them, either legally or otherwise, unless required by law. If that decision is required by law, Clause 48 specifies the safeguards that controllers should apply to ensure the impact on the individual is minimised. Critically, that includes informing the data subject that a decision has been taken and providing them with 21 days within which to ask the controller to reconsider the decision or retake the decision with human intervention.
I turn to Amendments 74, 134 and 136, proposed by the noble Lord, Lord Clement-Jones, which seek to insert into Parts 2 and 3 of the Bill a definition of the term,
“based solely on automated processing”,
to provide that human intervention must be meaningful. I do not disagree with the meaning of the phrase put forward by the noble Lord. Indeed, I think that is precisely the meaning that the phrase already has. The test here is what type of processing the decision, having legal or significant effects, is based on. Mere human presence or token human involvement will not be enough. The purported human involvement has to be meaningful; it has to address the basis for the decision. If a decision was based solely on automated processing, it could not have meaningful input by a natural person. On that basis, I am confident that there is no need to amend the Bill to clarify this definition further.
In relation to Amendments 74A and 133A, the intention here seems to be to prevent any automated decision-making that impacts on a child. By and large, the provisions of the GDPR and of the Bill, Clause 8 aside, apply equally to all data subjects, regardless of age. We are not persuaded of the case for different treatment here. The important point is that the stringent safeguards in the Bill apply equally to all ages. It seems odd to suggest that the NHS could, at some future point, use automated decision-making, with appropriate safeguards, to decide on the eligibility for a particular vaccine—
My Lords, I hesitate to interrupt the Minister, but it is written down in the recital that such a measure,
“should not concern a child”.
The whole of that recital is to do with automated processing, as it is called in the recital. The interpretation of that recital is going to be rather important.
My Lords, I was coming to recital 71. In the example I gave, it seems odd to suggest that the NHS could at some future point use automated decision-making with appropriate safeguards to decide on the eligibility for a particular vaccine of an 82 year-old, but not a two year-old.
The noble Lord referred to the rather odd wording of recital 71. On this point, we agree with the Article 29 working party—the group of European regulators—that it should be read as discouraging as a matter of best practice automated decision-making with significant effects on children. However, as I have already said, there can and will be cases where it is appropriate, and the Bill rightly makes provision for those.
Would the Minister like to give chapter and verse on how that distinction is made?
I think that “chapter and verse” implies “written”—and I will certainly do that because it is important to write to all noble Lords who have participated in this debate. As we have found in many of these areas, we need to get these things right. If I am to provide clarification, I will want to check—so I will take that back.
I apologise for interrupting again. This is a bit like a dialogue, in a funny sort of way. If the Minister’s notes do not refer to the Article 29 working party, and whether or not we will continue to take guidance from it, could he include that in his letter as well?
I will. I had some inspiration from elsewhere on that very subject—but it was then withdrawn, so I will take up the offer to write on that. However, I take the noble Lord’s point.
We do not think that Amendment 75 would work. It seeks to prevent any decision being taken on the basis of automated decision-making where the decision would “engage” the rights of the data subject under the Human Rights Act. Arguably, such a provision would wholly negate the provisions in respect of automated decision-making as it would be possible to argue that any decision based on automated decision-making at the very least engaged the data subject’s right to have their private life respected under Article 8 of the European Convention on Human Rights, even if it was entirely lawful. All decisions relating to the processing of personal data engage an individual’s human rights, so it would not be appropriate to exclude automated decisions on this basis. The purpose of the Bill is to ensure that we reflect processing in the digital age—and that includes automated processing. This will often be a legitimate form of processing, but it is right that the Bill should recognise the additional sensitivities that surround it. There must be sufficient checks and balances and the Bill achieves this in Clauses 13 and 48 by ensuring appropriate notification requirements and the right to have a decision reassessed by non-automated means.
I highlight that we do not disagree with that. I will study carefully what my noble friend Lord Lucas said. We agree that it is important that privacy rights continue to be protected, and we do not expect data subjects to have their lives run by computer alone. That is exactly why the Bill creates safeguards: to make sure that individuals can request not to be the subject of decisions made automatically if it might have a significant legal effect on them. They are also allowed to demand that a human being participate meaningfully in those decisions that affect them. I will look at what my noble friend said and include that in my write-round. However, as I said, we do not disagree with that. The illusion that we have got to a stage where our lives will be run unaccountably by computers is exactly what the Bill is trying to prevent.
My Lords, I would not want to give that impression. None of us are gloom merchants in this respect. We want to be able to harness the new technology in a way that is appropriate and beneficial for us, and we do that by setting the right framework in data protection, ethical behaviour and so on.
I am grateful to the Minister for engaging in the way he has on the amendments. It is extremely important to probe each of those areas of Clauses 13, 47 and 48. For instance, there are lacunae. The Minister talked about the right to be informed and the right to challenge, and so on, and said that these provided adequate and proportional safeguards, but the right to explanation is not absolutely enshrined, even though it is mentioned in the GDPR. So in some areas we will probe on that.
Yes, my Lords, but it is in the recital, so I think we come back again to whether the recitals form part of the Bill. That is what I believe to be the case. I may have to write to the Minister. Who knows? Anything is possible.
One of the key points—raised by the noble Lord, Lord Lucas—is the question of human intervention being meaningful. To me, “solely”, in the ordinary meaning of the word, does not mean that human intervention is there at all, and that is a real worry. The writ of the Article 29 working party may run until Brexit but, frankly, after Brexit we will not be part of the Article 29 working party, so what interpretation of the GDPR will we have when it is incorporated into UK domestic law? If those rights are not to be granted, the interpretation of “solely” with the absolute requirement of human involvement needs to be on the face of the Bill.
As far as recital 71 is concerned, I think that the Minister will write with his interpretation and about the impact of the Article 29 working party and whether we incorporate its views. If the Government are not prepared to accept that the rulings of the European Court of Justice will be effective in UK law after Brexit, I can only assume that the Article 29 working party will have no more impact. Therefore, there is a real issue there.
I take the Minister’s point about safeguards under the Equality Act. That is important and there are other aspects that we will no doubt wish to look at very carefully. I was not overly convinced by his answer to Amendment 75, spoken to by the noble Baroness, Lady Jones, and my noble friend Lady Hamwee, because he said, “Well, it’s all there anyway”. I do not think we would have had to incorporate those words unless we felt there was a gap in the way the clause operated.
I will not take the arguments any further but I am not quite as optimistic as the Minister about the impact of that part of the Bill, and we may well come back to various forms of this subject on Report. However, it would be helpful if the Minister indicated the guidance the ICO is adopting in respect of the issue raised in Amendment 153A. When he writes, perhaps he could direct us to those aspects of the guidance that will be applicable in order to help us decide whether to come back to Amendment 153A. In the meantime, I beg leave to withdraw.
My Lords, the noble Lord, Lord Stevenson, has raised the important issue of data ethics. I am grateful to everyone who has spoken on this issue tonight and has agreed that it is very important. I assure noble Lords that we agree with that. We had a debate the other day on this issue and I am sure we will have many more in the future. The noble Lord, Lord Puttnam, has been to see me to talk about this, and I tried to convince him then that we were taking it seriously. By the sound of it, I am not sure that I completely succeeded, but we are. We understand the points he makes, although I am possibly not as gloomy about things as he is.
We are fortunate in the UK to have the widely respected Information Commissioner to provide expert advice on data protection issues—I accept that that advice is just on data protection issues—but we recognise the need for further credible and expert advice on the broader issue of the ethical use of data. That is exactly why we committed to setting up an expert advisory data ethics body in the 2017 manifesto, which, I am glad to hear, the noble Lord, Lord Clement-Jones, read carefully.
We like to hold the Government to their manifesto commitments occasionally.
Tonight the noble Lord can because the Secretary of State is leading on this important matter. She is as committed as I am to ensuring that such a body is set up shortly. She has been consulting widely with civil society groups, industry and academia, some of which has been mentioned tonight, to refine the scope and functions of the body. It will work closely with the Information Commissioner and other regulators. As the noble Lords, Lord Clement-Jones and Lord Patel, mentioned, it will identify gaps in the regulatory landscape and provide Ministers with advice on addressing those gaps.
It is important that the new advisory body has a clearly defined role and a strong relationship to other bodies in this space, including the Information Commissioner. The Government’s proposals are for an advisory body which may have a broader remit than that suggested in the amendment. It will provide recommendations on the ethics of data use in gaps in the regulatory landscape, as I have just said. For example, one fruitful area could be the ethics of exploiting aggregated anonymised datasets for social and commercial benefit, taking into account the importance of transparency and accountability. These aggregated datasets do not fall under the legal definition of personal data and would therefore be outside the scope of both the body proposed by the noble Lord and, I suspect, this Bill.
Technically, Amendment 78 needs to be more carefully drafted to avoid the risk of non-compliance with the GDPR and avoid conflict with the Information Commissioner. Article 51 of the GDPR requires each member state to appoint one or more independent public authorities to monitor and enforce the GDPR on its territory as a supervisory authority. Clause 113 makes the Information Commissioner the UK’s sole supervisory authority for data protection. The functions of any advisory data ethics body must not cut across the Information Commissioner’s performance of its functions under the GDPR.
The amendment proposes that the advisory board should,
“monitor further technical advances in the use and management of personal data”.
But one of the Information Commissioner’s key functions is to
“keep abreast of evolving technology”.
That is a potential conflict we must avoid. The noble Lord, Lord Patel, alluded to some of the conflicts.
Nevertheless, I agree with the importance that noble Lords place on the consideration of the ethics of data use, and I repeat that the Government are determined to make progress in this area. However, as I explained, I cannot agree to Amendment 78 tonight. Therefore, in the light of my explanation, I hope the noble Lord will feel able to withdraw it.
(7 years ago)
Lords Chamber
My Lords, I am grateful to all noble Lords who have spoken and for the opportunity to speak to Schedule 1 in relation to an industry in which I spent many years. I accept many of the things that the noble Earl, Lord Kinnoull, described and completely understand many of his points—and, indeed, many of the points that other noble Lords have made. As the noble Lord, Lord Clement-Jones, said, I have taken the noble Earl’s examples to heart, and I absolutely accept the importance of the insurance industry. The Government have worked with the Association of British Insurers and others to ensure that the Bill strikes the right balance between safeguarding the rights of data subjects and processing data without consent when necessary for carrying on insurance business—and a balance it must be. The noble Lord, Lord Stevenson, alluded to some of those issues when he took us away from the technical detail of his amendment to a higher plane, as always.
The noble Earl, Lord Kinnoull, and the noble Lords, Lord Clement-Jones and Lord Stevenson, have proposed Amendments 45B, 46A, 47, 47A, 48A and 50A, which would amend or replace paragraphs 14 and 15 of Schedule 1, relating to insurance. These amendments would have the effect of providing a broad basis for processing sensitive types of personal data for insurance-related purposes. Amendment 45B, in particular, would replace the current processing conditions for insurance business set out in paragraphs 14 and 15 with a broad condition covering the arrangement, underwriting, performance or administration of a contract of insurance or reinsurance, but the amendment does not provide any safeguards for the data subject.
Amendment 47 would amend the processing condition relating to processing for insurance purposes in paragraph 14. This processing condition was imported from paragraph 5 of the 2000 order made under the Data Protection Act 1998. Removal of the term might lessen the safeguards for data subjects, because insurers could potentially rely on the provisions even where it was reasonable to obtain consent. I shall come to the opinions of the noble Earl, Lord Erroll, on consent in a minute.
Amendments 46A, 47A, 48A and 50A are less sweeping, but would also remove safeguards and widen the range of data that insurers could process to far beyond what the current law allows. The Bill already contains specific exemptions permitting the processing of family health data to underwrite the insured’s policy and data required for insurance policies on the life of another or group contract. We debated last week a third amendment to address the challenges of automatic renewals.
These processing conditions are made under the substantial public interest derogation. When setting out the grounds for such a derogation, the Government are limited—this partly addresses the point made by the noble Lord, Lord Stevenson—by the need to meet the “substantial public interest test” in the GDPR and the need to provide appropriate safeguards for the data subject. A personal or private economic or commercial benefit is insufficient: the benefits for individuals or society need to significantly outweigh the need of the data subject to have their data protected. On this basis, the Government consider it difficult to justify a single broad exemption. Taken together, the Government remain of the view that the package of targeted exemptions in the Bill is sufficient and achieves the same effect.
Nevertheless, noble Lords have raised some important matters and the Government believe that the processing necessary for compulsory insurance products must be allowed to proceed without the barriers that have been so helpfully described. The common thread in these concerns is how consent is sought and given. The noble Earl, Lord Kinnoull, referred to that and gave several examples. The Information Commissioner has published draft guidance on consent and the Government have been in discussions with her office on how the impact on business can be better managed. We will ensure that we resolve the issues raised.
I say to the noble Earl, Lord Erroll, that consent is important and the position taken by the GDPR is valid. We do not have a choice in this: the GDPR is directly applicable and when you are dealing with data, it is obviously extremely important to get consent, if you can. The GDPR makes that a first line of defence, although it provides others when consent is not possible. As I say, consent is important and it has to be meaningful consent, because we all know that you can have a pre-ticked box and that is not what most people nowadays regard as consent. Going back to the noble Earl, Lord Kinnoull—
My Lords, I am sorry to interrupt. The Minister mentioned the guidance from the Information Commissioner. From what he said, I assume he knows that the insurance industry does not believe that the guidance is sufficient; it is inadequate for its purposes. Is he saying that a discussion is taking place on how that guidance might be changed to meet the purposes of the insurance industry? If it cannot be changed, will he therefore consider amendments on Report?
Of course, it is not for us to tell the Information Commissioner what guidance to issue. The guidance that has been issued is not in all respects completely helpful to the insurance industry.
I agree; I think I mentioned compulsory classes before. Going back to the guidance, we are having discussions. We have already had constructive discussions with the noble Earl, and we will have more discussions on this subject with the insurance industry, in which he has indicated that he would like to take part. I am grateful to him for coming to see me last week.
My Lords, I am sorry to interrupt the Minister again but he is dealing with important concepts. Right at the beginning of his speech he said he did not think this could be covered by the substantial public interest test. Surely the continuance of insurance in all those different areas, not just for small businesses but for the consumer, and right across the board in the retail market, is of substantial public interest. I do not quite understand why it does not meet that test.
I may have misled the noble Lord. I did not say that it does not meet the substantial test but that we had to balance the need to meet the substantial public interest test in the GDPR and the need to provide appropriate safeguards for the data subject. I am not saying that those circumstances do not exist. There is clearly substantial public interest that, as we discussed last week, compulsory classes of insurance should be able to automatically renew in certain circumstances. I am sorry if I misled the noble Lord.
We realised that there are potentially some issues surrounding consent, particularly in the British way of handling insurance where you have many intermediaries, which creates a problem. That may also take place in other countries, so the Information Commissioner will also look at how they address these issues, because there is meant to be a harmonious regime across Europe. The noble Earl has agreed to come and talk to us, and I hope that on the basis of further discussions, he will withdraw his amendment.
We can break it down simply between compulsory and non-compulsory classes. Some classes may more easily fulfil the substantial public interest test than others. In balancing the needs, it goes too far to give a broad exemption for all insurance, so we are trying to create a balance. However, we accept that compulsory classes are important.
I am sure that the noble Earl, Lord Kinnoull, will come back at greater length on this. The issue that the Minister has outlined is difficult, partly because the Information Commissioner plays and will play such an important role in the interpretation of the Bill. When the Government consider the next steps and whether to table their own amendments or accept other amendments on Report, will they bring the Information Commissioner or her representative into the room? It seems that the guidance and the interaction of the guidance with the Bill—and, eventually, with the Act—will be of extreme importance.
I agree, which is why I mentioned the guidance that the Information Commissioner has already given. I am certainly willing to talk to her but it is not our place to order her into the room. However, we are constantly talking to her, and there is absolutely no reason why we would not do so on this important matter.
(7 years ago)
Lords Chamber
I agree. I have the same. You have to put in your numerical password every so often just to check that you have still got the same finger. Technically, you might not have.
The amendments also seek to permit the processing of such data when biometric identification devices are installed by employers to allow employees to gain access to work premises or when the controller is using the data for internal purposes to improve ID verification mechanisms. I am grateful to the noble Lord for raising this important issue because the use of biometric verification devices is likely only to increase in the coming years. At the moment, our initial view is that, given the current range of processing conditions provided in Schedule 1 to the Bill, no further provision is needed to facilitate the activities to which the noble Lord referred. However, this is a technical issue and so I am happy to write to the noble Lord to set out our reasoning on that point. Of course, this may not be the case in relation to the application of future technology, and we have already discussed the need for delegated powers in the Bill to ensure that the law can keep pace. I think we will discuss that again in a later group.
On this basis, I hope I have tackled the noble Lord’s concerns, and I would be grateful if he would withdraw the amendment.
My Lords, as usual the noble Lord, Lord Maxton, has put his finger on the problem. If we have iris recognition, he will keep his eye on the matter.
I thank the Minister for his explanation of the multifarious amendments and welcome the maiden speech from the Front Bench by the noble Lord, Lord Griffiths. I do not think I can better my noble friend Lord McNally’s description of his ascent to greatness in this matter. I suspect that in essence it means that the noble Lord, Lord Griffiths, like me, picks up all the worst technical amendments which are the most difficult to explain in a short speech.
I thought the Minister rather short-changed some of the amendments, but I will rely on Hansard at a later date, and I am sure the Opposition Front Bench will do the same when we come to it. The particular area where he was disappointing was on what you might call the Thomson Reuters perspective, and I am sure that we will want to examine very carefully what the Minister had to say because it could be of considerable significance if there is no suitable exemption to allow that kind of fraud prevention to take place. Although he said he had an open mind, I was rather surprised by his approach to Amendments 45A and 64 which were tabled by the noble Baroness, Lady Neville-Jones. One will have to unpick carefully what he said.
The bulk of what I want to respond to is what the Minister said about biometrics. I took quite a lot of comfort from what he said because he did not start quoting chapter and verse at me, which I think means that nobody has quite yet worked out where this biometric data fits and where there might be suitable exemptions. There is a general feeling that somewhere in the Bill or the schedules we will find something that will cover it. I think that may be an overoptimistic view, but I look forward to receiving the Minister’s letter. In the meantime, I beg leave to withdraw the amendment.
(7 years ago)
Lords Chamber
My Lords, that is not an unexpected question. I can assure the noble Lord that we are not putting this into the long grass. He is absolutely right that there was a six-week evidence-gathering session. The evidence gathered has convinced us of the need to take action and reduce the maximum FOBT stakes. However, it is a complex issue and not about stakes alone. We are therefore publishing today a package of measures to address the concerns. We must strike the right balance between the socially responsible growth of the industry and the protection of consumers and the communities they live in. Our position is that the maximum stake should be between £50 and £2. We are consulting on that specific issue. This has to be done with due process, to avoid the further problems that could arise from acting in too rushed a manner.
My Lords, Liberal Democrats have been calling for a £2 stake on these highly addictive machines, which have been a catalyst of problem gambling, social breakdown and serious crime in communities, for nearly a decade. We therefore give a qualified welcome to the review, but, rather like the noble Lord, Lord Griffiths, we are disappointed that a range of options rather than a firm recommendation is being given, and that we now have a 12-week consultation rather than action. Reducing the maximum stake to £50 would still mean that you could lose £750 in five minutes, or £300 if the stake was reduced to £20. I urge the Minister and his colleagues to resist Treasury pressure and move to take effective action by focusing on stake reduction to £2, which would put a clear and sensible limit on all high street machines. Can the Minister tell us what the role of the Gambling Commission has been and will be in the consultation? It has a duty to minimise gambling-related harm and protect children and the vulnerable. Will the Government act on that advice? Will the review examine the proliferation of betting shops on the high street and the self-referral or exclusion system, which is so ineffective? As well as reducing the maximum stake, will it look at limiting the spin rate? Finally, will the consultation address stakes in online equivalents to these games, such as blackjack?
My Lords, the noble Lord makes a predictable comment about Treasury pressure, of which there was none. The decision on stakes will come from DCMS and not from the Treasury—although it will take into account fiscal implications, as it does for any government policy. The Gambling Commission is involved in the consultation because it is involved also in the other package of measures covered by it. The consultation is not just on the stakes but on other matters such as tougher licence conditions. The noble Lord referred to spin rates. What one can lose where higher stakes are concerned depends on the spin rate. I can confirm that that will be included in the consultation. I urge the noble Lord and the noble Lord, Lord Griffiths, to contribute to the consultation and make their views known.
(7 years ago)
Lords Chamber
My Lords, the Minister gave the impression that medical research of the type described by the noble Lord, Lord Patel, was encompassed, or allowable, by the GDPR. Can he give chapter and verse on where in the mixture of article 6 and article 9 that occurs? That would be extremely helpful. I understand that obviously the Minister was also agreeing to look further in case those articles did not cover the situation, but it would be good to know which articles he is referring to.
I re-emphasise to the noble Lord that we think these tasks are in the public interest. However, I understand his desire for even more clarity than that. It would be sensible if I wrote to him and to other noble Lords taking part in the debate. I want to make sure that I get the legal basis right rather than just doing it on the hoof, so I agree to write to him and to all noble Lords who have spoken tonight. Again, as I say, we will work towards what I hope will be a more acceptable solution for everyone. Fundamentally, we do not want to impede medical research that is for the public good.
(7 years ago)
Lords Chamber
My Lords, this is a rather unusual occasion, in that normally noble Lords say that they are going to read very carefully what the Minister has said in Hansard. In this case, I am certainly going to have to read carefully what the noble Lord, Lord Clement-Jones, said, in Hansard. This is a complicated matter and I thought that I was following it and then thought that I did not—and then I thought that I did again. I shall set out what I think should be the answer to his remarks, but when we have both read Hansard we may have to get together again before Report on this matter.
I am glad that we have this opportunity to set out the approach taken in the Bill to processing that is in the public interest and the substantial public interest. Neither term is new; both appeared before 1998, as the noble Lord, Lord Stevenson, said, in the 1995 data protection directive, in the same sense as they are used in the GDPR and the Bill. That is to say, “substantial public interest” is one of the bases for the processing of special categories of personal data, and this is a stricter test than the public interest test that applies in connection with the processing of all categories of personal data. The noble Lord, Lord Clement-Jones, was wrong to suggest that the list provided in the 1998 Act in relation to public interest was genuinely exhaustive, I think. As he said himself, the effect of paragraph 5(d) of Schedule 2 was to make that list non-exhaustive.
In keeping with the approach taken under the 1998 Act, the Government have not limited the public interest general processing condition. The list in Clause 7 is therefore non-exhaustive. This is intentional, and enables organisations which undertake legitimate public interest tasks to continue to process general data. Noble Lords may recall that the Government committed after Second Reading to update the Explanatory Notes to provide reassurance that Clause 7 should be interpreted broadly. Universities, museums and many other organisations carrying out important work for the benefit of society all rely on this processing condition. For much the same reason, “public interest” has not historically been defined in statute, recognising that the public interest will change over time and according to the circumstances of each situation. This flexibility is important, and I would not wish to start down the slippery slope of attempting to define it further.
The Government have, however, chosen to set out in Part 2 of Schedule 1 an exhaustive list of types of processing which they consider constitute, or could constitute, processing in the substantial public interest. That reflects the increased risks for data subjects when their sensitive personal data is processed. Again, this approach replicates that taken in the 1998 Act. Where the Government consider that processing meeting a condition in that part will sometimes, but not necessarily, meet the substantial public interest test, a sub-condition to that effect is included. This ensures that the exemption remains targeted on those processing activities in the substantial public interest. A similar approach was taken in secondary legislation made under the 1998 Act. The Government intend to keep Part 2 of Schedule 1 under review, and have proposed a regulation-making power in Clause 9 that would allow Schedule 1 to be updated or refined in a timelier manner than would be the case if primary legislation were required. We will of course return to that issue in a later group.
Amendment 15 seeks to make clear that the public interest test referred to in Clause 7 is not restricted by the substantial public interest test referred to in Part 2 of Schedule 1. Having described the purposes of both these elements of the Bill, I hope that noble Lords can see that these are two separate tests. The different wording used would mean that these would be interpreted as different tests, and there is no need to amend the Bill to clarify that further.
Amendment 154 would require the Information Commissioner to develop a code of practice in relation to the processing of personal data in the public interest and substantial public interest. As we have already touched on, the Information Commissioner is developing relevant guidance to support the implementation of the new data protection framework. Should there later prove a need to formalise this guidance as a code of practice, Clause 124 provides the Secretary of State with the power to direct the Information Commissioner to make such a code. There is no need to make further provision.
I hope that that explanation satisfies noble Lords for tonight, and I urge the noble Lord to withdraw his amendment. However, in this complicated matter, I am certainly prepared to meet noble Lords to discuss this further, if they so require.
My Lords, I thank the Minister for that very helpful exposition. I shall return the compliment and read his contribution in Hansard with great care. I apologise to the noble Lord, Lord Kennedy, if the Bill has already had a befuddling influence on me. It comes from looking along the Labour Benches too much in profile.
With this amendment, I feel somewhat caught between the noble Lord, Lord Patel, and a very hard place. Clearly, he wants flexibility in a public interest test, and I can well understand that. But there are issues to which we shall need to return. The idea of a specific code seems the way forward; the way forward is not by granting overmighty powers to the Government to change the definitions according to the circumstances. I think that that was the phrase that the Minister used—they wish to have that flexibility so that the public interest test could be varied according to circumstances. If there is a power to change, it has to be pretty circumscribed. Obviously, we will come back to that in a later group. In the meantime, I beg leave to withdraw the amendment.
(7 years, 2 months ago)
Lords Chamber
To ask Her Majesty’s Government what progress they have made with their Review of Gaming Machines and Social Responsibility Measures.
My Lords, the review generated a lot of interest from the general public, as well as from a variety of interest groups, local authorities, trade bodies and industry. As the Minister for Sport and Civil Society made clear in the other place before the Recess, no announcement will be made until October at the earliest.
My Lords, that is not an unexpected reply. Does the Minister accept that the NatCen report published last month provides clear evidence that 43% of FOBT users are either problem or at-risk gamblers? In that light, does he accept that it is high time that the Government end their internal debate, override the Treasury objections and act to reduce the maximum stake and slow the speed of play on these dangerous machines without any further delay?
My Lords, the noble Lord has misunderstood several things. First, the Chancellor has said publicly that he fully supports the work of the DCMS to ensure that the UK’s gambling regime continues to balance the needs of vulnerable people, consumers who gamble responsibly and those who work in this sector. Of the 2.38 million who are at risk, 1.4 million are at low risk, and I completely understand the noble Lord’s point about 430,000 problem gamblers being 430,000 too many. That is exactly why we are having the review, which we hope will be published soon. We will then be able to do something about it, depending on what the options are.