Debates between Lord Ashton of Hyde and Lord Lucas during the 2017-2019 Parliament

Mobile Networks: Resilience

Debate between Lord Ashton of Hyde and Lord Lucas
Tuesday 11th December 2018

Lords Chamber
Lord Lucas

To ask Her Majesty’s Government what steps they will take to improve the resilience of United Kingdom mobile networks following the outage of O2’s services.

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Ashton of Hyde) (Con)

My Lords, I ought to declare a very small interest as a customer of O2 and, therefore, someone who is in line for a reimbursement of two days’ worth of my monthly subscription.

There is a regular dialogue on issues of concern to both industry and Government. DCMS works closely with the telecoms sector on resilience issues via the Electronic Communications Resilience and Response Group, which leads on resilience activity and emergency response. The industry has a good track record of enhancing resilience, and we will be working closely with O2 and the wider sector to understand the causes of this incident and what lessons can be learned.

Lord Lucas (Con)

My Lords, I thank my noble friend for that encouraging Answer. He will be aware that O2 is not the only recent example of a lack of systems resilience. Work undertaken by the Government in preparation for a possible hard Brexit has revealed that a very large proportion of British businesses are driving extremely close to the edge of chaos in terms of how little it would take to seriously disrupt their businesses and our lives. Will he encourage his colleagues to encourage businesses, once Brexit is past us, to maintain the provisions they are now making against possible difficulties, in the cause of our running a more resilient society than we apparently have been doing?

Lord Ashton of Hyde

I assure my noble friend that my department, which is responsible for telecoms, will continue to work with the Electronic Communications Resilience and Response Group. By coincidence, there is a meeting of that group next week, from which we will find out exactly what happened with the O2 outage and the emergency response, which worked well. I can assure my noble friend that we will continue with that, whatever happens with Brexit.

Data Protection Bill [HL]

Debate between Lord Ashton of Hyde and Lord Lucas
Lord Lucas (Con)

My Lords, I want to pick up on the last point of the noble Lord, Lord McNally. We are getting into a situation where political parties are addressing personal messages to individual voters and saying different things to different voters. This is not apparent; there must be ways to control it. We will have to give some considerable thought to it, so I see the virtue of the amendments.

Lord Ashton of Hyde

Quickly, because I will not remember all the questions and points, I want to emphasise that they are all very good points and I will reflect on them. My main mission is to get the GDPR and law enforcement directive in place by May 2018. I absolutely accept the point made by the noble Lord, Lord McNally—that this is the tip of the iceberg—but we must bear in mind that this is about data protection, both today and on Report, so I will focus on that. We have already had other avenues to raise a lot of the points the noble Lord made, but I agree that it is a huge issue. He asked when the report from the Information Commissioner will be available. I would expect it before Christmas, so it will be before the Bill becomes law.

I certainly undertake to reflect on what the noble Baroness, Lady Jay, said about the Electoral Commission. I believe that our call for views was after the election; nevertheless, I take her point. I am very sorry but I cannot remember what the point from the noble Lord, Lord Whitty, was, but I accept these things have to be taken into account. When we have our meeting—it is becoming a big meeting—it will be for people concerned specifically with the Data Protection Act, not some of the issues that lie outside that narrow area, important though they are.

I ask noble Lords not to press their amendments.

Lord Lucas

My Lords, picking up on the last point from the noble Baroness, Lady Hamwee, is this the first time the privileges of Members of this House have been reduced in relation to Members of the other House? If so, will the Government consult the Speaker of this House on whether he considers that desirable?

Lord Ashton of Hyde

My Lords, they have not been reduced. This is the position that exists today.

Lord Lucas

My Lords, privileges are being given to Members of another place—and indeed to Members of the Parliaments of Scotland and other places—that are being denied to us. Is this the first time that has been done?

Lord Ashton of Hyde

No, it is not the first time because this is the position that exists under the Data Protection Act 1998.

Data Protection Bill [HL]

Debate between Lord Ashton of Hyde and Lord Lucas
Monday 13th November 2017

Lords Chamber
Lord Ashton of Hyde

My Lords, as we have heard, Part 3 of the Digital Economy Act 2017 requires online providers of pornographic material on a commercial basis to institute appropriate age verification controls. My noble friend’s Amendment 71ZA seeks to allow the age verification regulator to publish regulations relating to the protection of personal data processed for that purpose. The amendment aims to provide protection, choice and trust in respect of personal data processed for the purpose of compliance with Part 3 of the 2017 Act.

I think that I understand my noble friend’s aim. It is a concern I remember well from this House’s extensive deliberations on what became the Digital Economy Act, as referred to earlier. We now have before us a Bill for a new legal framework which is designed to ensure that protection, choice and trust are embedded in all data-processing practices, with stronger sanctions for malpractice. This partly answers my noble friend Lord Elton, who asked what we would produce to deal with this problem.

Personal data, particularly those concerning a data subject’s sex life or sexual orientation, as may be the case here, will be subject to rigorous new protections. For the reasons I have just mentioned, the Government do not consider it necessary to provide for separate standards relating exclusively and narrowly to age verification in the context of accessing online pornography. That is not to say that there will be a lack of guidance to firms subject to Part 3 of the 2017 Act on how best to implement their obligations. In particular, the age verification regulator is required to publish guidance about the types of arrangements for making pornographic material available that the regulator will treat as compliant.

As noble Lords will be aware, the British Board of Film Classification is the intended age verification regulator. I reassure noble Lords that in its preparations for taking on the role of age verification regulator, the BBFC has indicated that it will ensure that the guidance it issues promotes the highest data protection standards. As part of this, it has held regular discussions with the Information Commissioner’s Office and it will flag up any potential data protection concerns to that office. It will then be for the Information Commissioner to determine whether action or further investigation is needed, as is her role.

The noble Lord, Lord Clement-Jones, talked about anonymisation and the noble Lord, Lord Stevenson, asked for an update of where we actually were. I remember the discussions on anonymisation, which is an important issue. I do not have the details of exactly where we have got to on that subject—so, if it is okay, I will write to the noble Lord on that.

I can update the noble Lord, Lord Stevenson, to a certain extent. As I just said, the BBFC is in discussion with the Information Commissioner’s Office to ensure that best practice is observed. Age verification controls are already in place in other areas of internet content access; for example, licensed gambling sites are required to have them in place. They are also in place for UK-based video-on-demand services. The BBFC will be able to learn from how these operate, to ensure that effective systems are created—but the age verification regulator will not be endorsing a list of age verification technology providers. Rather, the regulator will be responsible for setting guidance and standards on robust age verification checks.

We continue to work with the BBFC in its engagement with the industry to establish the best technological solutions, which must be compliant with data protection law. We are aware that such solutions exist, focusing rightly on verification rather than identification—which I think was the point made by the noble Lord, Lord Clement-Jones. If I can provide any more detail in the follow-up letter that I send after each day of Committee, I will do so—but that is the general background.

Online age verification is a rapidly growing area and there will be much innovation and development in this field. Industry is rightly putting data privacy and security at the forefront of its design, and this will be underscored by the new requirements under the GDPR. In view of that explanation, I hope that my noble friend will be able to withdraw his amendment.

Lord Lucas

My Lords, I am very grateful for my noble friend’s reply. With his leave, I will digest it overnight and tomorrow. I look forward to the letter that he promised—but if, at the end of that, I still think that there is something worth discussing, I hope that his ever-open door will be open even to that.

Lord Ashton of Hyde

I believe that during our previous day in Committee, I offered to meet my noble friend.

Lord Lucas

I am very grateful and I beg leave to withdraw the amendment.

--- Later in debate ---
Lord Ashton of Hyde

Automated processing could do that. However, with the appropriate safeguards we have put in the Bill, we do not think that it will.

Amendment 77 seeks to define a significant decision as including a decision that has legal or similar effects for the data subject or a group sharing one of the nine protected characteristics under the Equality Act 2010 to which the data subject belongs.

We agree that all forms of discrimination, including discriminatory profiling via the use of algorithms and automated processing, are fundamentally wrong. However, we note that the Equality Act already provides a safeguard for individuals against being profiled on the basis of a particular protected characteristic they possess. Furthermore, recital 71 of the GDPR states that data controllers must ensure that they use appropriate mathematical or statistical procedures to ensure that factors which result in inaccuracies are minimised, and to prevent discriminatory effects on individuals,

“on the basis of racial or ethnic origin, political opinion, religion or beliefs, trade union membership, genetic or health status or sexual orientation”.

We therefore do not feel that further provision is needed at this stage.

Amendment 77A, in the name of the noble Lord, Lord Stevenson, seeks to require a data controller who makes a significant decision based on automated processing to provide meaningful information about the logical and legal consequences of the processing. Amendment 119, as I understand it, talks to a similar goal, with the added complication of driving a wedge between the requirements of the GDPR and applied GDPR. Articles 13 and 14 of the GDPR, replicated in the applied GDPR, already require data controllers to provide data subjects with this same information at the point the data is collected, and whenever it is processed for a new purpose. We are not convinced that there is much to be gained from requiring data controllers to repeat such an exercise, other than regulatory burden. In fact, the GDPR requires the information earlier, which allows the data subject to take action earlier.

Similarly, Amendment 77B seeks to ensure that data subjects who are the subject of automated decision-making retain the right to make a complaint to the commissioner and to access judicial remedies. Again, this provision is not required in the Bill, as data subjects retain the right to make a complaint to the commissioner or access judicial remedies for any infringement of data protection law.

Amendment 78 would confer powers on the Secretary of State to review the operational effectiveness of article 22 of the GDPR within three years, and lay a report on the review before Parliament. This amendment is not required because all new primary legislation is subject to post-legislative scrutiny within three to five years of receiving Royal Assent. Any review of the Act will necessarily also cover the GDPR. Not only that, but the Information Commissioner will keep the operation of the Act and the GDPR under review and will no doubt flag up any issues that may arise on this or other areas.

Amendment 153A would place a requirement on the Information Commissioner to investigate, keep under review and publish guidance on several matters relating to the use of automated data in the health and social care sector in respect of the terms on which enterprises gain consent to the disclosure of the personal data of vulnerable adults. I recognise and share noble Lords’ concern. These are areas where there is a particular value in monitoring the application of a new regime and where further clarity may be beneficial. I reassure noble Lords that the Information Commissioner has already contributed significantly to GDPR guidance being developed by the health sector and continues to work closely with the Government to identify appropriate areas requiring further guidance. Adding additional prescriptive requirements in the Bill is unlikely to help them shape that work in a way that maximises its impact.

As we have heard, Amendment 183 would insert a new clause before Clause 171 stating that public bodies who profile a data subject should inform the data subject of their decision. This is unnecessary as Clauses 13 and 48 state that when a data controller has taken a decision based solely on automated processing, they must inform the data subject in writing that they have done so. This includes profiling. Furthermore, Clauses 13 and 48 confer powers on the Secretary of State to make further provisions to provide suitable measures to safeguard a data subject’s rights and freedoms.

I thank noble Lords for raising these important issues, which deserve to be debated. I hope that, as a result of the explanation in response to these amendments, I have been able to persuade them that there are sufficient safeguards in relation to automated decision-making in the GDPR and Parts 2 to 4 of the Bill, and that their amendments are therefore unnecessary. On that basis, I invite noble Lords not to press their amendments.

Lord Lucas

My Lords, I rather hope that the Minister has not been able to persuade noble Lords opposite. Certainly, I have not felt myself persuaded. First, on the point about “solely”, in recruiting these days, when big companies need to reduce a couple of thousand applications to 100, the general practice is that you put everything into an automated process—you do not really know how it works—get a set of scores at the end and decide where the boundary lies according to how much time you have to interview people. Therefore, there is human intervention—of course there is. You are looking at the output and making the decision about who gets interviewed and who does not. That is a human decision, but it is based on the data coming out of the algorithm without understanding the algorithm. It is easy for an algorithm to be racist. I just googled “pictures of Europeans”. You get a page of black faces. Somewhere in the Google algorithm, a bit of compensation is going on. With a big algorithm like that, they have not checked what the result of that search would be, but it comes out that way. It has been equally possible to carry out searches, as at various times in the past, which were similarly off-beam with other groups in society.

When you compile an algorithm to work with applications, you start off, perhaps, by looking at, “Who succeeds in my company now? What are their characteristics?”. Then you go through and you say, “You are not allowed to look at whether the person is a man or a woman, or black or white”, but perhaps you are measuring other things that vary with those characteristics and which you have not noticed, or some combinations. An AI algorithm can be entirely unmappable. It is just a learning algorithm; there is no mental process that a human can track. It just learns from what is there. It says, “Give me a lot of data about your employees and how successful they are and I will find you people like that”.

At the end of the day, you need to be able to test these algorithms. The Minister may remember that I posed that challenge in a previous amendment to a previous Bill. I was told then that a report was coming out from the Royal Society that would look at how we should set about testing algorithms. I have not seen that report, but has the Minister seen it? Does he know when it is coming out or what lines of thinking the Royal Society is developing? We absolutely need something practical so that when I apply for a job and I think I have been hard done by, I have some way to do something about it. Somebody has to be able to test the algorithm. As a private individual, how do you get that done? How do you test a recruitment algorithm? Are you allowed to invent 100 fictitious characters to put through the system, or should the state take an interest in this and audit it?

We have made so much effort in my lifetime and we have got so much better at being equal—of course, we have a fair way to go—doing our best continually to make things better with regard to discrimination. It is therefore important that we do not allow ourselves to go backwards because we do not understand what is going on inside a computer. So absolutely, there has to be significant human involvement for it to be regarded as a human decision. Generally, where there is not, there has to be a way to get a human challenge—a proper human review—not just the response, “We are sure that the system worked right”. There has to be a way round which is not discriminatory, in which something is looked at to see whether it is working and whether it has gone right. We should not allow automation into bits of the system that affect the way we interact with each other in society. Therefore, it is important that we pursue this and I very much hope that noble Lords opposite will give us another chance to look at this area when we come to Report.

--- Later in debate ---
Lord Lucas

My Lords, clearly the Royal Society has been talking to other people. I hope that someone from there is listening and will be encouraged to talk to me too. I am delighted with this amendment and think it is an excellent idea, paired with Amendment 77A, which gives individuals some purchase and the ability to know what is going on. Here we have an organisation with the ability to do something about it, not by pulling any levers but by raising enough of a storm and finding out what is going on to effect change. Amendments 77A and 78A are a very good answer to the worries we have raised in this area.

It is important that we have the ability to feel comfortable and to trust—to know that what is going on is acceptable to us. We do not want to create divisions, tensions and unhappiness in society because things are going on that we do not know about or understand. As the noble Lord said, the organisations running these algorithms do not share our values—it is hard to see that they have any values at all other than the pleasures of the few who run them. We should not submit to that. We must, in all sorts of ways, stand up to that. There are many ways in which these organisations have an impact on our lives, and we must insist that they do that on our terms. We are waking up quite slowly. To have a body such as this, based on principles and ethics and with a real ability to find out what is going on, would be a great advance. It would give me a lot of comfort about what is happening in this Bill, which otherwise is just handing power to people who have a great deal of power already.

Lord Ashton of Hyde

My Lords, the noble Lord, Lord Stevenson, has raised the important issue of data ethics. I am grateful to everyone who has spoken on this issue tonight and has agreed that it is very important. I assure noble Lords that we agree with that. We had a debate the other day on this issue and I am sure we will have many more in the future. The noble Lord, Lord Puttnam, has been to see me to talk about this, and I tried to convince him then that we were taking it seriously. By the sound of it, I am not sure that I completely succeeded, but we are. We understand the points he makes, although I am possibly not as gloomy about things as he is.

We are fortunate in the UK to have the widely respected Information Commissioner to provide expert advice on data protection issues—I accept that that advice is just on data protection issues—but we recognise the need for further credible and expert advice on the broader issue of the ethical use of data. That is exactly why we committed to setting up an expert advisory data ethics body in the 2017 manifesto, which, I am glad to hear, the noble Lord, Lord Clement-Jones, read carefully.

Data Protection Bill [HL]

Debate between Lord Ashton of Hyde and Lord Lucas
Committee: 3rd sitting (Hansard - continued): House of Lords
Monday 13th November 2017

Lords Chamber
Lord Lucas

My Lords, I support Amendment 79. I offer as an example the national pupil database, which the Department for Education makes available. It is very widely used, principally to help improve education. In my case, I use it to provide information to parents via the Good Schools Guide; in many other cases it is used as part of understanding what is going on in schools, suggesting where the roots of problems might lie, and how to make education in this country better. That does not fall under “scientific or historical” and is a good example of why that phrase needs widening.

The Parliamentary Under-Secretary of State, Department for Digital, Culture, Media and Sport (Lord Ashton of Hyde) (Con)

My Lords, as a non-lawyer, I am delighted to find myself in the same company as the noble and learned Lord, Lord Hope of Craighead, as this has also introduced me to an area of trust law which I am not familiar with. I thank noble Lords for their amendments, which concern the exemptions from data rights in the GDPR that the Bill creates. Two weeks ago we debated amendments that sought to create an absolute right to data protection. Today we will further debate why, in some circumstances, it is essential to place limitations on those rights.

The exemptions from data rights in the GDPR are found in Schedules 2 to 4 to the Bill. Part 6 of Schedule 2 deals with exemptions for scientific or historical research and archiving. Without these exemptions, scientific research which involves working on large datasets would be crippled by the administration of dealing with requests from individuals for their data and the need to give notice and service other data rights. This data provides the fuel for scientific breakthroughs, which the noble Lord, Lord Patel, and others have told us so much about in recent debates.

Amendment 79 seeks to remove “scientific or historical” processing from the signposting provision in Clause 14. Article 89 of the GDPR is clear that we may derogate only in relation to specifically historical or scientific research. We believe that Clause 14 needs to correctly describe the available exemption, although I reassure noble Lords that, as we have discussed previously, these terms are to be interpreted broadly, as outlined in the recitals.

Part 1 of Schedule 2 deals with exemptions relating to crime, tax and immigration. For example, where the tax authorities assess whether tax has been correctly paid or criminally evaded, that assessment must not be undermined by individuals accessing the data being processed by the authority. Amendments 79A and 79B, spoken to by the noble Lord, Lord Griffiths of Burry Port, would limit the available exemptions by removing from the list of GDPR rights that can be disapplied the right to restrict processing and the right to object to processing. In my example, persons subject to a tax investigation would be able to restrict and object to the processing by a tax authority. Clearly that is not desirable.

Amendments 80A and 83A seek to widen the exemption in paragraph 5(3) of Schedule 2 which exempts data controllers from complying with certain data rights where that data is to be disclosed for the purposes of legal proceedings. Without this provision, which mirrors the 1998 Act, individuals may be able to unfairly disrupt legal proceedings by blocking the processing of data. We are aware that the Bar Council has suggested that the exemption be widened as the amendments propose. This would enable data controllers to be wholly exempt from the relevant data rights. We believe that this is too wide and that the exemption should apply only where the data is, or will be, subject to a disclosure exercise, which is a process managed through court procedure rules. At paragraph 17 of Schedule 2, the Bill makes separate provision for exemptions to protect legal professional privilege. We think that the Bill continues to strike the right balance between the rights of data subjects and controllers processing personal data for the purposes of exercising their legal rights.

Amendment 83B seeks to remove paragraph 7 of Schedule 2 from the Bill. This paragraph sets out the conditions for restricting data subjects’ rights in respect of personal data processed for the purposes of protecting the public. Those carrying out functions to protect the public would include bodies and watchdogs concerned with protecting the public from incompetence, malpractice, dishonesty or seriously improper conduct, securing the health and safety of persons at work and protecting charities and fair competition in business. Paragraph 7, which is based on the current Section 31 of the 1998 Act, ensures that important investigations can continue without interference. Without this paragraph, persons would have to be given notice that they were being investigated and, on receipt of notice, they could require their data to be deleted, frustrating the investigation.

Paragraph 14 of Schedule 2 allows a data controller to refuse to disclose information to the data subject where doing so would involve disclosing information relating to a third party. Amendment 86A would remove the circumstances set out in sub-paragraph (3) to which a data controller must have regard when determining whether it is reasonable to disclose information relating to a third party without their consent. These considerations mirror those in the 1998 Act and we think that they remain important matters to be considered when determining reasonableness. They also allow for any duty of confidentiality to be respected.

Paragraph 15 of Schedule 2 ensures that an individual’s health, education or social work records cannot be withheld simply because they make reference to the health, education and social work professionals who contributed to them. Amendment 86B would allow a controller to refuse to disclose an individual’s health records to that individual on the grounds that they would identify the relevant health professionals who authored them. We believe that individuals should be able to access their health records in these circumstances.

Data Protection Bill [HL]

Debate between Lord Ashton of Hyde and Lord Lucas
Monday 6th November 2017

Lords Chamber
Lord Ashton of Hyde

I said that we believe that the term is sufficiently broad to cover processing that would have been permitted hitherto, which the noble Earl refers to. However, of course, if we have got it wrong and if the insurance industry has a point it wants to bring up, it would be sensible, and I would be delighted, to meet him and the industry to discuss that. As I said before, we have an open mind, so I will certainly do that.

On the provisions in paragraphs 2 and 3 of Schedule 1 on health and social care, and public health, respectively, which are the focus of Amendments 27 to 29, it is fair to say that the drafting here has moved on slightly from the approach taken in Schedule 3 to the 1998 Act. However, article 9(2)(h) of the GDPR refers specifically to processing which is necessary for,

“the assessment of the working capacity of an employee”,

and,

“the management of health … care systems”.

Article 9(2)(i) refers specifically to processing which is,

“necessary for reasons of public interest in the area of public health”.

The purpose of paragraphs 2 and 3 of Schedule 1 is to give these GDPR provisions legislative effect. To remove these terms from the clause by virtue of Amendments 27 to 29 would mean that healthcare providers might have no lawful basis to process special categories of data for such purposes after 25 May. I am sure that noble Lords would agree that that would be unwelcome.

The noble Lord, Lord Kennedy, asked some questions on paragraph 2 and asked for an example of data processed under paragraph 2(b). An example would be occupational health. The wording of paragraph 2(2)(f) of Schedule 1 is imported from article 9(2)(h), and I refer the noble Lord—I am sure that he has remembered it—to the exposition given in recital 53.

Paragraph 4—the focus of Amendments 32 to 34—provides for the processing of special categories of data for purposes relating to archiving and research. The outcome of these amendments would be to name specific areas of research and types of records. The terms “scientific research” and “archiving” cover a wide range of activities. Recital 157 to the GDPR specifically refers to “social science” in the context of scientific research, and recital 159 makes it clear that,

“scientific research purposes should be interpreted in a broad manner including for example technological development and demonstration, fundamental research, applied research and privately funded research”.

The Government are not aware of anything in the GDPR or the Bill which casts doubt on the application of these terms to social science research or digital archiving.

Finally, on the important issue of confidentiality, Amendments 31 and 70 are unnecessary, because all health professionals are subject to the common-law duty of confidentiality. The duty is generally understood to mean that, if information is given in circumstances where it is expected that a duty of confidence applies, that information cannot normally be disclosed without the information provider’s consent. However, beyond relying on the common-law duty of confidentiality, health professionals and social work professionals are bound by the requirements in their employment contracts to uphold rules on confidentiality, whether that information is held on paper, computer, visually or audio recorded, or even held in the memory of the professional. Health professionals and social work professionals as defined in Clause 183 are all regulated professionals.

I can therefore reassure the noble Lord, Lord Kakkar—I am also grateful to the noble Lord, Lord Lester, for his support with regard to the Human Rights Act—that the Government strongly agree on the importance of the common-law duty of medical confidentiality but also recognise that it is not absolute. For example, there already are, and will continue to be, instances where disclosure of personal data by a medical professional is necessary for important public interest purposes, such as certain crime prevention purposes or pursuant to a court order. I therefore cannot agree to Amendment 108A, although, as we have already said, the Government are committed to looking at the issue of delegated powers in the round. I will certainly include that in that discussion. Therefore, with that reassurance, I ask the noble Lord to withdraw his amendment.

Lord Lucas

My Lords, might I beg a meeting of the Minister to discuss the matter of suicidal students at university and how that will be handled under the new legislation as it is developed? This need not necessarily fit within the timescale of the Bill, but I would very much like to be able to understand policy on it and to involve universities in moving from the current unsatisfactory position.

Lord Ashton of Hyde

It is always a pleasure to meet my noble friend, and I am happy to do that.