Data Protection and Digital Information (No. 2) Bill

Carol Monaghan Excerpts
2nd reading
Monday 17th April 2023

(1 year, 7 months ago)

Commons Chamber
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Watch Debate Read Debate Ministerial Extracts
Carol Monaghan Portrait Carol Monaghan (Glasgow North West) (SNP)
- View Speech - Hansard - -

We can all agree that the free flow of personal data across borders is essential to the economy, not just within the UK but with other countries, including our biggest trading partner, the EU. Reforms to our data protection framework must have appropriate safeguards in place to ensure that we do not put EU-UK data flows at risk.

Despite the Government’s promises of reforms to empower people in the use of their data, the Bill instead threatens to undermine privacy and data protection. It potentially moves the UK away from the “adequacy” concept in the EU GDPR, and gives weight to the idea that different countries can maintain data protection standards in different but equally effective ways. The only way that we can properly maintain standards is by having a standard across the different trading partners, but the Bill risks creating a scenario where the data of EU citizens could be passed through the UK to countries with which the EU does not have an agreement. The changes are raising red flags in Europe. Many businesses have spoken out about the negative impacts of the Bill’s proposals. Many of them will continue to set their controls to EU standards and operate on EU terms to ensure that they can continue to trade there.

According to conservative estimates, the loss of the adequacy agreement could cost £1.6 billion in legal fees alone. That figure does not include the cost resulting from disruption of digital trade and investments. The Open Rights Group says:

“Navigating multiple data protection regimes will significantly increase costs and create bureaucratic headaches for businesses.”

Although I understand that the Bill is an attempt to reduce the bureaucratic burden for businesses, we are now potentially asking those businesses to operate with two different standards, which will cause them a bigger headache. It would be useful if the Government confirmed that they have sought legal advice on the adequacy impact of the Bill, and that they have confirmed with EU partners that the EU is content that the Bill and its provisions will not harm EU citizens or undermine the trade and co-operation agreement with the EU.

Several clauses of the Bill cause concern. We need more clarity on those that expand the powers of the Home Secretary and the police, and we will require much further discussion on them in Committee. Given what has been revealed over the past few months about the behaviour of some members of the Metropolitan police, there are clauses in the Bill that should cause us concern. A national security certificate that would give the police immunity when they commit crimes by using personal data illegally would cause quite a headache for many of us. The Government have not tried to explain why they think that police should be allowed to operate in the darkness, which they must now rectify if they are to improve public trust.

The Bill will also expand what counts as an “intelligence service” for the purposes of data protection law, again at the Home Secretary's discretion. The Government argue that this would create a “simplified” legal framework, but, in reality, it will hand massive amounts of people’s personal information to the police. This could include the private communications as well as information about an individual’s health, political belief, religious belief or sex life.

The new “designation notice” regime would not be reviewable by the courts, so Parliament might never find out how and when the powers have been used, given that there is no duty to report to Parliament. The Home Secretary is responsible for both approving and reviewing designation notices, and only a person who is “directly affected” by a such a notice will be able to challenge it, yet the Home Secretary would have the power to keep the notice secret, meaning that even those affected would not know it and therefore could not possibly challenge it.

These are expansive broadenings of the powers not only of the Secretary of State, but of the police and security services. If the UK Government cannot adequately justify these powers, which they have not done to date, they must be withdrawn or, at the very least, subject to meaningful parliamentary oversight.

Far from giving people greater power over their data, the Bill will stop the courts, Parliament and individuals from challenging illegal uses of data. Under the Bill, organisations can deny or charge a fee to individuals for the right to access information. The right hon. Member for New Forest East (Sir Julian Lewis) mentioned the difficulty he had with a constituent. I think we can all have some sympathy with that, because many of us have probably experienced similar requests from members of the public. However, it is the public’s right to have access to the data that we hold. If an organisation decides that these requests are “vexatious or excessive”, they can refuse them, but what is “vexatious or excessive”? These words are vague and open to interpretation. Moreover, charging a fee will create a barrier for some people, particularly those on lower incomes, and effectively restricts control of data to more affluent citizens.

The Bill changes current rules that prevent companies and the Government from making solely automated decisions about individuals that could have legal or other significant effects on their lives. We have heard a lot about the potential benefits of AI and how it could be used to enhance our lives, but for public trust and buy-in of AI, we need to know that there is some oversight. Without that, there will always be a question hanging over it. The SyRI case in the Netherlands involved innocuous datasets such as household water usage being used by an automated system to accuse individuals of benefit fraud.

The Government consultation response acknowledges that, for respondents,

“the right to human review of an automated decision was a key safeguard”.

But despite the Government acknowledging the importance of a human review in an automated decision, clause 11, if implemented, would mean that solely automated decision making is permitted in a wider range of contexts. Many of us get excited about AI, but it is important to acknowledge that AI still makes mistakes.

The Bill will allow the Secretary of State to approve international transfers to countries with weak data protection, so even if the Bill does not make data security in the UK weaker, it will weaken the protections of UK citizens’ data by allowing it to be transferred abroad in cases with lower safeguards.

It is useful to hear a couple of stakeholder responses. The Public Law Project has said:

“The Data Protection and Digital Information (No.2) Bill would weaken important data protection rights and safeguards, making it more difficult for people to know how their data is being used”.

The Open Rights Group has said:

“The government has an opportunity to strengthen the UK’s data protection regime post Brexit. However, it is instead setting the country on a dangerous path that undermines trust, furthers economic instability, and erodes fundamental rights.”

Since we are talking about a Bill under the Department for Science, Innovation and Technology, it is important to hear from the Royal Society, which says that losing adequacy with the EU would be damaging for scientific research in the UK, creating new costs and barriers for UK-EU research collaborations. While the right hon. Member for Maldon (Sir John Whittingdale) is right about the importance of being able to share data, particularly scientific data—and we understand the importance of that for things such as covid vaccines—we need to make sure this Bill does not set up further hurdles that could prevent that.

There is probably an awful lot for us to thrash out in Committee. The SNP will not vote against Second Reading tonight, but I appeal to those on the Government Front Bench to give an opportunity for hon. Members to amend and discuss this Bill properly in Committee.

Data Protection and Digital Information (No. 2) Bill (First sitting) Debate

Full Debate: Read Full Debate

Data Protection and Digital Information (No. 2) Bill (First sitting)

Carol Monaghan Excerpts
Committee stage
Wednesday 10th May 2023

(1 year, 6 months ago)

Public Bill Committees
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 10 May 2023 - (10 May 2023)
None Portrait The Chair
- Hansard -

It is a brutal cut-off, I am afraid, at 9.55 am. I have no discretion in this matter. It is a quick-fire round now, gentlemen. We need quick questions and quick answers, with one each from Carol Monaghan, Chi Onwurah and Mike Amesbury.

Carol Monaghan Portrait Carol Monaghan (Glasgow North West) (SNP)
- Hansard - -

Q Clause 40 sets out the criteria by which a data controller can refuse data access requests. Do you think this is appropriate? Are you concerned that it may lead to a situation in which only those who can afford to pay a potential fee will be able to access their data?

John Edwards: Yes and no. Yes, I do believe it is an adequate provision, and no, I do not believe there will be an economic barrier to people accessing their information rights.

Chi Onwurah Portrait Chi Onwurah
- Hansard - - - Excerpts

Q The Bill’s intent is to reduce burdens on organisations while maintaining high data protection standards. Do you agree that high data protection standards are promoted by well-informed and empowered citizens? What steps do you think the Bill takes to ensure greater information empowerment for citizens?

John Edwards: Yes, I do believe that an empowered citizenry is best placed to enjoy these rights. However, I also believe that the complexity of the modern digital environment creates such an information asymmetry that it is important for strong advocates such as the Information Commissioner’s Office to act as a proxy on behalf of citizenry. I do not believe that we should devolve responsibility to citizens purely to ensure that high standards are set and adhered to in digital industries.

--- Later in debate ---
None Portrait The Chair
- Hansard -

We can see Mr Ustaran and Ms Bellamy and they can hear us, but we cannot hear them, so we will carry on with questioning Vivienne Artz.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Q A number of organisations have expressed concerns about moving to a situation in which we can refuse subject access requests or indeed charge a fee. Do you believe the thresholds in the Bill are appropriate and proportionate?

Vivienne Artz: I do think the thresholds are appropriate and proportionate. In practice, most organisations do not actually choose to charge, because actually it costs more to process the cheque than it is worth in terms of the revenue. Certainly, some sectors have been subject to very vexatious approaches through claims-management companies and others, where it is a bombarding exercise and it is unclear whether it is in the best interests of the consumers, or whether it is at their understanding and behest, to make a genuine subject access request.

I am a great supporter of subject access requests—they are a way for individuals to exercise their rights to understand what data is being processed—but as a result of quirks of how we operate often in the UK, they are being used as a pre-litigation investigative tool on the cheap, which is unfortunate and has meant that we have had to put in place additional safeguards to ensure they are used for the purpose for which they were provided, which is so that individuals can have transparency and clarity around what data is being processed and by whom.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Q Do you think the threshold for something to be considered vexatious or excessive is well understood?

Vivienne Artz: We have heard from the Information Commissioner that they are fairly clear on what that terminology means and it will reflect the existing body of law in practice. I will be perfectly honest: it is not immediately clear to me, but there is certainly a boundary within which that could be determined, and that is something we would rely on the Information Commissioner to provide further guidance on. It is probably also likely to be contextual.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Q How frequently do we expect such requests to be refused off the back of this legislation?

Vivienne Artz: I think it depends on the sector. I come from the financial services sector, so the types of subject access requests we get tend to be specific to us. I think organisations are going to be reluctant to refuse a subject access request because, at the end of the day, an individual can always escalate to the Information Commissioner if they feel they have been unfairly treated. I think organisations understand their responsibility to act in the best interests of the individual at all times.

None Portrait The Chair
- Hansard -

Q Ms Bellamy and Mr Ustaran, we can now hear both of you. Would you be kind enough to introduce yourselves?

Bojana Bellamy: Thank you for inviting me to this hearing. My name is Bojana Bellamy. I lead the Centre for Information Policy Leadership. We are a global data privacy and data policy think-and-do-tank operating out of London, Brussels and Washington, and I have been in the world of data privacy for almost 30 years.

Eduardo Ustaran: Good morning. My name is Eduardo Ustaran. I am a partner at Hogan Lovells, based in London, and I co-lead our global privacy and cyber-security practice, a team of over 100 lawyers who specialise in data protection law all over the world.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Q Thank you. Mr Combemale, will you set out some of the obstacles for your organisation, and how you would like the Bill to reduce them?

Chris Combemale: I think the single biggest one that has troubled our members since the implementation of GDPR is the issue around legitimate interest, which was raised by the hon. Member for Folkestone and Hythe. The main issue is that GDPR contains six bases of data processing, which in law are equal. For the data and marketing industry, the primary bases are legitimate interest and consent. For some reason it has become widely accepted through the implementation of GDPR that GDPR requires consent for marketing and for community activities. I am sure that you hear in your constituencies of many community groups that feel that they cannot go about organising local events because they must have consent to communicate. That has never been the intention behind the legislation; in fact, the European Court of Justice has always ruled that any legal interest could be a legitimate interest, including advertising and marketing.

If you look at what we do, which is effectively finding and retaining customers, the GDPR legislation says in recital 4 that privacy is a fundamental right, not an absolute right, and must be balanced against other rights, such as the right to conduct a business. You cannot conduct a business without the right to find and retain customers, just as you cannot run a charity without the right to find donors and volunteers who provide the money and the labour for your good cause. The clarification is really important across a wide range of use cases in the economy, but particularly ours. It was recognised in GDPR in recital 47. What the legislation does is give illustrative examples that are drawn from recitals 47, 48 and 49. They are not new examples; they are just given main text credibility. It is an illustrative list. Really, any legal interest could be a legitimate interest for the purpose of data providing, subject to necessity and proportionality, which we discussed earlier with the Information Commissioner.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Q We have heard already this morning that a number of words and phrases could have some ambiguity associated with them, such as the word “excessive”, and the Bill allowing certain cookies that are “low risk”. Do you think that the phrase “low risk” is well enough understood?

Chris Combemale: In the sector that I represent, we have a fairly clear understanding of the gradients of risk. As I was saying earlier, many companies do not share data with other companies. They are interested solely in the relationships that they have with their existing customers or prospects. In that sense, all the customer attitudes to privacy research that we do indicates that people are generally comfortable sharing data with companies they trust and do business with regularly.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Q Would that then be the definition of low risk?

Chris Combemale: I would not want to suggest what the legal definition is. To us in direct marketing and in the Data and Marketing Association, existing customer relationships—loyal customers who trust and are sometimes passionate about the brands they interact with—are low risk. Higher risk is when you come to share data with other companies, but again much of that activity and data sharing is essential to creating relevance. With the right protections, it is not a hugely high-risk activity. Then you can move on up, so the higher the degree of automation and the higher the degree of third-party data, the greater the risk, and you have to put in place mitigations accordingly. I am not a lawyer—I am just a poor practitioner—so I cannot define it from a legal point of view, but it is clear in the context of our industry how risk elevates depending on what you are doing.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Q I might come back to that in a second, but I think Neil wanted to add something.

Neil Ross: I was going to say that you can see how Chris has interpreted it through the lens of his industry, but the feedback we have had from our members, who operate across a range of industries, suggests that there is quite a lot of confusion about what that terminology might mean. The rest of the Bill aims to clarify elements of the GDPR and put them on the face of the Bill, but this provision seems to be going in the other direction. It raises concern and confusion.

That is why our approach has always been that you are going to get more clarity by aligning the Privacy and Electronic Communications Regulation 2003 more with the GDPR, which has clear legal bases, processes and an understanding of what is high and low risk—a balancing test, and so on—than through this fairly broad and poorly understood term “low risk”. We have concerns about how it will operate across a range of sectors.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Q Chris, you said that you are not a lawyer and cannot define what low risk is, but there will of course have to be some sort of definition. Have we captured that well enough?

Chris Combemale: Coming back to our discussion about legitimate interest and the proportionality balancing test, or legitimate interest impact assessments, when you are thinking about what you are planning to do with your customers, it is a requirement of good marketing without the legislation, but also within the legislation, to think about how what you are planning to do will impact your customers’ privacy, and then to mitigate. The important thing is not to say, “There’s no risk,” “It is low risk,” or “It is high risk”; it is to understand that the higher the risk, the greater the mitigations that you have to put in place. You may conclude that you should not do something because the risk level is too high. That is what balancing tests do, and decisions and outcomes result from them.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Q The potential difficulty here is that the responsibility is being put on the company. You have described a responsible company that categorises levels of risk and takes action accordingly. Without a clear definition, if it were a less scrupulous company, would there be a grey area?

Chris Combemale: We do a lot of work combating rogue traders, and we provide evidence to cases from our work with the telephone preference service and other activities. Rogue traders—especially those with criminal intent—will generally ignore the legislation anyway regardless of what you do and whether it lacks clarity or not, but I think you are right. An important part of GDPR is that it puts a lot of responsibility on companies to consider their particular activity, their particular customer base and the nature of their audience. Age UK, a charity that has a lot of vulnerable elderly customers, has to have greater protections and put more thought into how it is doing things than a nightclub marketing to under-30s, who are very technologically literate and digitally conversant.

When we do customer attitudes to privacy studies, we see three broad segmentations—data unconcerned, data pragmatist and data fundamentalist—and they require different treatment. It is incumbent on any company, in a marketing context, to understand who their audience and their customer base is, and design programmes appropriately to build trust and long-term relationships over time. That is an important element of GDPR, from a marketer’s perspective. I should add that it should not take legislation to force marketers to do that.

None Portrait The Chair
- Hansard -

There are five minutes left and there are two Members seeking to ask questions.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Mr Birtwistle?

Michael Birtwistle: I very much agree with my other panellists on those points. If you are thinking about concrete ways to improve what is in the Bill, the high level of protection around automated decision making is currently in article 22B. That looks at decisions using special category data, which, as an input, you could also add in there, looking at the output. You could include decisions that involve high-risk processing, which is already terminology used throughout the Bill. That would mean that, where automated decision making is used around decisions that involve high-risk processing, you would need meaningful human involvement, explicit consent or substantial public interest.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Q Jeni, can I come back to you on automated decision making? You have suggested that a requirement to notify people when an automated decision is made about them would be a useful inclusion in the Bill. Do you think enough consideration has been given to that?

Dr Tennison: The main thing that we have been arguing for is that it should be the wider set of decision subjects, rather than data subjects, who get rights relating to notification, or who can have a review. It is really important that there be notification of automated decision making, and as much transparency as possible about the details of it, and the process that an organisation has gone through in making an impact assessment of what that might mean for all individuals, groups and collective interests that might be affected by that automated decision making.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Q We can probably broadly split these decisions into two categories. Decisions are already being made by algorithms online, according to what we are looking at. If I look up a paint colour online, and then start getting adverts for different paint companies, I am not too worried about that. I am more concerned that decisions could be made in the workplace about me, or about energy tariffs, as we have heard. That is more serious. Is there a danger that if we notify individuals of all the automated decisions that are made, it will end up like the cookie scenario—we will just ignore it all?

Dr Tennison: I do not think it is a matter of notifying people about all automated decision making. The Bill suggests limiting that to legally or otherwise significant decisions, so that we have those additional rights only as regards things that will really have an impact on people’s lives.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Q And you are not comfortable that those have been considered properly in the Bill.

Dr Tennison: I am not comfortable that they are directed to the right people.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Q The subject, rather than the decision maker.

Dr Tennison: Yes.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Anna, did you want to come in on that?

Anna Thomas: The last question about the threshold is really important, and it tends to suggest that work should have separate consideration, which is happening all over the world. Last week, Canada introduced its automated decision-making directive, and extended it to work. We have been working with it on that. Japan has a strategy that deals expressly with work. In the United States there are various examples, including the California Privacy Rights Act, of rules that give work special attention in this context. Our proposal for addressing the issue of threshold is that you should always provide notification, assess, and do your best to promote positive impacts and reduce negative ones if the decision-making impacts access to work, termination, pay, contractual status or terms, and, for the rest, when there is significant impact.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Q Is there a danger that automated decisions could impact the Equality Act, if biases are not properly accounted for?

Anna Thomas: Yes, absolutely. In our model, we suggest that the impact assessment should incorporate not just the data protection elements, which we say remain essential, but equality of opportunity and disparity of outcome—for example, equal opportunity to promotion, or access to benefits. That should be incorporated in a model that forefronts and considers impacts on work.

Mike Amesbury Portrait Mike Amesbury
- Hansard - - - Excerpts

Q Anna, how would you strengthen the Bill? If you were to table an amendment around employees and AI, what would it be?

Anna Thomas: I would advise very clear additional rights, and a duty to notify in advance what, how and why AI is being used where it has these impacts, and where it meets the threshold that I was just asked about. I would also advise having more consultation throughout design, development and deployment, and ongoing monitoring, because AI changes, and there are impacts that we have not thought about or cannot ascertain in advance.

There should also be a separate obligation to conduct an algorithmic impact assessment. The Bill does nudge in that direction, but it says that there should be an assessment, rather than a data protection impact assessment. We suggest that the opportunity be grasped of clarifying that—at least in the workplace context, but arguably there are lessons more widely—the assessment ought to cover these fundamental aspects, and impacts at work.

Data Protection and Digital Information (No. 2) Bill (Second sitting) Debate

Full Debate: Read Full Debate

Data Protection and Digital Information (No. 2) Bill (Second sitting)

Carol Monaghan Excerpts
Committee stage
Wednesday 10th May 2023

(1 year, 6 months ago)

Public Bill Committees
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 10 May 2023 - (10 May 2023)
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - - - Excerpts

Q My final question is to all the witnesses. What are your views on the reforms to the ICO and their potential impact on its independence from Government?

Ms Irvine: We have concerns about the proposed changes and their potential impact on the independence of the Information Commissioner. I was able to listen to John Edwards speaking this morning, and I noted that he did not share those concerns, which I find surprising. The ICO is tasked with producing statutory codes of conduct, which are incredibly useful for my clients and for anyone working in this sector. The fact that the Secretary of State can, in effect, overrule these is concerning, and it must be seen as a limit on the Information Commissioner’s independence.

That leads to a concern that we have in relation to the adequacy decision that is in place between the EU and the United Kingdom. Article 52 of the GDPR states very clearly that a supervisory authority must have clear independence. The provisions relating to the independence of the Commission—the potential interference of the Secretary of State in law is enough to undermine independence—are therefore of concern to us.

Alexandra Sinclair: We would just say that it is not typical for an independent regulator to have its strategic objectives set by a Minister, and for a Minister to set those priorities without necessarily consulting. We consider that the ICO, as subject matter experts, are probably best placed to do that.

Jacob Smith: From our perspective, the only thing to add is that one way to improve the clauses on national security certificates and designation notices would be to give the ICO an increased role in oversight and monitoring, for instance. Obviously, if there are concerns about its independence, we would want to consider other mechanisms.

Carol Monaghan Portrait Carol Monaghan (Glasgow North West) (SNP)
- Hansard - -

Q Laura Irvine, in your briefing about the Bill you raised concerns about some of the language. We had some discussion this morning about the language and particular terms, such as what “vexatious” means, for example. Could you elaborate on your concerns?

Ms Irvine: Certainly. There are terms that have been used in data protection law since the 1984 Act. They were used again in the 1998 Act, echoed under the GDPR and included in all the guidance that has come from the Information Commissioner’s Office over the past number of years. In addition to that, there is case law that has interpreted many of those terms. Some of the proposed changes in the Bill introduce unexpected and unusual terms that will require interpretation. Even then, once we have guidance from the Information Commissioner, that guidance is sometimes not as helpful as interpretation by tribunals and courts, which is pretty sparse in this sector. The number of cases coming through the courts is limited—albeit that there is a lot more activity in the sector than there used to be. It simply presents a lot more questions and uncertainty in certain ways.

For my business clients, that is a great difficulty, and I certainly spend a lot of time advising clients on how I believe a matter—a phrase—will be interpreted, because I have knowledge of how data protection law works in general. That is based on my experience of the power of businesses and organisations, particularly in the third sector. Smaller bodies will often be challenged by a lack of knowledge and expertise, and that is a difficulty of introducing in legislation brand-new terms that are not familiar to practitioners, far less the organisations asked to implement the changes.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Q You also raised concerns about automated decision making. Again, we have heard quite a lot about that today. You talked about a case on automated decision making, with regard to benefit awards being made by local authorities. Can you tell us a bit about that and where the danger might lie here?

Ms Irvine: I expect that you have heard a lot of warnings about safety. I echo what Alexandra said earlier about the removal of the right not to have automated decisions taken by organisations. That is something that we were concerned to see in a society where this is happening more and more. The particular example that we gave came from a study that had been carried out by the Equality and Human Rights Commission. That was looking particularly at decision making in local authorities; at how AI or algorithms were being used to take decisions without enough transparency; and at whether this gave the individuals the right to challenge those decisions, which stems from the transparency that is built in. The challenge for any organisation using any automated decision making—particularly in the public sector, I would submit, where the impact can be extremely significant, particularly if we are talking about benefits—is making sure these organisations understand what the technology is doing, explaining that to individuals and giving them the right to object.

The changes in the Bill relax the restrictions on automated decision making and allow that to happen almost as a default, with safeguards as an add-on, whereas article 22 as currently drafted provides a right not to have automated decisions taken about an individual unless certain circumstances apply. To echo what Alexandra said, when more and more decisions are being made automatically without a human intervening, and certainly without a human intervening at the appropriate stage to prevent damage or harm to individuals, it would absolutely seem like the wrong time to make these changes and relaxations to the regime.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Thank you.

None Portrait The Chair
- Hansard -

You have all been superstars in our 10th panel. Thank you very much indeed for the evidence you have given this afternoon. We will now move on to the next panel.

Examination of Witness

Alex Lawrence-Archer gave evidence.

Data Protection and Digital Information (No. 2) Bill (Third sitting) Debate

Full Debate: Read Full Debate

Data Protection and Digital Information (No. 2) Bill (Third sitting)

Carol Monaghan Excerpts
Committee stage
Tuesday 16th May 2023

(1 year, 6 months ago)

Public Bill Committees
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 16 May 2023 - (16 May 2023)
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - - - Excerpts

The impact of clause 9 and the concerns around it should primarily be understood in relation to the definition contained in clause 2, so I refer hon. Members to my remarks in the debate on clause 2. I also refer them to my remarks on purpose limitation in clause 6. To reiterate both in combination, I should say that purpose limitation exists so that it is clear why personal data is being collected, and what the intention is behind its use. That means that people’s data should not largely be reused in ways not initially collected for, unless a new legal basis is obtained.

It is understandable that, where genuine scientific, historical and statistical research is occurring, and there is disproportionate effort to provide the information required to data subjects, there may be a need for exemption and to reuse data without informing the subject. However, that must be done only where strictly necessary. We must be clear that, unless there are proper boundaries to the definition of scientific data, this could be interpreted far too loosely.

I am concerned that, without amendment to clause 2, clause 9 could extend the problem of scientific research being used as a guise for using people’s personal data in malicious or pseudoscientific ways. Will the Minister tell us what protections will be in place to ensure that people’s data is not reused on scientific grounds for something that they would otherwise have objected to?

On clause 10, I will speak more broadly on law enforcement processing later in the Bill, but it is good to have clarity on the legal professional privilege exemptions. I have no further comments at this stage.

Carol Monaghan Portrait Carol Monaghan (Glasgow North West) (SNP)
- Hansard - -

What we are basically doing is changing the rights of individuals, who would previously have known when their data was used for a purpose other than that for which it was collected. The terms

“scientific or historical research, the purposes of archiving in the public interest or statistical purposes”

are very vague, and, according to the Public Law Project, open to wide interpretation. Scientific research is defined as

“any research that can reasonably described as scientific, whether publicly or privately funded”.

I ask the Minister: what protections are in place to ensure that private companies are not given, through this clause, a carte blanche to use personal data for the purpose of developing new products, without the need to inform the data subject?

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

These clauses relate to one of the fundamental purposes of the Bill, which is to facilitate genuine scientific research—obviously, that carries with it huge potential benefits in the areas of tackling disease or other scientific advances. We debated the definition of scientific research earlier in relation to clause 2. We believe that the definition is clear. In this particular case, the use of historical data can be very valuable. It is simply impractical for some organisations to reobtain consent when they may not even know where original data subjects are now located.

Data Protection and Digital Information (No. 2) Bill (Fourth sitting) Debate

Full Debate: Read Full Debate

Data Protection and Digital Information (No. 2) Bill (Fourth sitting)

Carol Monaghan Excerpts
Committee stage
Tuesday 16th May 2023

(1 year, 6 months ago)

Public Bill Committees
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 16 May 2023 - (16 May 2023)
Carol Monaghan Portrait Carol Monaghan (Glasgow North West) (SNP)
- Hansard - -

I rise to speak to my amendment 120. The explanatory notes to the Bill clarify that newly permitted automated decisions will not require the existing legal safeguard of notification, stating only:

“Where appropriate, this may include notifying data subjects after such a decision has been taken”.

Clause 11 would replace article 22 of the GDPR, which regulates AI decision making, with new articles 22A to 22D. According to Connected by Data, it is built on the faulty assumption that the people who are affected by automated decision making are data subjects—identifiable individuals within the data used to make the automated decision. However, now that AI decisions can be based on information about other people, it is becoming increasingly common for algorithms created through training on one set of people to be used to reach conclusions about another set.

A decision can be based on seemingly innocuous information such as someone’s postcode or whether they liked a particular tweet. Where such a decision has an impact on viewing recommendations for an online player, we would probably not be that concerned, but personal data is being used more and more to make decisions that affect whole groups of people rather than identified individuals. We need no reminding of the controversy that ensued when Ofqual used past exam results to grade students during the pandemic.

Another example might be an electricity company getting data from its customers about home energy consumption. Based on that data, it could automatically adjust the time of day at which it offered cheaper tariffs. Everyone who used the electricity company would be affected, whether data about their energy consumption patterns were used to make the decision or not. It is whether an automated decision has a legal or similarly significant effect on an individual that should be relevant to their rights around automated decision making.

Many of the rights and interests of decision subjects are protected through the Equality Act 2010, as the Committee heard in oral evidence last week. What is not covered by other legislation, however, is how data can be used in automated decisions and the rights of decision subjects to be informed about, control and seek redress around automated decisions with a significant effect on them. According to Big Brother Watch:

“This is an unacceptable dilution of a critical safeguard that will not only create uncertainty for organisations seeking to comply, but could lead to vastly expanded ADM operating with unprecedented opacity.”

Amendment 120 would require a data controller to inform a data subject whenever a significant decision about that subject was based solely on automated processing. I am pleased that the hon. Member for Barnsley East has tabled a similar amendment, which I support.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

The Government absolutely share hon. Members’ view of the importance of transparency. We agree that individuals who are subject to automated decision making should be made aware of it and should have information about the available safeguards. However, we feel that those requirements are already built into the Bill via article 22C, which will ensure that individuals are provided with information as soon as is practicable after such decisions have been taken. This will need to include relevant information that an individual would require to contest such decisions and seek human review of them.

The reforms that we propose take an outcome-focused approach to ensure that data subjects receive the right information at the right time. The Information Commissioner’s Office will play an important role in elaborating guidance on what that will entail in different circumstances.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Ms Monaghan, do you wish to move amendment 120 formally?

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

I will not move it formally, Mr Hollobone, but I may bring it back on Report.

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - - - Excerpts

I beg to move amendment 76, in clause 11, page 19, line 34, at end insert—

“5A. The Secretary of State may not make regulations under paragraph 5 unless—

(a) following consultation with such persons as the Secretary of State considers appropriate, the Secretary of State has published an assessment of the impact of the change to be made by the regulations on the rights and freedoms of data and decision subjects (with particular reference to children),

(b) the Commissioner has reviewed the Secretary of State’s statement and published a statement of the Commissioner’s views on whether the change should be made, with reasons, and

(c) the Secretary of State has considered whether to proceed with the change in the light of the Commissioner’s statement.”

This amendment would make the Secretary of State’s ability to amend the safeguards for automated decision-making set out in new Articles 22A to D subject to a requirement for consultation with interested parties and with the Information Commissioner, who would be required to publish their views on any proposed change.

--- Later in debate ---
Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Of course, the reports on incidents such as those at Fishmongers’ Hall and the Manchester Arena pointed to a general lack of effective collaboration between security forces and the police. It was not data that was the issue; it was collaboration.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I certainly accept that greater collaboration would have been beneficial as well, but there was a problem with data sharing and that is what the clause is designed to address.

As the hon. Member for Barnsley East will know, law enforcement currently operates under part 3 of the Data Protection Act when processing data for law enforcement purposes. That means that even when they work together, law enforcement and the intelligence services must each undertake separate assessments regarding the same joint-working processing.

Data Protection and Digital Information (No. 2) Bill (Fifth sitting) Debate

Full Debate: Read Full Debate

Data Protection and Digital Information (No. 2) Bill (Fifth sitting)

Carol Monaghan Excerpts
Committee stage
Thursday 18th May 2023

(1 year, 6 months ago)

Public Bill Committees
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 18 May 2023 - (18 May 2023)
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Obviously that is a matter for the Information Commissioner, but that is the overriding principal objective. I am about to set out some of the other objectives that the clause will introduce, but it is made very clear that the principal objective is to ensure the appropriate level of protection. Precisely how the Information Commissioner interprets “appropriate level of protection” is a matter for him, but I think it is fairly clear what that should entail, as he himself set out in his evidence.

As I have said, clause 27 introduces new duties that the commissioner must consider where they are relevant to his work in carrying out data protection functions: the desirability of promoting innovation and competition; the importance of the prevention, investigation, detection and prosecution of criminal offences; the need to safeguard public security and national security; and, where necessary, the need to consult other regulators when considering how the ICO’s work may affect economic growth, innovation and competition. There is also the statement of strategic priorities, which is introduced by clause 28. However, as I have indicated to the hon. Member for Newcastle upon Tyne Central, the commissioner will be clear that his primary focus should be to achieve the principal objective.

Clause 27 also introduces new reporting requirements for the commissioner in relation to the strategic framework. The commissioner will be required to publish a forward-looking strategy outlining how he intends to meet the new principal objective and duties, as well as pre-existing duties in the Deregulation Act 2015 and the Legislative and Regulatory Reform Act 2006.

Finally, the commissioner will be required to publish a review of what he has done to comply with the principal objective, and with the new and existing duties, in his annual report.

Carol Monaghan Portrait Carol Monaghan (Glasgow North West) (SNP)
- Hansard - -

I wonder whether part of the strategy might include a list of fees that could potentially be charged for accessing data. This idea of fees seems to be quite vague in terms of amounts and levels, so it would be useful to have some more information on that.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I think we will come on to some of the questions around the fees that are potentially payable, particularly by those organisations that may be required to provide more evidence, and the costs that that could entail. I will return to that subject shortly.

The new strategic framework acknowledges the breadth of the ICO’s remit and its impact on other areas. We believe that it will provide clarity for the commissioner, businesses and the general public on the commissioner’s objectives and duties. I therefore commend clause 27 to the Committee.

--- Later in debate ---
Stephanie Peacock Portrait Stephanie Peacock
- Hansard - - - Excerpts

I will focus most of my remarks on the group on clauses 39 and 40, as clause 41 and schedule 8 contain mostly consequential provisions, as the Minister outlined.

There are two major sections to the clauses. First, they require a complainant to issue their complaint to the controller directly, through allowing the commissioner to refuse to process their complaint otherwise. Secondly, they require the commissioner to refuse any complaint that is vexatious or excessive. I will speak to both in turn.

As the ICO grows and its remit expands, given the rapidly growing use of data in our society, it makes sense that its resources should be focused where they are most needed. Indeed, when giving evidence to the Committee, the Information Commissioner and Paul Arnold of the ICO stated that their current duty to investigate all complaints is creating a burden on their resources. Therefore, the proposal to require that complainants reach out to their data controller first, before contacting the ICO, seems to make sense, as it will allow the regulator to move away from handling low-level complaints, or complaints that are under way but not yet resolved. Instead, it would be able to refocus resources into handling complaints that have been mishandled or that offer a serious threat to data rights and public trust in data use.

Though that may be seen by some businesses and controllers as shifting an extra requirement on to them, the move should be viewed overall as a positive one, as it will require controllers to have clear processes in place for handling complaints and hopefully incentivise against conducting the kind of unlawful processing that prompts complaints in the first place. Indeed, the ICO already encourages that type of best practice, with complainants often encouraged to speak directly with the relevant data controller first before seeking help from the regulator. The clause would therefore simply formalise the arrangement, providing clarity on three levels. First, it would ensure that data subjects are clear on their right to complain directly to the controller. Secondly, it would ensure that controllers are clear on their duty to respond to such complaints. Finally, the ICO would be certain of its ability to refuse a request if the complainant refuses to comply with that model.

Although it is vital that the ICO is able to modernise and direct efforts where they are most needed, it is also vital that a healthy relationship is kept between the public—as data and decision subjects—and the ICO. The public must feel that the commissioner is there to support them in exercising their rights or seeking redress where necessary, not least because lodging a complaint can already be a difficult and distressing process. Indeed, even the commissioner himself said, when he first assumed his role, that he wanted to

“make it easy for people to access remedies if things go wrong.”

As such, it is pleasing to see safeguards built into the clause that ensure a complainant can still escalate their complaint to the ICO, and appeal any refusal from the commissioner to a tribunal.

Data rights groups, such as the Open Rights Group, hold much more serious concerns about the ability to refuse vexatious and excessive requests. Indeed, they worry that the new power will allow the ICO to ignore widespread and systemic abuses of data rights. As was the case with subject access requests, the difference between a complaint made in anger—which is quite likely, given that the complainant believes they have suffered an abuse of their rights—and a vexatious one must be clearly distinguished. The ICO should not be able to reject complaints of data abuses simply because the complainant acts in ways caused by distress.

As the response of the Government to their consultation reveals, only about half of respondents agreed with the proposal to set out criteria by which the ICO can decide not to investigate a complaint. The safeguard to appeal any refusal from the commissioner is therefore crucial in ensuring that there is a clear pathway for data subjects and decision subjects to dispute the decision of the ICO. It is also right that they should be informed of that safeguard, as well as told why their complaint has been refused, and given the opportunity to complain again with a more complete picture of information.

Overall, the clauses seems to strike the right balance between ensuring safeguards for data and decision subjects while helping the ICO to modernise. However, terms such as “vexatious” and “excessive” must be clearly defined to ensure that the ICO is able to exercise this new power of refusal proportionately and sensibly.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

I am looking for some clarification from the Minister. Under clause 39, it says:

“A controller must facilitate the making of complaints…such as providing a complaint form which can be completed electronically and by other means.”

Can the Minister clarify whether every data controller will have to provide an electronic means of making a complaint? For many small data controllers, which would include many of us in the room, providing an electronic means of complaint might require additional expertise and cost that they may not have. If it said, “and/or by other means”, which would allow a data controller to provide a paper copy, that might provide a little more reassurance to data controllers.

John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

Let me address the point of the hon. Member for Glasgow North West first. The intention of the clause is to ensure that complainants go first to the data controller, and the data controller makes available a process whereby complaints can be considered. I certainly fully understand the concern of the hon. Lady that it should not prove burdensome, particularly for small firms, and I do not believe that it would necessarily require an electronic means to do so. If that is not the case, I will tell her, but it seems to me that the sensible approach would be for data controllers to have a process that the Information Commissioner will accept is available to complainants first, before a complaint is possibly escalated to the next stage.

With regard to the point of the hon. Member for Barnsley East, we have debated previously the change in the threshold to “vexatious” and “excessive”, and we may continue to disagree on that matter.

Question put and agreed to.

Clause 39 accordingly ordered to stand part of the Bill.

Clauses 40 and 41 ordered to stand part of the Bill.

Schedule 8 agreed to.

Clause 42

Consequential amendments to the EITSET Regulations

Amendment made: 47, Clause 42, page 72, line 12, at end insert—

“(7A) In paragraph 13 (modification of section 155 (penalty notices)), in sub-paragraph (3)(c), for “for “data subjects”” there were substituted “for the words from “data subjects” to the end”.”.—(Sir John Whittingdale.)

This amendment inserts an amendment of Schedule 2 to the EITSET Regulations which is consequential on the amendment of section 155(3)(c) of the Data Protection Act 2018 by Schedule 4 to the Bill.

Clause 42, as amended, ordered to stand part of the Bill.

Clause 43

Protection of prohibitions, restrictions and data subject’s rights

Question proposed, That the clause stand part of the Bill.

Data Protection and Digital Information (No. 2) Bill (Eighth sitting) Debate

Full Debate: Read Full Debate

Data Protection and Digital Information (No. 2) Bill (Eighth sitting)

Carol Monaghan Excerpts
Committee stage
Tuesday 23rd May 2023

(1 year, 6 months ago)

Public Bill Committees
Read Full debate Data Protection and Digital Information Bill 2022-23 Read Hansard Text Amendment Paper: Public Bill Committee Amendments as at 23 May 2023 - (23 May 2023)
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I am happy to provide the further detail that the hon. Lady has requested.

Question put and agreed to.

Clause 104 accordingly ordered to stand part of the Bill.

Clause 105

Oversight of biometrics databases

Carol Monaghan Portrait Carol Monaghan (Glasgow North West) (SNP)
- Hansard - -

I beg to move amendment 123, in clause 105, page 128, line 22, leave out subsections (2) and (3).

None Portrait The Chair
- Hansard -

With this it will be convenient to discuss the following:

Clause stand part.

New clause 17—Transfer of functions to the Investigatory Powers Commissioner’s Office

“The functions of the Surveillance Camera Commissioner are transferred to the Investigatory Powers Commissioner.”

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

Society is witnessing an unprecedented acceleration in the capability and reach of surveillance technologies. Such an acceleration calls for protections and safeguards. Clause 105, however, does the opposite and seeks to abolish both the office of the Surveillance Camera Commissioner and its functions. The explanatory notes to the Bill state that the functions of the office of the Surveillance Camera Commissioner are duplicated and covered by the Information Commissioner’s Office and its CCTV code of practice. That is not the case: the code is advisory only and is primarily concerned with data processes, not with actual surveillance.

Amendment 123 and new clause 17 would retain the functions of the Surveillance Camera Commissioner but transfer them to the Investigatory Powers Commissioner’s Office, thus preserving those necessary safeguards. The IPCO already scrutinises Government activity and deals with the covert use of surveillance cameras, so dealing with overt cameras as well would be a natural extension of its function.

--- Later in debate ---
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

I am grateful to the hon. Members for Glasgow North West and for Barnsley East for the points they have made. The hon. Member for Glasgow North West, in moving the amendment, was right to say that the clause as drafted abolishes the role of the Surveillance Camera Commissioner and the surveillance camera code that the commissioner promotes compliance with. The commissioner and the code, however, are concerned only with police and local authority use in England and Wales. Effective, independent oversight of the use of surveillance camera systems is critical to public trust. There is a comprehensive legal framework for the use of such systems, but the oversight framework is complex and confusing.

The ICO regulates the processing of all personal data by all UK organisations under the Data Protection Act; that includes surveillance camera systems operated by the police and local authorities, and the ICO has issued its own video surveillance guidance. That duplication is confusing for both the operators and the public and it has resulted in multiple and sometimes inconsistent guidance documents covering similar areas. The growing reliance on surveillance from different sectors in criminal investigations, such as footage from Ring doorbells, means that it is increasingly important for all users of surveillance systems to have clear and consistent guidance. Consolidating guidance and oversight will make it easier for the police, local authorities and the public to understand. The ICO will continue to provide independent regulation of the use of surveillance camera systems by all organisations. Indeed, the chair of the National Police Data Board, who gave evidence to the Committee, said that that will significantly simplify matters and will not reduce the level of oversight and scrutiny placed upon the police.

Amendment 123, proposed by the hon. Member for Glasgow North West, would retain the role of the Surveillance Camera Commissioner and the surveillance camera code. In our view, that would simply continue the complexity and duplication with the ICO’s responsibilities. Feedback that we received from our consultation showed broad support for simplifying the oversight framework, with consultees agreeing that the roles and responsibilities, in particular in relation to new technologies, were unclear.

The hon. Lady went on to talk about the oversight going beyond that of the Information Commissioner, but I point out that there is a comprehensive legal framework outside the surveillance camera code. That includes not only data protection, but equality and human rights law, to which the code cross-refers. The ICO and the Equality and Human Rights Commission will continue to regulate such activities. There are other oversight bodies for policing, including the Independent Office for Police Conduct and His Majesty’s inspectorate of constabulary, as well as the College of Policing, which provide national guidance and training.

The hon. Lady also specifically mentioned the remarks of the Surveillance Camera Commissioner about Chinese surveillance cameras. I will simply point out that the responsibility for oversight, which the ICO will continue to have, is not changed in any way by the Bill. The Information Commissioner’s Office continues to regulate all organisations’ use of surveillance cameras, and it has issued its own video surveillance guidance.

New clause 17 would transfer the functions of the commissioner to the Investigatory Powers Commissioner. As I have already said, we believe that that would simply continue to result in oversight resting in two different places, and that is an unnecessary duplication. The Investigatory Powers Commissioner’s Office oversees activities that are substantially more intrusive than those relating to overt surveillance cameras. IPCO’s existing work requires it to oversee over 600 public authorities, as well as several powers from different pieces of legislation. That requires a high level of expertise and specialisation to ensure effective oversight.

For those reasons, we believe that the proposals in the clause to bring the oversight functions under the responsibility of the Information Commissioner’s Office will not result in any reduction in oversight, but will result in the removal of duplication and greater clarity. On that basis, I am afraid that I am unable to accept the amendment, and I hope that the hon. Lady will consider withdrawing it.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

I thank the Minister for responding to my amendments. However, we have a situation where we are going from having a specialist oversight to a somewhat more generalist oversight. That cannot be good when we are talking about this fast-moving technology. I will withdraw my amendment for the moment, but I reserve the right to bring it back at a later stage. I beg to ask leave to withdraw the amendment.

Amendment, by leave, withdrawn.

Clause 105 ordered to stand part of the Bill.

Clause 106

Oversight of biometrics databases

Stephanie Peacock Portrait Stephanie Peacock
- Hansard - - - Excerpts

I beg to move amendment 119, in clause 106, page 130, line 7, leave out

“which allows or confirms the unique identification of that individual”.

This amendment is intended to ensure that the definition of biometric data in the Bill includes cases where that data is used for the purposes of classification (and not just unique identification).

Data Protection and Digital Information Bill

Carol Monaghan Excerpts
John Whittingdale Portrait Sir John Whittingdale
- Hansard - - - Excerpts

As the hon. Gentleman knows, I strongly share his view about the need to act against abuse of legal procedures by the Russian state. As he will appreciate, this aspect of the Bill emanated from the Home Office. However, I have no doubt that my colleagues in the Home Office will have heard the perfectly valid point he makes. I hope that they will be able to provide him with further information about it, and I will draw the matter to their attention.

I wish to say just a few more words about the biometric material received from our international partners, as a tool in protecting the public from harm. Sometimes, counter-terrorism police receive biometrics from international partners with identifiable information. Under current laws, they are not allowed to retain these biometrics unless they were taken in the past three years. That can make it harder for our counter-terrorism police to carry out their job effectively. That is why we are making changes to allow the police to take proactive steps to pseudonymise biometric data received from international partners—obviously, that means holding the material without including information that identifies the person—and hold indefinitely under existing provisions in the Counter-Terrorism Act information that identifies the person it relates to. Again, those changes have been requested by counter-terrorism police and will support them to better protect the British public.

The national underground asset register, or NUAR, is a digital map that will improve both the efficiency and safety of underground works, by providing secure access to privately and publicly owned location data about the pipes and cables beneath our feet. This will underpin the Government’s priority to get the economy growing by expediting projects such as new roads, new houses and broadband roll-out—the hon. Gentleman and I also share a considerable interest in that.

The NUAR will bring together valuable data from more than 700 public and private sector organisations about the location of underground utilities assets. This will deliver £490 million per year of economic growth, through increased efficiency, reduced asset strikes and reduced disruptions for citizens and businesses. Once operational, the running of the register will be funded by those who benefit most. The Government’s amendments include powers to, through regulations, levy charges on apparatus owners and request relevant information. The introduction of reasonable charges payable by those who benefit from the service, rather than the taxpayer, will ensure that the NUAR is a sustainable service for the future. Other amendments will ensure that there is the ability to realise the full potential of this data for other high-value uses, while respecting the rights of asset owners.

Carol Monaghan Portrait Carol Monaghan (Glasgow North West) (SNP)
- Hansard - -

Is any consideration given to the fact that that information could be used by bad actors? If people are able to find out where particular cables or pipes are, they also have the ability to find weakness in the system, which could have implications for us all.

--- Later in debate ---
Patrick Grady Portrait Patrick Grady
- View Speech - Hansard - - - Excerpts

It is difficult to know where to start. The Minister described this as a Brexit opportunities Bill. Of course, Brexit was supposed to be about this place taking back control. It was to be the triumph of parliamentary sovereignty over faceless Brussels bureaucrats, the end of red tape and regulations, and the beginning of a glorious new era of freedom unencumbered by all those complicated European Union rules and requirements that did silly things like keeping people safe and protecting their human rights.

Yet here we are with 200 pages of new rules and regulations and a further 160 pages of amendments. This time last week, the amendment paper was 10 pages long; today it is 15 times that and there is barely any time for any kind of proper scrutiny. Is this what Brexit was for: to hand the Government yet more sweeping powers to regulate and legislate without any meaningful oversight in this place? To create additional burdens on businesses and public services, just for the sake of being different from the European Union? The answer to those questions is probably yes.

I will speak briefly to the SNP amendments, but I will also consider some of the most concerning Government propositions being shoehorned in at the last minute in the hope that no one will notice. How else are we supposed to treat Government new schedule 1? The Minister is trying to present it as benign, or even helpful, as if it had been the Government’s intention all along to grant the DWP powers to go snooping around in people’s bank accounts, but if it has been so long in coming, as he said, why is it being added to the Bill only now? Why was it not in the original draft, or even brought to Committee, where there could at least have been detailed scrutiny or the opportunity to table further amendments?

Of course there should be action to tackle benefit fraud—we all agree on that—but the DWP already has powers, under section 109B of the Social Security Administration Act 1992, to issue a notice to banks to share bank account information provided that they have reasonable grounds to believe that an identified, particular person has committed, or intends to commit, a benefit offence. In other words, where there is suspicion of fraud, the DWP can undertake checks on a claimant’s account. Incidentally, there should also be action to tackle tax evasion and tax fraud. The Government evidently do not require from the Bill any new powers in that area, so we can only assume that they are satisfied that they have all the powers they need and that everything possible is being done to ensure that everybody pays the tax that they owe.

The powers in new schedule 1 go much further than the powers that the DWP already has. By their own admission, the Government will allow the DWP to carry out—proactively, regularly, at scale and on a speculative basis—checks on the bank accounts and finances of claimants. The new schedule provides little in the way of safeguards or reassurances for people who may be subject to such checks. The Secretary of State said that

“only a minimum amount of data will be accessed and only in instances which show a potential risk of fraud and error”.

In that case, why is the power needed at all, given that the Government already have the power to investigate where there is suspicion of fraud? And how can only “a minimum amount” of data be accessed when the Government say in the same breath that they want to be able to carry out those checks proactively and at scale.

Carol Monaghan Portrait Carol Monaghan
- Hansard - -

My hon. Friend probably shares my concern that we are moving into a new era in which the bank account details of people claiming with the DWP must be shared as a matter of course. That is the only reason I can see for such sweeping amendments, which will impact on so many people.

Patrick Grady Portrait Patrick Grady
- Hansard - - - Excerpts

There is a huge risk. It is clear that the Government’s starting point is very often to avoid giving people the social security and welfare support that they might need to live a dignified life. We know that the approach in Scotland is incredibly different.

That is the thing: as with so much of this Bill, there is a good chance that minority groups or people with protected characteristics will find themselves most at risk of those checks and of coming under the proactive suspicion of the DWP. As we said when moving the committal motion, we have not had time to seek properly to interrogate that point. In his attempts to answer interventions, the Minister kind of demonstrated why scrutiny has been so inadequate. At the same time, the Government’s own Back Benchers, including the right hon. Member for Haltemprice and Howden (Mr Davis), the hon. Member for Yeovil (Mr Fysh) and others, are tabling quite thoughtful amendments—that is never a great sign for a Government. The Government should not be afraid of the kinds of safeguards and protections that they are proposing.

The SNP amendments look to remove the most dangerous and damaging aspects of the Bill—or, at the very least, to amend them slightly. Our new clause 44 and amendment 229 would have the effect of transferring the powers of the Surveillance Camera Commissioner to the Investigatory Powers Commissioner. That should not be all that controversial. Professor William Webster, a director of the Centre for Research into Information, Surveillance and Privacy, has warned that the Bill, as it stands, does not provide adequate mechanisms for the governance and oversight of surveillance cameras. The amendment would ensure that oversight is retained, the use of CCTV continues to be regulated, and public confidence in such technologies is strengthened, not eroded. CCTV is becoming more pervasive in the modern world—not least with the rise of video doorbells and similar devices that people can use in their own personal circumstances—so it is concerning that the Government are seeking to weaken rather than strengthen protections in that area.

The SNP’s amendment 222 would leave out clause 8, and our amendment 223 would leave out clause 10, removing the Government’s attempts to crack down on subject access requests. The effect of those clauses might, in the Government’s mind, remove red tape from businesses and other data-controlling organisations, but it would do so at the cost of individuals’ access to their own personal data. That is typified by the creation of a new and worryingly vague criterion of “vexatious or excessive” as grounds to refuse a subject access request. Although that might make life easier for data controllers, it will ultimately place restrictions on data subjects’ ability to access what is, we must remember, their data. There have been attempts—not just throughout Committee stage, but even today from the Opposition—to clarify exactly the thresholds for “vexatious and excessive” requests. The Government have been unable to answer, so those clauses should not be allowed to stand.

Amendment 224 also seeks to leave out clause 12, expressing the concerns of many stakeholders about the expansion in scope of automated decision making, alongside an erosion of existing protections against automated decision making. The Ada Lovelace Institute states that:

“Against an already-poor landscape of redress and accountability in cases of AI harms, the Bill’s changes will further erode the safeguards provided by underlying regulation.”

There is already significant and public concern about AI and its increasingly pervasive impact.

Clause 12 fails to offer adequate protections against automated decision making. An individual may grant consent for the processing of their data—indeed, they might have no choice but to do so—but that does not mean that they will fully understand or appreciates how that data will be processed or, importantly, how decisions will be made. At the very least, the Government should accept our amendment 225, which would require the controller to inform the data subject when an automated decision has been taken in relation to the data subject. I suspect, however, that that is unlikely—just as it is unlikely that the Government will accept Labour amendments 2 and 5, which we are happy to support—so I hope the House will have a chance to express its view on clause 12 as a whole later on.

The SNP’s amendments 226, 227 and 228 would have the effect of removing clauses 26, 27 and 28 respectively. Those clauses give the Home Secretary significant new powers to authorise the police to access personal data, and a power to issue a “national security” certificate telling the police that they do not need to comply with many important data protection laws and rules that they would otherwise have to obey, which would essentially give police immunity should they use personal data in a way that would otherwise be illegal—and they would no longer need to respond to requests under the Freedom of Information Act 2000. We have heard no explanation from the Government for why they think that the police should be allowed to break the law and operate under a cover of darkness.

The Bill will also expand what counts as an “intelligence service” for the purposes of data protection law. Again, that would be at the Home Secretary’s discretion, with a power to issue a designation notice allowing law enforcement bodies to take advantage of the more relaxed rules in the Data Protection Act 2018—otherwise designed for the intelligence agencies—whenever they are collaborating with the security services. The Government might argue that that creates a simplified legal framework, but in reality it will hand massive amounts of people’s personal information to the police, including the private communications of people in the UK and information about their health histories, political beliefs, religious beliefs and private lives.

Neither the amended approach to national security certificates nor the new designation notice regime would be reviewable by the courts, and given that there is no duty to report to Parliament, Parliament might never find out how and when the powers have been used. If the Home Secretary said that the police needed to use those increased powers in relation to national security, his word would be final. That includes the power to handle sensitive data in ways that would otherwise, under current legislation, be criminal.

The Home Secretary is responsible for both approving and reviewing designation notices. Only a person who is directly affected by such a notice will be able to challenge it, yet the Home Secretary would have the power to keep the notice secret, meaning that those affected would not even know about it and could not possibly challenge it. Those are expansive broadenings not just of the powers of the secretary of state, but of the police and security services. The Government have not offered any meaningful reassurance about how those powers will be applied or what oversight will exist, which is why our amendments propose scrapping those clauses entirely.

There remain other concerns about many aspects of the Bill. The British Medical Association and the National AIDS Trust have both raised questions about patients’ and workers’ right to privacy. The BMA calls the Bill

“a departure from the existing high standards of data protection for health data”.

We welcome the amendments to that area, particularly amendment 11, tabled by the hon. Member for Jarrow (Kate Osborne), which we will be happy to support should it be selected for a vote.

I am afraid that I have to echo the concerns expressed by the Labour Front-Bench spokesman, the hon. Member for Rhondda (Sir Chris Bryant), about new clause 45, which was tabled by the hon. Member for Aberconwy (Robin Millar). That clause perhaps has laudable aims, but it is the view of the Scottish National party that it is not for this place to legislate in that way, certainly not without consultation and ideally not without consent from the devolved authorities. We look forward to hearing the hon. Member for Aberconwy make his case, but I do not think we are in a position to support his new clause at this time.