Data Protection and Digital Information (No. 2) Bill Debate

Department: Department for Science, Innovation & Technology

Data Protection and Digital Information (No. 2) Bill

Rebecca Long Bailey Excerpts
2nd reading
Monday 17th April 2023

Commons Chamber
Julia Lopez

I thank the hon. Gentleman for pressing me on that important point. I know that many businesses are seeking to maintain adequacy. If we want a business-friendly regime, we do not want to create regulatory disruption for businesses, particularly those that trade with Europe and want to ensure that there is a free flow of data. I can reassure him that we have been in constant contact with the European Commission about our proposals. We want to make sure that there are no surprises. We are currently adequate, and we believe that we will maintain adequacy following the enactment of the Bill.

Rebecca Long Bailey (Salford and Eccles) (Lab)

I was concerned to hear from the British Medical Association that if the EU were to conclude that data protection legislation in the UK was inadequate, that would present a significant problem for organisations conducting medical research in the UK. Given that so many amazing medical researchers across the UK currently work in collaboration with EU counterparts, can the Minister assure the House that the Bill will not represent an inadequacy in comparison with EU legislation as it stands?

Julia Lopez

I hope that my previous reply reassured the hon. Lady that we intend to maintain adequacy, and we do not consider that the Bill will present a risk in that regard. What we are trying to do, particularly in respect of medical research, is make it easier for scientists to innovate and conduct that research without constantly having to return for consent when it is apparent that consent has already been granted for particular medical data processing activities. We think that will help us to maintain our world-leading position as a scientific research powerhouse.

Alongside new data bridges, the Secretary of State will be able to recognise new transfer mechanisms for businesses to protect international transfers. Businesses will still be able to transfer data across borders with the compliance mechanisms that they already use, avoiding needless checks and costs. We are also delighted to be co-hosting, in partnership with the United States, the next workshop of the global cross-border privacy rules forum in London this week. The CBPR system is one of the few existing operational mechanisms that, by design, aims to facilitate data flows on a global scale.

World-class research requires world-class data, but right now many scientists are reluctant to get the data they need to get on with their research, for the simple reason that they do not know how research is defined. They can also be stopped in their tracks if they try to broaden their research or follow a new and potentially interesting avenue. When that happens, they can be required to go back and seek permission all over again, even though they have already gained that permission earlier to use personal data. We do not think that makes sense. The pandemic showed that we cannot risk delaying discoveries that could save lives. Nothing should be holding us back from curing cancer, tackling disease or producing new drugs and treatments. This Bill will simplify the legal requirements around research so that scientists can work to their strengths with legal clarity on what they can and cannot do.

The Bill will also ensure that people benefit from the results of research by unlocking the potential of transformative technologies. Taking artificial intelligence as an example, we have recently published our White Paper: “AI regulation: a pro-innovation approach”. In the meantime, the Bill will ensure that organisations know when they can use responsible automated decision making and that people know when they can request human intervention where those decisions impact their lives, whether that means getting a fair price for the insurance they receive after an accident or a fair chance of getting the job they have always wanted.

I spoke earlier about the currency of trust and how, by maintaining it through high data protection standards, we are likely to see more data sharing, not less. Fundamental to that trust will be confidence in the robustness of the regulator. We already have a world-leading independent regulator in the Information Commissioner’s Office, but the ICO needs to adapt to reflect the greater role that data now plays in our lives alongside its strategic importance to our economic competitiveness. The ICO was set up in the 1980s for a completely different world, and the pace, volume and power of the data we use today has changed dramatically since then.

It is only right that we give the regulator the tools it needs to keep pace and to keep our personal data safe while ensuring that, as an organisation, it remains accountable, flexible and fit for the modern world. The Bill will modernise the structure and objectives of the ICO. Under this legislation, protecting our personal data will remain the ICO’s primary focus, but it will also be asked to focus on how it can empower businesses and organisations to drive growth and innovation across the UK, and support public trust and confidence in the use of personal data.

The Bill is also important for consumers, helping them to share less data while getting more product. It will support smart data schemes that empower consumers and small businesses to make better use of their own data, building on the extraordinary success of open banking tools offered by innovative businesses, which help consumers and businesses to manage their finances and spending, track their carbon footprint and access credit.

--- Later in debate ---
Lucy Powell (Manchester Central) (Lab/Co-op)

It is good finally to get the data Bill that was promised so long ago. We nearly got there in the halcyon days of September 2022, under the last Prime Minister, after it had been promised by the Prime Minister before. However, the Minister has a strong record of bringing forward and delivering things that the Government have long promised. I also know that she has another special delivery coming soon, which I very much welcome and wish her all the best with. She took a lot of interventions and I commend her for all that bobbing up and down while so heavily pregnant. I would also like to send my best wishes to the Secretary of State, who let me know that she could not be here today. I would also like to wish her well with her imminent arrival. There is lots of delivery going on today.

We are in the midst of a digital and data revolution, with data increasingly being the most prized asset and fundamental to the digital age, but this Bill, for all its hype, fails to meet that moment. Even since the Bill first appeared on the Order Paper last September, AI chatbots have become mainstream, TikTok has been fined for data breaches and banned from Government devices, and AI image generators have fooled the world into thinking that the Pope had a special papal puffer coat. The world, the economy, public services and the way we live and communicate are changing fast. Despite these revolutions, this data Bill does not rise to the challenges. Instead, it tweaks around the edges of GDPR, making an already dense set of privacy rules even more complex.

The UK can be a global leader in the technologies of the future. We are a scientific superpower, we have some of the world’s best creative industries and now, outside the two big trading blocs, we could have the opportunities of nimbleness and being in the vanguard of world-leading regulation. In order to harness that potential, however, we need a Government who are on the pitch, setting the rules of the game and ensuring that the benefits of new advances are felt by all of us and not just by a handful of companies. The Prime Minister can tell us again how much he loves maths, but without taking the necessary steps to support the data and digital economy, his sums just do not add up.

The contents of this Bill might seem technical—as drafted, they are incredibly technical—but they matter greatly to every business, consumer, citizen and organisation. Data is a significant source of power and value. It shapes the relationship between business and consumers, between the state and citizens, and much, much more. Data is critical to innovation and economic growth, to modern public services, to democratic accountability and to transforming societies, if harnessed and shaped in the interest of the many, not simply the few—pretty major, I would say.

Now we have left the EU, the UK has an opportunity to lead the world in this area. The next generation of world-leading regulation could allow small businesses and start-ups to compete with the monopolies in big tech, as we have already heard. It could foster a climate of open data, enable public services to use and share data for improved outcomes, and empower consumers and workers to have control over how their data is used. In the face of this huge challenge, the Bill is at best a missed opportunity, and at worst adds another complicated and uncertain layer of bureaucracy. Although we do not disagree with its aims, there are serious questions about whether the Bill will, in practice, achieve them.

Data reform and new regulation are welcome and long overdue. Now that we have left the EU, we need new legislation to ensure that we both keep pace with new developments and make the most of the opportunities. The Government listened to some of the concerns raised in response to the consultation and removed most of the controversial and damaging proposals. GDPR has been hard to follow for some businesses, especially small businesses and start-ups, so streamlining and simplifying data protection rules is a welcome aim. However, we will still need some of them to meet EU data adequacy rules.

The aim of shifting away from tick-box exercises towards a more proactive and systematic approach to regulation is also good. Better and easier data sharing between public services is essential, and some of the changes in that area are welcome, although we will need assurances that private companies will not benefit commercially from personal health data without people’s say so. Finally, nobody likes nuisance calls or constant cookie banners, and the moves to reduce or remove them are welcome, although there are questions about whether the Bill lives up to the rhetoric.

In many areas, however, the Bill threatens to take us backwards. First, it may threaten our ability to share data with the EU, which would be seriously bad for business. Given the astronomical cost to British businesses should data adequacy with the EU be lost, businesses and others are rightly looking for more reassurances that the Bill will not threaten these arrangements. The EU has already said that the vast expansion of the Secretary of State’s powers, among other things, may put the agreement in doubt. If this were to come to pass, the additional burdens on any business operating within the EU, even vaguely, would be enormous.

British businesses, especially small businesses, have faced crisis after crisis. Many only just survived through covid and are now facing rising energy bills that threaten to push them over the edge. According to the Information Commissioner,

“most organisations we spoke to had a plea for continuity.”

The Government must go further on this.

Secondly, the complex new requirements in this 300-page Bill threaten to add more hurdles, rather than streamlining the process. Businesses have serious concerns that, having finally got their head around GDPR, they will now have to comply with both GDPR and all the new regulations in this Bill. That is not cutting red tape, in my view.

Thirdly, the Bill undermines individual rights. Many of the areas in which the Bill moves away from GDPR threaten to reduce protection for citizens, making it harder to hold to account the big companies that process and sell our data. Subject access requests are being diluted, as the Government are handing more power to companies to refuse such requests on the grounds of being excessive or vexatious. They are tilting the rules in favour of the companies that are processing our data. Data protection impact assessments will no longer be needed, and protections against automated decision making are being weakened.

Rebecca Long Bailey

AlgorithmWatch explains that automated decision making is “never neutral.” Outputs are determined by the quality of the data that is put into the system, whether that data is fair or biased. Machine learning will propagate and enhance those differences, and unfortunately it already has. Is my hon. Friend concerned that the Bill removes important GDPR safeguards that protect the public from algorithmic bias and discrimination and, worse, provides Henry VIII powers that will allow the Secretary of State to make sweeping regulations on whether meaningful human intervention is required at all in these systems?

Lucy Powell

My hon. Friend makes two very good points, and I agree with her on both. I will address both points in my speech.

Taken together, these changes, alongside the Secretary of State’s sweeping new powers, will tip the balance away from individuals and workers towards companies, which will be able to collect far more data for many more purposes. For example, the Bill could have a huge impact on workers’ rights. There are ever more ways of tracking workers, from algorithmic management to recruitment by AI. People are even being line managed by AI, with holiday allocation, the assignment of roles and the determination of performance being decided by algorithm. This is most serious when a low rating triggers discipline or dismissal. Transparency and accountability are particularly important given the power imbalance between some employers and workers, but the Bill threatens to undermine them.

If a person does not even know that surveillance or algorithms are being used to determine their performance, they cannot challenge it. If their privacy is being infringed to monitor their work, that is a harm in itself. If a worker’s data is being monetised by their company, they might not even know about it, let alone see a cut. The Bill, in its current form, undermines workers’ ability to find out what data is held about them and how it is being used. The Government should look at this again.

The main problem, however, is not what is in the Bill but, rather, what is not. Although privacy is, of course, a key issue in data regulation, it is not the only issue. Seeing regulation only through the lens of privacy can obscure all the ways that data can be used and can impact on communities. In modern data processing, our data is not only used to make decisions about us individually but pooled together to analyse trends and predict behaviours across a whole population. Using huge amounts of data, companies can predict and influence our behaviour. From Netflix recommendations to recent examples of surge pricing in music and sports ticketing, to the monitoring of covid outbreaks, the true power of data is in how it can be analysed and deployed. This means the impact as well as the potential harms of data are felt well beyond the individual level.

Moreover, as we heard from my hon. Friend the Member for Salford and Eccles (Rebecca Long Bailey), the algorithms that analyse data often replicate and further entrench society’s biases. Facial recognition that is trained on mostly white faces will more likely misidentify a black face—something that I know the parliamentary channel sometimes struggles with. AI language bots produce results that reflect the biases and limitations of their creators and the data on which they are trained. This Bill does not take on any of these community and societal harms. Who is responsible when the different ways of collecting and using data harm certain groups or society as a whole?

As well as the harms, data analytics offers huge opportunities for public good, as we have heard. Opening up data can ensure that scientists, public services, small businesses and citizens can use data to improve all our lives. For example, Greater Manchester has, over the years, linked data across a multitude of public services to hugely improve our early years services, but this was done entirely locally and in the face of huge barriers. Making systems and platforms interoperable could ensure that consumers can switch services to find the best deal, and it could support smaller businesses to compete with existing giants.

Establishing infrastructure such as a national research cloud and data trusts could help small businesses and not-for-profit organisations access data and compete with the giants. Citymapper is a great example, as it used Transport for London’s open data to build a competitor to Google Maps in London. Open approaches to data will also provide better oversight of how companies use algorithms, and of the impact on the rest of us.

Finally, where are the measures to boost public trust? After the debacle of the exam algorithms and the mishandling of GP data, which led millions of people to withdraw their consent, and with workers feeling the brunt but none of the benefits of surveillance and performance management, we are facing a crisis in public trust. Rather than increasing control over and participation in how our data is used, the Bill is removing even the narrow privacy-based protections we already have. In all those regards, it is a huge missed opportunity.

To conclude, with algorithms increasingly making important decisions about how we live and work, data protection has become ever more important to ensure that people have knowledge, control, confidence and trust in how and why data is being used. A data Bill is needed, but we need one that looks towards the future and harnesses the potential of data to grow our economy and improve our lives. Instead, this piecemeal Bill tinkers around the edges, weakens our existing data protection regime and could put our EU adequacy agreement at risk. We look forward to addressing some of those serious shortcomings in Committee.

Data Protection and Digital Information (No. 2) Bill (Second sitting) Debate

Rebecca Long Bailey Excerpts
Committee stage
Wednesday 10th May 2023


Public Bill Committees
Damian Collins

Q I have a final question. We have this legislation, and then different tech companies and operating systems have separate guidelines that they work to as well. One of the issues the Government faced with, for instance, the covid vaccine app, was that it had to comply with the operating rules for Google and iOS, regardless of what the Government wanted it to do. Thinking of the work that your organisation has been involved in, are there still significant restrictions that go beyond the legal thresholds because different operating systems set different requirements?

Jonathan Sellors: I do not think I am really the best qualified person to talk about the different Android and Apple operating systems, although we did a lot of covid-related work during the pandemic, which we were not restricted from doing.

Tom Schumacher: I would say that this comes up quite a lot for Medtronic in the broader medtech industry. I would say a couple of things. First, this is an implementation issue more than a Bill issue, but the harmonisation of technical standards is absolutely critical. One of the challenges that we, and I am sure NHS trusts, experience is variability in technical and IT security standards. One of the real opportunities to streamline is to harmonise those standards, so that each trust does not have to decide for itself which international standard to use and which local standard to use.

I would also say that there is a lot of work globally to try to reach international standards, and the more that there can be consistency in standards, the less bureaucracy there will be and the better the protection will be, particularly for medical device companies. We need to build those standards into our product portfolio and design requirements and have them approved by notified bodies, so it is important that the UK does not create a new and different set of standards but participates in setting great international standards.

Rebecca Long Bailey (Salford and Eccles) (Lab)

Q In relation to medical research, concerns have been raised that the Bill might risk a divergence from current EU adequacy and that that might have quite a significant detrimental impact on collaboration, which often happens across the EU on medical research. Are you concerned about that, and what should the Government do to mitigate it?

Jonathan Sellors: I think that it is absolutely right to be concerned about whether there will be issues with adequacy, but my evaluation, and all the analysis that I have read from third parties, particularly some third-party lawyers, suggests that the Bill does not or should not have any impact on the adequacy decision at all—broadly because it takes the sensible approach of taking the existing GDPR and then making incremental explanations of what certain things actually mean. There are various provisions of GDPR—for example, on genetic data and pseudonymisation—that are there in just one sentence. It is quite a complicated topic, so having clarification is thoroughly useful, and I do not think that that should have any impact on the adequacy side of it. I think it is a very important point.

Tom Schumacher: I agree that it is a critical point. I also feel as though the real value here is in clarifying what is already permitted in the European GDPR but doing it in a way that preserves adequacy, streamlines and makes it easier for all stakeholders to reach a quick and accurate decision. I think that adequacy will be critical. I just do not think that the language of the text today impacts the ability of it to be adequate.

Chi Onwurah (Newcastle upon Tyne Central) (Lab)

Q I know that you are very supportive of the Bill, but I wonder whether you see risks to patients and service users from facilitating a greater sharing of health and care data. Could you each answer that question?

Jonathan Sellors: I think that data sharing, of one sort or another, absolutely underpins medical research. You need to be able to do it internationally as well; it is not purely a UK-centric activity. The key is in making sure that the data that you are using is properly de-identified, so that research can be conducted on patients, participants and resources in a way that does not then link back to their health data and other data.

--- Later in debate ---
Chi Onwurah

Has the balance between sharing and the regulation of biometric data, particularly facial recognition data, been struck in the right way?

Helen Hitching: I do not think facial recognition data is captured.

Aimee Reed: On facial recognition, given that we have deployed it—very high profile—I think that the balance is right. We have learned a lot from the South Wales judgment and from our own technical deployments. The Bill will also highlight how other biometric data should be managed, creating parity and an environment where biometric data that we do not yet have access to or use of is future-proofed in the legislation. That is really welcome.

Rebecca Long Bailey

Q Helen, you mentioned that you are broadly supportive of the abolition of the Biometrics Commissioner and the Surveillance Camera Commissioner, but that that abolition will not reduce the existing level of oversight. Now seems to be the time to request additional resources if you did not feel that the new commissioners would be adequately resourced, so do you have confidence that the Investigatory Powers Commissioner has sufficient resources and expertise to take on the functions it has to? Similarly, does the Information Commissioner have sufficient resources and expertise to oversee regulation in this area?

Helen Hitching: It is difficult for the agency to comment on another organisation’s resources and capabilities. That question should probably be posed directly to them. The Information Commissioner’s Office already deploys resources on issues related to law enforcement data processing, including the publication of guidance. From a biometrics perspective, the casework is moving to the IPC, so from a resourcing perspective I think it would have adequate casework provision and expertise.

Aimee Reed: I echo the comments about expertise, particularly of the Investigatory Powers Commissioner. I think that the expertise exists but, like Helen, whether it has enough resources to cope with the casework I presume is a demand assessment that it will do in response to the Bill.

Rebecca Long Bailey

Q I have a final question for you, Aimee. There are concerns, particularly given that the Information Commissioner’s Office 2021 data protection audit report gave an assurance rating of “limited” to the Met’s policies on records management. How can you reassure the public, given that there will be such an expansion of powers in the area, that the Met will not receive a similar report over the next 12 months?

Aimee Reed: That is a very topical question today. The first thing to say is that I am not sure I agree that this is a large expansion of our access to personal data; I think it is a simplification of the understanding of what we can do as a law enforcement body. All the same safeguards and all the same clear water will be in place between the different parts of the Act.

We did indeed get a “limited” rating on records management but, as I am sure you are aware, we were assessed on three areas and got the second highest grading in the other two: the governance and accountability of our data management, and our information risk management. Those came out higher.

What have we done since 2021? We have done quite a lot to improve physical and digital records management, with a greater focus on understanding what data we hold and whether we should still hold it, and we have started a review, retention and deletion regime. We now have an information asset register and a ROPA—a record of processing activities. The previous commissioner, Cressida Dick, invested a significant amount in data management and a data office, the first in UK policing. The new commissioner, as I am sure you have seen, is very committed to putting data at the heart of his mission, too. We have already done quite a lot.

The Bill will simplify how we are able to talk to the public about what we are doing with their data, while also reassuring them about how we use it. We are in a very different place from where we were 12 months ago; in another 12 months, it will be even more significantly improved. We have just worked with the Open Data Institute to improve how open we will be with our data to the public and partners in future, giving more to enable them to hold us to account. I am already confident that we would not get a rating like that again in records management, just based on the year’s review we have had from the ICO about where we have got to.

Rebecca Long Bailey

Q Similarly, now that you have authority over all forces across the UK, I have the same question regarding each of them: are you content that they are equipped and resourced adequately to meet data protection requirements, given that there is such an expansion?

Aimee Reed: I wish I had authority across them. I represent them—that is a better way of describing what I do. Am I confident that law enforcement in general has the right investment in this space, across all forces? No, I am not. That is what I am working hard to build with Chief Constable Jo Farrell, who leads in this area for all forces on the DDaT approach. Am I confident that forces really need investment in this space? Absolutely.

Rebecca Long Bailey

Q In terms of additional resources, are there any specific figures or requirements that you could point the Government towards at this stage?

Aimee Reed: In line with our own DDaT framework, we are working with the Home Office and other ministerial bodies on what good looks like and how much is enough. I am not sure that anybody has the answer to that question yet, but we are certainly working on it with the Home Office.

The Chair

Ladies, thank you very much indeed for your time this afternoon. We will let you get back to your crime fighting.

Examination of Witnesses

Andrew Pakes and Mary Towers gave evidence.