Data Protection and Digital Information Bill

Lord Knight of Weymouth (Lab):

My Lords, I start with apologies from my noble friend Lady Jones of Whitchurch, who cannot be with us due to illness. We wish her a speedy recovery in time for Christmas. I have therefore been drafted in temporarily to open for the Opposition, shunting my noble friend Lord Bassam to close for us at the end of the debate. As a result, what your Lordships will now get with this speech is based partly on his early drafts and partly on my own thoughts on this debate—two for the price of one. I reassure your Lordships that, while I am flattered to be in the super-sub role, I look forward to returning to the Back Benches for the remaining stages in the new year.

I remind the House of my technology interests, particularly in chairing the boards of CENTURY Tech and EDUCATE Ventures Research—both companies working with AI in education. I very much welcome the noble Lord, Lord de Clifford, to his place and look forward to his maiden speech.

Just over six years ago, I spoke at the Second Reading of the Data Protection Bill. I said then that:

“We need to power the economy and innovation with data while protecting the rights of the individual and of wider society from exploitation by those who hold our data”.


For me, that remains the vision. We are grateful to the Minister for setting out his vision in his speech, but it seems to me that one of the Bill’s failings is that it would weaken that protection from exploitation if it passes in its current form. In that 2017 Second Reading speech, I also said that:

“No consent regime can anticipate future use or the generation of intelligent products by aggregating my data with that of others. The new reality is that consent in its current form is dead”.—[Official Report, 10/10/17; cols. 183-5.]


Now that we have moved squarely into the age of AI, I welcome the opportunity to update GDPR to properly regulate data capture, storage and sharing in the public interest.

In the Online Safety Act, we strengthened Ofcom to regulate technology providers and their algorithmic impacts. In the Digital Markets, Competition and Consumers Bill, we are strengthening the Competition and Markets Authority to better regulate these powerful, acquisitive commercial interests. This Bill is the opportunity to strengthen the Information Commissioner to better regulate the use of data in AI and some of the other potential impacts discussed at the recent AI summit.

This is where the Bill is most disappointing. As the Ada Lovelace Institute tells us in its excellent briefing, the Bill does not provide any new oversight of cutting-edge AI developments, such as biometric technologies or foundation models, despite well-documented gaps in existing legal frameworks. Will the Minister be coming forward with anything in Committee to address these gaps?

While we welcome the change from an Information Commissioner to a broader information commission, the Bill further weakens the already limited legal safeguards that currently exist to protect individuals from AI systems that make automated decisions about them in ways that could lead to discrimination or disadvantage—another lost opportunity.

I co-chair the All-Party Parliamentary Group on the Future of Work, and will be seeking to amend the Bill in respect of automated decision-making in the workplace. The rollout of GPT-4 now makes it much quicker and easier for employers to develop algorithmic tools to manage staff, from hiring through to firing. We may also want to provide safeguards over public sector use of automated decision-making tools. The latter is of particular concern when reading the legal opinion of Stephen Cragg KC on the Bill. He says that:

“A list of ‘legitimate interests’ (mostly concerning law and order, safeguarding and national security) has been elevated to a position where the fundamental rights of data subjects (including children) can effectively be ignored where the processing of personal data is concerned … The Secretary of State can add to this list without the need for primary legislation, bypassing important Parliamentary controls”.


Furthermore, on lost opportunities, the Bill does not empower regulators with the tools or capabilities that they need to implement the Government’s plans for AI regulation or the commitments made at the AI Safety Summit. In this, I personally support the introduction of a duty on all public regulators to have regard to the principles on AI that were published in the Government’s White Paper. Would the Minister be willing to work with me on that?

There are other lost opportunities. I have argued elsewhere that data trusts are an opportunity to build public trust in people’s data being used both to develop better technology and to generate revenue back to the taxpayer. I also remain interested in whether personal data could be defined as an asset that can be bequeathed in one’s estate. That would avoid the situation we discussed in our debates on what is now the Online Safety Act, where bereaved families have had a terrible experience trying to access the content their children saw online that contributed to their deaths, and not just deaths from suicide.

This takes me neatly on to broken promises and lessons not learned. I am confident that, whether the Government like it or not, the House will use this Bill to keep the promises made to families by the Secretary of State in respect of coroners being able to access data from technology providers in the full set of scenarios that we discussed, not just self-harm and suicide. It is also vital that the Bill does nothing to contradict or otherwise undermine the steps that this country has taken to keep children safe in the digital world. I am sure we will hear from the noble Baroness, Lady Kidron, on this subject, but let me say at this stage that we support her and, on these Benches, we are fully committed to the age-appropriate design code. The Minister must surely know that in this House, you take on the noble Baroness on these issues at your peril.

I am also confident that we will use this Bill to deliver an effective regime on data access for researchers. During the final parliamentary stages of the Online Safety Bill, the responsible Ministers, Paul Scully MP and the noble Lord, Lord Parkinson, recognised the importance of going further on data access and committed in both Houses to exploring this issue and reporting back on the scope to implement it through other legislation, such as this Bill. We must do that.

The Bill has lost opportunities and broken promises, but it is failing in other areas too. The Bill is too long, probably like my speech. I know that one should not rush to judgment, but the more I read the Bill and the various interpretations of its impact, the more I worry about it. That has not been helped by the tabling of some 260 government amendments, amounting to around 150 pages of text, on Report in another place; that is, after the Bill had already undergone line-by-line scrutiny by MPs. Businesses need to be able to understand this new regime. If they have any data relationship with the EU, they potentially also need to understand how this regime interacts with the EU’s GDPR. On that point, will the Minister agree to share quickly with your Lordships’ House his assessment of whether the Bill meets the EU’s adequacy requirements? We hear noises to the contrary from the Commission, and it is vital that we have the chance to assess this major risk.

After the last-minute changes in another place, the Bill increasingly seems designed to meet the Government’s own interests: first, through changes to the rules on direct marketing during elections; and secondly, by giving Ministers extensive access to the bank account data of benefit claimants and pensioners without spelling out the precise limitations or protections that go alongside those powers. I note the comments of the Information Commissioner himself in his updated briefing on the Bill:

“While I agree that the measure is a legitimate aim for government, given the level of fraud and overpayment cited, I have not yet seen sufficient evidence that the measure is proportionate ... I am therefore unable, at this point, to provide my assurance to Parliament that this is a proportionate approach”.


In starting the scrutiny of these provisions, it would be useful if the Minister could confirm in which other countries such provisions already exist. What consultation have they been subject to? Does HMRC already have these powers? If not, why go after benefit fraud but not tax fraud?

Given the lack of detailed scrutiny these provisions can ever receive in the other place, I of course assume that the Government will respect the will of this House, whatever it may be, once we have debated these measures.

As we did during last week’s debate on the Digital Markets, Competition and Consumers Bill, I will now briefly outline a number of other areas where we will be seeking changes or greater clarity from the Government. We need to see a clear definition of high-risk processing in the Bill. While the Government might not like subject access requests after recent experience of them, they have not made a convincing case for significantly weakening data-subject rights. Although we support the idea of smart data initiatives, such as extending the successful open banking framework to other industries, we need more information on how Ministers envisage this happening in practice. We need to ensure that the Government’s proposals with regard to nuisance calls are workable and that telecommunications companies are clear about their responsibilities. With parts of GDPR, particularly those on the use of cookies, having caused so much public frustration, the Bill needs to ensure appropriate consultation on and scrutiny of future changes in this area. We must take the public with us.

So a new data protection Bill is needed, but perhaps not this one. We need greater flexibility to move with a rapidly changing technological landscape while ensuring the retention of appropriate safeguards and protections for individuals and their data. Data is key to future economic growth, and that is why it will be a core component of our industrial strategy. However, data is not just for growth. There will be a clear benefit in making data work for the wider social good and the empowerment of working people. There is also, as we have so often discussed during Oral Questions, huge potential for data to revitalise the public services, which are, after 13 years of this Government, on their knees.

This Bill seems to me to have been drafted before the thinking that went into the AI summit. It is already out of date, given its very slow progress through Parliament. There is plenty in the Bill that we can work with. We are all agreed there are enormous opportunities for the economy, our public services and our people. We should do everything we can to take these opportunities forward. I know the Minister is genuinely interested in collaborating with colleagues to that end. We stand ready to help the Government make the improvements that are needed, but I hope the Minister will acknowledge that there is a long way to go if this legislation is to have public confidence and if our data protection regime is to work not just for the tech monopolies but for small businesses, consumers, workers and democracy too. We must end the confusion, empower the regulators and in turn empower Parliament to better scrutinise the tsunami of digital secondary legislation coming at us. There is much to do.