Commons Chamber

Data is the new gold. It is by using good data, and lots of the stuff—heaps—that we will cure diseases, empower consumers and businesses, find solutions to societal problems and unleash economic growth. Behind that data, however, are the lives of everyday people, and the decisions made with that data will impact everyday lives. We must ensure that data and the value derived from it are used in the service of the British people. That is why the Liberal Democrats welcome this Bill’s efforts to modernise and clarify our data laws and to unleash growth and opportunity.
The digital landscape is evolving rapidly, and it is right that we seek to keep pace. The Bill marks an improvement on the previous data Bill introduced under the Conservative Government. However, although this Bill contains some positive steps, it also contains significant gaps and missed opportunities. We must seize this opportunity to get the legislation right, and to ensure that the data landscape we put in place serves all of us across the UK.
Maintaining public trust in data safeguards is vital. As the Ada Lovelace Institute emphasises, trust in data and technology is essential. In order for our democratic principles to be upheld, citizens must be able to trust how their data is being used. That is even more important as government services increasingly rely on data-driven digital interfaces.
Trust also underpins two sides of the same coin: inclusion and adoption. One is critical for society and the other is crucial for growth. That is why, in the name of constructive opposition and in the interest of firming up public trust, we would like to highlight concerns and missed opportunities. As the Bill passes through the House, we will seek to interrogate and strengthen it where necessary.
One of our primary concerns lies with the powers granted to the Secretary of State, particularly on recognised legitimate interests and the framework for digital verification. In essence, the Bill allows ministerial decisions that bypass meaningful parliamentary scrutiny. That risks a situation where changes to how data is captured or shared are made unilaterally, without the thorough checks and balances that Parliament or the public expect.
Both the Delegated Powers and Regulatory Reform Committee and the Constitution Committee have highlighted these issues. The Open Rights Group highlights concerns that a governing party could change the rules on election data, for example, or have undue influence on the Information Commissioner’s decision-making process and jeopardise impartiality. Although there is a drive in the Bill to formalise digital identity frameworks, the Liberal Democrats believe it is crucial to strike the right balance. We support harnessing digital verification to make services more efficient, provided there is robust transparency and independent oversight of how personal data is stored and used. We urge Ministers to tighten this area with clear ethical safeguards to genuinely foster trust rather than undermine it.
Secondly, modernisation should not come at the cost of transparency. Several clauses appear to dilute individuals’ rights to information about how their data is collected and processed, notably in respect of legitimate interests and automated decision making. Clause 77, for example, risks seriously watering down the rights of data subjects, and in doing so seriously hampers public trust in data processing. The National Data Guardian and the British Medical Association, for example, are worried about the clause eroding transparency in how health and social care data is used for research. If we truly wish to harness the benefits of emerging technology, from AI to digital verification, we must earn and maintain that trust, and that depends on being open about how data is used and by whom.
That brings me on to the Bill’s proposals on automated decision making. They currently focus on special category data, which leaves our ordinary personal data less shielded. AI and algorithmic processes increasingly determine people’s credit, insurance, and even job prospects. There are risks in restricting enhanced safeguards to only certain categories of information without further amendments to protect individuals.
The hon. Lady is absolutely right about that tendency, but it does not have to be like that. We can either build a society that is about personal interactions and familiarity, or we can allow a society of the kind she describes to develop, which will destroy the tapestry of those interactions that make up the wellbeing of each of us and all of us.
There is definitely a lot of opportunity in automated decision making, but safeguards must be in place to ensure that human decisions, and the right to safeguards around the impact of those decisions, are upheld. Restricting enhanced safeguards to only certain categories of information, without further amendments, could exclude a wide range of significant decisions from meaningful human review and create a lack of transparency. Again, doing so undermines public trust and hinders the adoption of AI and emergent technologies.
We share the concerns of organisations including Justice and the Open Rights Group that clause 80 weakens safeguards by broadening the scope for automated decisions. Although the clause makes safeguarding requirements more explicit, there are concerns that it also provides the Secretary of State with considerable powers via secondary legislation to amend or set aside those safeguards. The Liberal Democrats are firm in our conviction that where a person is the subject of automated decision making, there simply must be a right to explanation, a right to appeal and a meaningful human intervention.