Data (Use and Access) Bill [Lords]

Ben Spencer Excerpts
Thursday 22nd May 2025

Commons Chamber
Madam Deputy Speaker (Caroline Nokes)

I thank the right hon. Gentleman for his point of order, which was not in fact a point of order. He will be aware that the programme motion has already been agreed to by the House.

Dr Ben Spencer (Runnymede and Weybridge) (Con)

As hon. Members know, the substance of this Bill began with the previous Government, in recognition of the need to streamline and harness the use of data to grow the economy and drive improvement in the delivery of public services. As I have said before, when the Bill started its life, most of us had no idea that it would become the vehicle for addressing some of the most important social and technological issues of our time.

Although I welcome the huge benefits that the Bill will bring to the economy and public services when it comes into force, I fear that it will go down on the Government’s record as the Bill of missed opportunities. It is a missed opportunity to fix our flawed public data sets, which present a barrier to tracking and tackling inequalities in areas such as women’s health; a missed opportunity to commit to a review of protections for children in their use of social media platforms, and to taking action to increase those protections where the evidence shows there is good reason to do so; and a missed opportunity to provide much-needed certainty to two of our key growth industries, the creative and AI sectors, on how they can interact to promote their mutual growth and flourishing.

It could be seen as somewhat dispiriting to be back at the Dispatch Box again, having the debate on copyright and AI with the Department’s ministerial team, but I see that there has been an upgrade since our last outing at the Dispatch Box. I pay tribute to the Secretary of State for his tone and his approach to this debate, particularly his recognition of previous mistakes made. As politicians, we do not say sorry often enough, or recognise mistakes or where we would have wanted things to go better. I appreciate the statements he has made from the Dispatch Box, but the fact that we are here is testament to the determination and sincere concern of Members of both Houses. Whatever Benches they sit on, they are deeply concerned that we must not miss this opportunity to find a solution to such a significant challenge.

Our colleagues in the other place have spoken about their commitment to the primacy of this House, and their reluctance to delay the passage of this Bill any further than is absolutely necessary. Their resolve demonstrates the importance of this issue to Members of both Houses and the stakeholders they represent. The Government have spoken repeatedly of their commitment to protecting the creative industries, but their actions have yet to match their rhetoric. It appears that “reviews” have today been upgraded to “working groups.”

Many excuses have been made for why the Government feel unable to act now. Baroness Kidron and other noble Lords have acted in good faith on the Government’s stated concerns, and have sought to address them in the latest iteration of their transparency amendment on copyright and AI. Lords amendment 49D would provide the Government with flexibility to put in place proportionate regulations on the transparency of AI enterprises by reference to their size. Importantly, it would allow a reasonable timeframe for the Government to complete their review of responses to their consultation, which concluded in February, before the Secretary of State is compelled to lay draft transparency regulations before Parliament.

For the third time, an amendment on this topic received the overwhelming support of Members in the other place, and the debate at the last round showed that the strength of feeling is mirrored in this House. Amendment 49D is a balanced clause that would put in place a much-needed long-stop date to provide the certainty that creatives and the technology industries alike have been calling for. As the hon. Member for East Thanet (Ms Billington) suggested, it is a backstop.

The Government have run out of excuses for failing to act. Today we have an opportunity to achieve something relatively rare in our political climate: creating effective, balanced legislation based on cross-party compromise. It is important to public confidence in Government to show that we can put sound principles above politics when the overwhelming need arises. The Government have another opportunity today; let us make sure that it is not another missed one.

Madam Deputy Speaker (Caroline Nokes)

I call the Liberal Democrat spokesperson.

Data (Use and Access) Bill [Lords]

Ben Spencer Excerpts
Dr Ben Spencer (Runnymede and Weybridge) (Con)

It is a pleasure to rise to speak on Lords amendments to the Data (Use and Access) Bill. Over the course of debating the Bill, it has become customary to thank those in the other place for the work they have done, particularly Baroness Owen for her work on deepfakes and others who have campaigned boldly in that area.

I will begin by speaking to Lords amendment 49B. We have been clear that supporting the creative and AI sectors is not a zero-sum game; we need to support both sectors. Through their ham-fisted consultation on copyright and AI, the Government have raised great concern throughout the creative sector, and the resulting attempts to amend this Bill have been in response to the mess they have created. In Committee and on Report, we set out a series of amendments that focused on the outcome—not the process—for a solution in this area. Those amendments focused on ensuring that the position in law of copyright in this area was clear, on the need for proportionate and effective transparency, on removing barriers to start-ups, and on facilitating technological solutions via digital watermarking.

In one of the many interventions on the Minister, my right hon. Friend the Member for North East Cambridgeshire (Steve Barclay) mentioned the importance of implementing digital watermarking. He referred to it as a response to deepfakes, but it also has relevance to technical solutions, and it strikes me as quite odd that the Minister went on to cover broadly the same topics in his opening remarks, despite pointing out to my right hon. Friend that those topics were not relevant to the ongoing debate. That indicates how confused the treatment of this area in the Bill has become, and the need for clarity.

I pay tribute to Viscount Camrose, Lord Parkinson, my hon. Friend the Member for Gosport (Dame Caroline Dinenage), my right hon. Friend the Member for Maldon (Sir John Whittingdale), Baroness Kidron, and others in this House and in the other place, for their work on amendments to reach a resolution in this area. We had sympathy with earlier versions of those amendments, but also concerns about their workability and prescriptiveness. We have worked with Baroness Kidron to get to a position that we can now support; we believe that solutions need to incorporate the principles of transparency and proportionality. The amendment is not a perfect solution, but it is more reasonable than doing nothing.

I find it astounding that the main criticism that the Minister has made of Lords amendment 49B is that it has a run-in period prior to implementation and that people are calling for things to happen now. That is an odd way of approaching legislating. As the Opposition, we are working with other parties, among others, to try to find a solution to get the Minister out of a hole. I hope that Members across the House support the amendment.

Moving on to digital verification services, I welcome the Lords’ disagreement with amendments 32 and 52, and support their amendments 32B, 32C, 52B and 52C on sex data accuracy, which received the support of Members in the other place. As my noble Friend Viscount Camrose said in his speech, it was necessary to re-table amended versions of the clauses on data accuracy previously secured in the other place because our new clause 21 was not in scope for debate in the Lords. The Lords amendments are technical and complex, so if you will forgive me, Madam Deputy Speaker, I will speak briefly to new clause 21 to explain for the benefit of Members how things have evolved over time.

Our new clause 21 would have compelled public authorities to correct the datasets they hold in relation to sex and to collect data on the protected characteristic of sex in accordance with the legal definition set out in the Supreme Court’s judgment: biological sex. It would also have allowed public authorities to collect data on acquired sex as recorded on a gender recognition certificate where that is relevant and lawful. It would have imposed no new obligations on the correction of data held by public authorities—the obligation already exists under article 5(1)(d) of the UK General Data Protection Regulation—but would simply have put in place a timescale for correcting data on sex. We know from the findings of the Sullivan review that that correction is much needed and long overdue.

To address a misconception, new clause 21 was silent on how sex is recorded in physical and digital forms of identity for those holding a gender recognition certificate. That is a sensitive issue for the 8,500 holders of GRCs in the UK, and we hope that much-needed clarity in this area will be given by the Equality and Human Rights Commission in its guidance due to be laid before Parliament next month. It will be up to the Secretary of State to make rules as to how that guidance is implemented in digital verification services. However, that issue, while important, does not affect the clear obligation that already exists in law to record data on sex accurately.

Lords amendments 32C and 32B, and disagreement with amendment 32, would compel the Secretary of State to examine whether the public authorities that will act as data sources for the digital verification services system ascertain sex data reliably in accordance with biological sex and, where lawful and relevant, with sex as recorded on a gender recognition certificate. That would prevent inaccurate sex data from being entrenched and proliferated in the digital verification services system. Lords amendments 52B and 52C, and disagreement with amendment 52, would give the Secretary of State the power to define in a data dictionary sex data as biological sex and, where relevant, sex as recorded on a gender recognition certificate. That could then be applied across the digital verification services system, the register of births and deaths, and other circumstances where public authorities record personal data. The amendments are critical for correcting our compromised datasets on sex and would ensure that poor-quality and inaccurate data does not undermine digital verification services.

To be clear, if our amendments do not make it into the Bill, self-ID will be brought forward through the back door, risking the protections that single-sex spaces offer to everyone. Self-ID is not and never has been the position in UK law. I do not understand why the Government are resisting these measures. Digital verification systems need to be trustworthy to deliver the benefits intended by the Bill. If they are not trustworthy, the system will fail. I therefore commend these vital and much-needed amendments to the House.

Victoria Collins (Harpenden and Berkhamsted) (LD)

Let me join others in expressing my gratitude for the work of many Members, especially in the other place—in particular, Baroness Owen and Baroness Kidron—but also across this House. There has been a great deal of cross-party work, including much constructive discussion on many elements of the Bill with the Minister. Today, though, I will refer specifically to Lords amendment 49B.

I am lucky enough to represent a part of Hertfordshire that is woven into British creativity, from Graham Greene of Berkhamsted, whose masterpiece “Brighton Rock” shaped our cultural consciousness, to Eric Morecambe of Harpenden, whose partnership in Morecambe and Wise brought joy to millions, while the music of the Devines from Berkhamsted gets us up and dancing, and local artists such as Mary Casserley and Andrew Keenleyside paint our daily lives with a perspective, colour and joy that only artists can achieve. Our landscapes in Ashridge and Aldbury have inspired film-makers from Disney to the producers of the Harry Potter films, and our pubs have been featured in films including “Bridget Jones”.

Today, this creative legacy faces an unprecedented threat. The current situation is more than just alarming; it is threatening the essence of our national identity and our creative economy. We hear concerns about resources for protecting our creative sector, but those arguments miss a crucial point: our creative industries, combined, contribute £126 billion to our economy, employ 2.4 million people, and are growing significantly faster than the wider economy. The question is not whether we can afford to protect these industries, but whether we can afford not to. When we invest in enforcing copyright protections, we are also investing in safeguarding one of Britain’s greatest economic assets and our competitive advantage on the world stage.

The transparency provisions in Lords amendment 49B are essential and proportionate. They apply proportionately to businesses of different sizes, while ensuring that our creative powerhouse can continue to thrive and, indeed, work hand in hand with technology. True leadership in AI means building on respect for creativity, not exploitation. Let me make it clear that this is not about resisting technology, but about recognising value and safeguarding innovation—and that brings me back home to Berkhamsted.

In the heart of my constituency sits the British Film Institute National Archive, one of the largest and most significant film collections in the world, comprising over 275,000 titles and 20,000 silent films dating back to 1894. It is a living memory of our national story, told on screen. Would we allow anyone to walk into the BFI and take whatever they liked? Would we let them scan, copy and republish those works without permission or compensation? Of course not. So I ask the Minister, why would we allow the same thing to happen in the digital world?

This is a defining moment. We can build an AI-powered future that respects and rewards creativity, or we can allow short-term interests to strip-mine the work of generations. The question before us today is simple: will we stand for a future when technology and creativity flourish together, or will we allow the foundations of our cultural life and economic prosperity to be hollowed out for short-term gain? I urge the Government to stand up for our creators, stand up for transparency, and stand up for the principle that, in the age of AI, human creativity still matters.

Oral Answers to Questions

Ben Spencer Excerpts
Wednesday 14th May 2025

Commons Chamber
Mr Speaker (Lindsay Hoyle)

I call the shadow Minister.

Dr Ben Spencer (Runnymede and Weybridge) (Con)

Accurate data is important, particularly in the public sector—we will be voting on this later today. How will the Secretary of State measure his planned productivity improvements? How will he define success, and over what time period?

Peter Kyle

I can assure the hon. Gentleman that we are deploying technology to deliver productivity gains across Whitehall, which are starting now. We are investing heavily through the digital centre that we created in the Department for Science, Innovation and Technology and working intensively with Departments such as the Department for Work and Pensions and His Majesty’s Revenue and Customs. We have already identified billions of pounds-worth of savings, which will be put to use within Government without delay for the benefit of citizens.

Data (Use and Access) Bill [Lords]

Ben Spencer Excerpts
Wednesday 7th May 2025

Commons Chamber
Madam Deputy Speaker (Nusrat Ghani)

I call the shadow Minister.

Dr Ben Spencer (Runnymede and Weybridge) (Con)

It is a privilege to respond to this debate on behalf of His Majesty’s official Opposition, and to speak to the new clauses and amendments. This is an ambitious piece of legislation, which will enable us to harness data—the currency of our digital age—and use it in a way that drives the economy and enhances the delivery of public services. Since its inception under the Conservatives in the last Parliament, the Bill has also become the platform for tackling some of the most pressing social and technological issues of our time. Many of these are reflected in the amendments to the Bill, which are the subject of debate today.

I start with new clause 20. How do we regulate the interaction of AI models with creative works? I pay tribute to the work of many Members on both sides of this House, and Members of the other place, who have passionately raised creatives’ concerns and the risks posed to their livelihoods by AI models. Conservative Members are clear that this is not a zero-sum game. Our fantastic creative and tech industries have the potential to turbocharge economic growth, and the last Government rightly supported them. The creative and technology sectors need and deserve certainty, which provides the foundation for investment and growth. New clause 20 would achieve certainty by requiring the Government to publish a series of plans on the transparency of AI models’ use of copyrighted works, on removing market barriers for smaller AI market entrants and on digital watermarking, together with, most important of all, a clear restatement of the application of copyright law to AI-modelling activities.

I cannot help but have a sense of déjà vu in relation to Government new clause 17: we are glad that the Government have acted on several of the actions we called for in Committee, but once again they have chosen PR over effective policy. Amid all the spin, the Government have in effect announced a plan to respond to their own consultation—how innovative!

What is starkly missing from the Government new clauses is a commitment to make it clear that copyright law applies to the use of creative content by AI models, which is the primary concern raised with me by industry representatives. The Government have created uncertainty about the application of copyright law to AI modelling through their ham-fisted consultation. So I offer the Minister another opportunity: will he formally confirm the application of copyright law to protect the use of creative works by AI, and will he provide legal certainty and send a strong signal to our creative industries that they will not be asked to pay the price for AI growth?

Dr Spencer

I thank the Minister for making that statement at the Dispatch Box. As he knows, we need to have that formally, in writing, as a statement from the Government to make it absolutely clear, given that the consultation has muddied the waters.

Chris Bryant

I am sorry, but I said that in my speech, and I have said it several times in several debates previously.

Dr Spencer

I would therefore be grateful if the Minister said why there remains uncertainty among creatives about the application of copyright in this area. Is that not why we need to move this forward?

I now turn to Government amendment 34 and others. I congratulate my noble Friend Baroness Owen on the tremendous work she has done in ensuring that clauses criminalising the creation of and request for sexually explicit deepfake images have made it into the Bill. I also thank the Government for the constructive approach they are now taking in this area.

Chris Bryant

I should have said earlier that, as the shadow Minister knows, in Committee we changed the clause on “soliciting” to one on “requesting” such an image, because in certain circumstances soliciting may require the exchange of money. That is why we now have the requesting offence.

Dr Spencer

I thank the Minister for his clarification and reiteration of that point, and again for his work with colleagues to take forward the issue, on which I think we are in unison across the House.

New clause 21 is on directions to public authorities on recording of sex data. One does not need to be a doctor to know that data accuracy is critical, particularly when it comes to health, research or the provision of tailored services based on protected characteristics such as sex or age. The accuracy of data must be at the heart of this Bill, and nowhere has this been more high-profile or important than in the debate over the collection and use of sex and gender data. I thank the charity Sex Matters and the noble Lords Arbuthnot and Lucas for the work they have done to highlight the need for accurate data and its relevance for the digital verification system proposed in the Bill.

Samantha Niblett (South Derbyshire) (Lab)

The recent decision by the Supreme Court that “sex” in the Equality Act 2010 refers to biological sex at birth, regardless of whether someone holds a gender recognition certificate or identifies as of a different gender, has already left many trans people feeling hurt and unseen. Does the shadow Minister agree with me that any ID and digital verification service must consider trans people, not risk making them more likely to feel that their country is forgetting who they are?

Dr Spencer

I thank the hon. Member for her intervention, and I will shortly come on to the impact on all people of the decision of the Supreme Court. Our new clause’s focus and scope are simple. The Supreme Court ruling made it clear that public bodies must collect data on biological sex to comply with their duties under the Equality Act. The new clause ensures that this data is recorded and used correctly in accordance with the law. This is about data accuracy, not ideology.

New clause 21 is based in part on the work of Professor Alice Sullivan, who conducted a very important review, with deeply concerning findings on inaccurate data collection and the conflation of gender identity with biological sex data. She found people missed off health screening, risks to research integrity, inaccurate policing records and management through the criminal justice system, and many other concerns. These concerns present risks to everyone, irrespective of biological sex, gender identity or acquired gender. Trans people, like everyone else, need health screening based on their biological sex. Trans people need protecting from sexual predators, too, and they have the right to dignity and respect.

The Sullivan report shows beyond doubt that the concerns of the last Government and the current Leader of the Opposition were entirely justified. The Government have had Professor Sullivan’s report since September last year, but the Department for Science, Innovation and Technology has still not made a formal statement about it or addressed the concerns raised, which is even more surprising given its relevance to this Bill. The correction of public authority data on sex is necessary and urgent, but it is made even more critical by the implementation of the digital verification services in the Bill.

Tonia Antoniazzi (Gower) (Lab)

I appreciate that the shadow Minister is making an important point on the Sullivan review and the Supreme Court judgment, but there are conversations in Government and with Labour Members to ensure that the Supreme Court judgment and the Sullivan review are implemented properly across all Departments, and I hope to work with the Government on that.

Dr Spencer

I thank the hon. Member for her intervention, and for all the work that she and colleagues on both sides of the House are doing in this area. I hope that the findings of the Sullivan report are implemented as soon as possible, and part of that implementation would be made possible if Members across the House supported our new clause.

For the digital verification services to be brought in, it is important that the data used to inform them is accurate and correct. Digital verification could be used to access single-sex services, so it needs to be correct, and if sex and gender data are conflated, as we know they are in many datasets, a failure to act will bring in self-ID by the back door. To be clear, that has never been the legal position in the UK, and it would conflict with the ruling of the Supreme Court. Our new clause 21 is simple and straightforward. It is about the accurate collection and use of sex data, and rules to ensure that data is of the right standard when used in digital verification services so that single-sex services are not undermined.

New clause 19 is on the Secretary of State’s duty to review the age of consent for data processing under the UK GDPR. What can or should children be permitted to consent to when using or signing up to online platforms and social media? How do we ensure children are protected, and how do we prevent harms from the use of inappropriate social media itself, separate from the content provided? How do we help our children in a world where social media can import the school, the playground, the changing room, the influencer, the stranger, the groomer, the radical and the hostile state actor all into the family home?

Our children are the first generation growing up in the digital world, and they are exposed to information and weaponised algorithms on a scale that simply did not exist for their parents. In government, we took measures to improve protections and regulate harmful content online, and I am delighted to see those measures now coming into force. However, there is increasing evidence that exposure to inappropriate social media platforms is causing harm, and children as young as 13 may not be able to regulate and process this exposure to such sites in a safe and proportionate way.

I am sure every Member across the House will have been contacted by parents concerned about the impact of social media on their children, and we recognise that this is a challenging area to regulate. How do we define and target risky and inappropriate social media platforms, and ensure that education and health tech—or, indeed, closed direct messaging services—do not fall within scope? How effective are our provisions already, and can age verification be made to work for under-16s? What IDs are available to use? What will the impact of the Online Safety Act 2023 be now that it is coming into force? What are the lessons from its implementation, and where does it need strengthening? Finally, how do we support parents and teachers in educating and guiding children so they are prepared to enter the digital world at whatever age they choose and are able to do so?

The Government must take action to ensure appropriate safeguards are in place for our children, not through outright bans or blanket restrictions but with an evidence-based approach that takes into account the recent legal changes and need for effective enforcement, including age verification for under-16s. Too often in this place we focus on making more things illegal rather than on the reasons for lack of enforcement in the first place. There is no point in immediate restrictions if they cannot be implemented.

Munira Wilson (Twickenham) (LD)

I agree with all the points the shadow Minister is making about keeping our children safe online, so why does new clause 19 only commit to a review of the digital age of data consent and raising the age from 13 to 16 for when parental consent is no longer required? Why does he not support the Liberal Democrats’ new clause 1 that would start to implement this change? We can still, through implementation, do all the things the hon. Gentleman proposes to do, so why the delay?

Dr Spencer

There are a few issues with new clause 1. One is the scope in terms of the definition of networking services and ensuring platforms such as WhatsApp are not captured within it. Looking at new clause 19, there are challenges to implementation in this area. There is no point in clicking our fingers and saying, “Let’s change the age of digital consent,” without understanding the barriers to implementation, and without understanding whether age verification can work in this context. We do not want to create a system and have people just get around it quite simply. We need the Government to do the work in terms of setting it up so that we can move towards a position of raising the age from 13 to 16.

Max Wilkinson (Cheltenham) (LD)

The press have obviously been briefed by Conservatives that the Conservatives are pushing for a ban on social media for under-16s, but it seems that what is actually being suggested is a review of the digital age of consent with a view to perhaps increasing it to 16. The two positions are very different, and I wonder whether the tough talk in the press matches what is actually being proposed by the Opposition today.

Dr Spencer

I have been very clear on this, and it is important in such a complex area to look at the detail and nuance of the challenges around—(Interruption.) Well, it is very easy to create a new clause where we click our fingers and say, “Let’s make this more illegal; let’s bring in x, y or z restriction.” As a responsible Opposition, we are looking at the detail and complexities around implementing something like this. [Interruption.] I have been asked a few questions and the hon. Member for Cheltenham (Max Wilkinson) might want to listen to the rationale of our approach.

One question is how to define social media. Direct messaging services such as WhatsApp and platforms such as YouTube fall in the scope of social media. There are obviously social media platforms that I think all of us are particularly concerned about, including Snapchat and TikTok, but by changing the age of digital consent we do not want to end up capturing lower-risk social media platforms that we recognise are clearly necessary or beneficial, such as education technology or health technology platforms. And that is before we start looking at whether age verification can work, particularly in the 13-to-16 age group.

Chris Bryant

Sorry, I am getting a bit lost. Does the Minister think, and does the Conservative party think, that the digital age of consent should rise from 13 to 16 or not?

Dr Spencer

rose—

Madam Deputy Speaker (Ms Nusrat Ghani)

Order. I point out to Mr Bryant that Dr Ben Spencer is the shadow Minister.

Dr Spencer

I think that was wishful thinking by the Minister in this debate.

Our new clause says that we need to look at the desirability of raising the digital age of consent for data processing from 13 to 16 in terms of its impact particularly on issues such as the social and educational development of children, but also the viability of doing so in terms of the fallout and the shaking out of the Online Safety Act and with regard to age verification services. Should there be no evidence to demonstrate that it is unnecessary, we would then raise the digital age of consent from 13 to 16. It might be the case that, over the next six months, the shaking out of the Online Safety Act demonstrates that this intervention is not necessary. Perhaps concerns around particular high-risk social media platforms will change as technology evolves. We are saying that the Government should do the work with a view to raising the age in 18 months unless there is evidence to prove the contrary. [Interruption.] I have made this crystal clear, and if the Minister would choose to look at the new clause, rather than chuckling away in the corner, he might see the strategy we are proposing.

Max Wilkinson

I thank the shadow Minister for giving way. As ever, he is extremely polite in his presentation and in his dealing with interventions, but I am not sure that he dealt with my intervention, which was basically asking whether the Conservative party position is as it has briefed to the press—that it wishes to ban social media for under-16s—or that it wishes to have a review on raising the age of data consent. It cannot be both.

Dr Spencer

I say again that the position is that, following a careful look at the evidence regarding the desirability and validity of doing so—taking into account findings regarding the impact and implementation of the Online Safety Act and age verification and how one defines social media, particularly high-risk platforms—unless there is direct evidence to show that raising the age from 13 to 16 is unnecessary, which there may be, then we should raise it from 13 to 16. If that has not provided clarity, the hon. Gentleman is very welcome to intervene on me again and I will try and explain it a third time, but I think Members have got a grasp now.

This new clause will also tackle some of the concerns at the heart of the campaign for Jools’ law, and I pay tribute to Ellen Roome for her work in this area. I am very sympathetic to the tragic circumstances leading to this campaign and welcome the additional powers granted to coroners in the Bill, but I know that they do not fully address Ellen Roome’s concerns. The Government need to explain how they can be sure that data will be retained in the context of these tragedies, so that a coroner will be able to make sure, even if there are delays, that it can be accessed. If the Minister could provide an answer to that in his winding-up speech, and detail any further work in the area, that would be welcome.

On parental access to children’s data more broadly, there are difficult challenges in terms of article 8 rights on privacy and transparency, especially for children aged 16 to 17 as they approach adulthood. Our new clause addresses some of these concerns and would also put in place the groundwork to, de facto, raise the digital age of consent for inappropriate social media to 16 within 18 months, rendering the request for parental access to young teenage accounts obsolete.

I urge colleagues across the House to support all our amendments today as a balanced, proportionate and effective response to a generational challenge. The Bill and the votes today are an opportunity for our Parliament, often referred to as the conscience of our country, to make clear our position on some of the most pressing social and technological issues of our time.

Madam Deputy Speaker (Ms Nusrat Ghani)

I call the Chair of the Science, Innovation and Technology Committee.

--- Later in debate ---
Madam Deputy Speaker (Caroline Nokes)

I call the shadow Minister.

Dr Spencer

It has been a pleasure to hear the speeches of Members from across the House. I pay tribute to my hon. Friend the Member for Gosport (Dame Caroline Dinenage) and my right hon. Friend the Member for Maldon (Sir John Whittingdale), who spoke with passion about the protection of copyright in AI. I suspect that my right hon. Friend is looking forward to seeing the back of the Bill, and hoping that it does not return in a future iteration. My right hon. Friend the Member for Chingford and Woodford Green (Sir Iain Duncan Smith) spoke of the importance of ensuring that data does not fall victim to hostile states and hostile state actors. My right hon. Friend the Member for East Hampshire (Damian Hinds) spoke with knowledge and authority about this important issue, and the challenges and practicalities involved in ensuring that we get it right for our children.

I will return to the three themes that we have put forward. The Minister has repeatedly given assurances on the application of copyright with regard to AI training, but the Secretary of State created uncertainty by saying in the AI copyright consultation:

“At present, the application of UK copyright law to the training of AI models is disputed.”

When we create that level of uncertainty, we need at least an equal level of clarity to make amends, and that is partly what our new clause 20 calls for: among other things, a formal statement from the Intellectual Property Office or otherwise. I do not see why it is a challenge for the Government to put that forward and deliver.

--- Later in debate ---
Victoria Collins

I would just like to clarify that we have thought long and hard about this Bill, along with many organisations and charities, to get it right.

Dr Spencer

That is good to hear.

Max Wilkinson

I will try a third time, because we tried earlier. The Conservatives have clearly briefed the press that they are angling for a ban on social media for under-16s—it has been reported in multiple places. Can the shadow Minister confirm whether that is the Conservatives’ position or not?

Dr Spencer

For the fourth time, and as I have said, new clause 19 would effectively create a de facto position whereby there are restrictions on the use of inappropriate social media services by children. It seeks to tackle the challenges of implementation, age verification and the scope of social media. It says that there needs to be work to make sure that we can actually do so and that, when we can, we should move in that direction, unless there is overwhelming evidence that it is not needed, such as with the shaking out of the Online Safety Act.

Finally, I return to new clause 21. Sadly, it has been widely misrepresented. The laws in this area are clear: the Equality Act puts in place obligations in relation to protected characteristics. The Supreme Court says that “sex” means biological sex, and that public authorities must collect data on protected characteristics to meet their duties under the Equality Act. The new clause would put that clear legal obligation into effect, and build in data minimisation principles to preserve privacy. There would be no outing of trans people through the new clause, but where public authorities collect and use sex data, it would need to be biological sex data.

Chris Bryant

As ever, it is good to see you in the Chair, Madam Deputy Speaker. I thank all right hon. and hon. Members who have taken part in the debate. If I do not manage to get to any of the individual issues that have been raised, and to which people want answers, I am afraid that is because of a shortness of time, and I will seek to write to them. I thank the officials who helped to put the Bill together, particularly Simon Weakley—not least because he not only did this Bill, but all the previous versions in the previous Parliament. He deserves a long-service medal, if not something more important.

I will start with the issues around new clauses 1, 11, 12 and 13, and amendment 9. The Government completely share the concern about the vulnerability of young people online, which lots of Members have referred to. However, the age of 13 was set in the Data Protection Act 2018—I remember, because I was a Member at the time. It reflects what was considered at the time to be the right balance between enabling young people to participate online and ensuring that their data is protected. Some change to protecting children online is already in train. As of last month, Ofcom finalised the child safety codes, a key pillar of the Online Safety Act. Guidance published at the same time started a three-month period during which all in-scope services likely to be accessed by children will be required to assess the risk of harm their services pose to them.

From July, the Act will require platforms to implement measures to protect children from harm, and this is the point at which we expect child users to see a tangible, positive difference to their online experiences. I wish it had been possible for all this to happen earlier—I wish the Act had been in a different year—but it is the Act it is. The new provisions include highly effective age checks to prevent children encountering the most harmful content, and the adjustment of algorithms to reduce exposure to harmful content. Services will face tough enforcement from Ofcom if they fail to comply.

The Act very much sets the foundation for protecting children online. The Government continue to consider further options in pursuit of protecting children online, which is why the Department for Science, Innovation and Technology commissioned a feasibility study to understand how best to investigate the impact of smartphones and social media on children’s wellbeing. This will form an important part of our evidence base.

Intellectual Property: Artificial Intelligence

Ben Spencer Excerpts
Wednesday 23rd April 2025

Westminster Hall
Dr Ben Spencer (Runnymede and Weybridge) (Con)

It is a pleasure to serve under your chairmanship, Ms McVey, and to respond on behalf of His Majesty’s Opposition in this very well-attended, knowledgeable and thoughtful debate. Given that so many Members have taken part, I can only make some brief remarks.

I want to focus on principles, which came up quite a few times throughout this debate. In a complex area, it is principles that help us get through. It seems to be tradition in this debate to say happy birthday to the hon. Member for Bury North (Mr Frith)—he will be clipping this so that it can go out on his social media. He spoke with knowledge and passion, and there is not much to disagree with in what he said. He also mentioned what I see as the core principles—transparency; the ability to enforce copyright; the ability to demonstrate where data comes from, so that we can see who owns it and what the root trace is; and a technological solution linked to that, in terms of demonstrating data ownership.

I also mention my hon. Friend the Member for Gosport (Dame Caroline Dinenage), the Chair of the Select Committee on Culture, Media and Sport, who reiterated this week, importantly, that all companies need property rights to be enforced, and that these two are not mutually exclusive. I thank her for her extensive work in this area to push forward this debate.

The previous Government were committed to the UK being at the cutting edge of tech and creative industries, and we remain committed to that in Opposition. We have heard the concerns of the creative industries loud and clear, but we do not believe that there is anything to be gained by treating the emergence of AI as some sort of zero-sum game, where one industry wins and another fails. It should not be an either/or. This needs to be mutually inclusive, not mutually exclusive, and we believe that it is possible to achieve that.

This is a challenging and complex area to get right. Solving this problem is not simple, particularly if we look at what is happening internationally and at extra-jurisdictional issues. Quite simply, other areas have not fixed this either. If there were a straightforward solution to this problem, it would already be in place. It is important to recognise that from the outset, and to recognise the challenge facing the Minister in fixing the problem, but I have ambition for him. I believe that he can fix it, and I look forward to him doing so over the course of the next year. It is in this direction that we as the Opposition want to take things forward.

We believe that getting this area of policy right will mean focusing on some key principles. Most importantly, there should be proportionate transparency in our AI industries about how they use creative content to train their models and generate content. That should be combined with recognition and enforceability of creative rights. The development of technology in the form of a readily accessible digital watermark will be instrumental in helping creatives protect their work online. Start-ups and small and medium-sized enterprises in our growing AI industries need to be supported to develop their models in a way that respects the rights of creatives. In that regard, the AI opportunities action plan identified the need to unlock public and private datasets to enable innovation and attract international talent and capital.

We tabled a series of pragmatic amendments to the Data (Use and Access) Bill in Committee that would have committed the Secretary of State to putting in place a plan to achieve those important aims within a reasonable period after the conclusion of the Government’s consultation on copyright and AI. We understand that the Government have received in excess of 11,500 consultation responses from stakeholders, which they are in the process of analysing. Given the concern that their original plans caused in our creative industries, we welcome the Minister’s announcement, following the closure of the consultation, that the Government have taken a second look at their preferred approach to regulating the sector. In particular, we welcome the renewed emphasis on the need for increased transparency about how models are trained, so that creatives can enforce their rights. This is a key area that has come up throughout the debate, and we called on the Government to set out an informed plan in Committee on the data Bill.

We appreciate that the impact of AI on intellectual property requires proper and careful consideration. We will work constructively to support the creation of policy and plans in this fundamentally important area. If we get it right, there will be tremendous economic and societal benefits to growing our AI sector and supporting our creative sector to continue to thrive. It is time for the Government to be clear about their plans, in order to create certainty for the AI and creative industries about the way forward and help promote an environment of confidence, paving the way for investment and growth.

Oral Answers to Questions

Ben Spencer Excerpts
Wednesday 26th March 2025

Commons Chamber
Mr Speaker (Lindsay Hoyle)

I call the shadow Minister.

Dr Ben Spencer (Runnymede and Weybridge) (Con)

Tragically, flaws in data and its processing are posing grave risks to women and girls. The Sullivan report exposed serious failings in the collection and use of biological sex data, which is often being replaced with gender identity. The report highlighted the risk that poses to the safe delivery of health services, policing and security. The Health Secretary has already shown leadership on this issue, but to date the Secretary of State has remained silent. When did he first have sight of the Sullivan report, and when does he intend to act on it?

BioNTech UK: Financial Assistance

Ben Spencer Excerpts
Monday 24th March 2025

General Committees
Dr Ben Spencer (Runnymede and Weybridge) (Con)

It is a pleasure to serve under your chairmanship, Ms Jardine.

With businesses warning that they are cutting investment because of the actions of this Government, the Confederation of British Industry warning in January that investment is at its lowest level since 2009 outside of the pandemic, and AstraZeneca recently pulling £450 million of investment because of the actions of this Government, it is reassuring to see today’s motion. Of course, this investment was secured at the global investment summit under the Conservatives, which makes sense.

We, of course, welcome BioNTech’s investment, and we welcome that the Government have continued our support. In the interest of scrutiny, can the Minister outline the negotiations with BioNTech on the subsidy, and whether BioNTech raised concerns about the poor investment environment that this Government have created, including with the increase in employer national insurance contributions?

Draft Electronic Communications (Networks and Services) (Designated Vendor Directions) (Penalties) Order 2025

Ben Spencer Excerpts
Wednesday 19th March 2025

General Committees
Dr Ben Spencer (Runnymede and Weybridge) (Con)

It is a pleasure to serve under your chairmanship, Sir Jeremy.

The draft Electronic Communications (Networks and Services) (Designated Vendor Directions) (Penalties) Order 2025 provides for the calculation of a penalty relating to a designated vendor direction. A designated vendor direction is a power provided for by section 105Z1 of the Communications Act 2003, as inserted through the Telecommunications (Security) Act 2021. That power is intended to ensure that our critical telecoms networks are secure and protected from foreign state interference. We support the measures being taken forward today through this technical statutory instrument.

In 2022, a designated vendor direction was sent to 35 telecommunications companies to ban the installation of Huawei kit in new 5G installations; remove it from the network core by the end of 2023; remove it from 5G networks entirely by the end of 2027; limit it to 35% of the full-fibre access network by the end of October 2023; and remove it from sites significant to national security by 28 January 2023. Will the Minister update the Committee as to the progress on each of the latter four criteria for each of the 35 providers that received the notice?

I understand that BT did not meet some of those statutory deadlines. Does the Minister expect it to be fined and, if so, when and how much? Does he expect other companies to be enforced against? What work is he doing to ensure that Huawei kit is being removed at pace to meet the 2027 deadline? Can he update us on that? Does he intend to review the 35% threshold in relation to full-fibre access? Given the current geopolitical environment, what assessments has he made of other providers in our telecoms supply chain, and can he update us on current providers of interest?

Huawei kit is not limited to telecoms infrastructure. Can the Minister update us as to the Government’s position on Huawei and its security risks? Were our concerns regarding Huawei raised during the Government’s recent engagement with China, including with regard to the domestic import of high-tech Chinese-made consumer goods such as electric vehicles? Finally, what assessment has the Minister made of the risks that emerging new technologies, including the large language model DeepSeek, which is based in China, may pose to domestic and commercial users? Do the Government intend to provide guidance on that?

Protection of Children (Digital Safety and Data Protection) Bill

Ben Spencer Excerpts
Dr Ben Spencer (Runnymede and Weybridge) (Con)

It is a pleasure to speak on behalf of His Majesty’s official Opposition on this important topic. Thank you for calling me now, Madam Deputy Speaker; I will not speak for too long, because there are so many Government Members who are keen to take part in this important debate.

I congratulate the hon. Member for Whitehaven and Workington (Josh MacAlister) on bringing this private Member’s Bill to the House and all the different people, companies, charities and organisations that he has worked with as part of the campaign to drive it forwards. I do feel for him, as I know what it is like for a Back Bencher in the party of Government to navigate the challenges of trying to use a private Member’s Bill—I have had two ballot Bills myself—as a campaign vehicle to drive change. I therefore hope that he will forgive me for some of the comments I shall make about the content of this Bill.

We can only deal with the text of the Bill before us, which was only published in the past few days, so my comments will necessarily reflect the detail of the hon. Member’s proposals. In a sense, this debate reminds me of Schrödinger’s cat, in that Members have referred in equal measure to a former Bill and the current Bill in their speeches. It feels like we are having a debate on a Bill that could have been and a debate on the Bill in front of us. Indeed, the hon. Member’s speech spoke to the lack of a need for further research, but equally the Bill calls for further research to take place.

We have heard some fantastic contributions from all parts of the House. I would like to note the contributions from my right hon. Friends the Members for North West Hampshire (Kit Malthouse) and for East Hampshire (Damian Hinds), and my hon. Friends the Members for Reigate (Rebecca Paul), for Bridgwater (Sir Ashley Fox) and for Broxbourne (Lewis Cocking), who reiterated the importance of driving forward change. Many stories and personal declarations have been shared about the impact of social media and the difficulty of parenting at this time. I should declare that I am also a parent, although thankfully my children are not at the stage when they have started consuming social media in the way that I have heard others talk about today.

Tom Hayes (Bournemouth East) (Lab)

The hon. Member is making a powerful point about the debate and how it has dwelled in part on the importance of evidence and research. Does he agree that the reason we have so much evidence is that, as MPs, we speak with so many children? I have in my hand letters from children in years 5 and 6 at St James’ Church of England primary academy, and if I may quote briefly from Eleanor and River, they say

“kids will also be exposed to inappropriate content such as unsuitable videos and pictures. They could feel unsafe, discouraged or exposed, and then they would not be able to unsee the images again.”

Does the hon. Member agree that we should be shielding children from that sort of content?

Dr Spencer

I am going to resist the temptation to be drawn into a discussion about research methodology in this area, although I have to tell the hon. Gentleman that I am very tempted to talk about the importance of case series data and qualitative data in terms of what people are telling us and what we are seeing ourselves, compared with cross-sectional or longitudinal studies or cluster studies looking at the impact of different schools. What I will say is that the stories of what children are being exposed to that we have heard in this debate and that we have all heard from our constituents are horrific—I do not think anyone would disagree with that. Clearly, we need to protect children from that.

At the heart of the Bill is the desire to drive forward our scientific understanding of the effects of smartphone and social media use on children’s mental health, learning and social development. I hope we hear a commitment from the Minister that the Government will conduct a detailed review in this complex area where so much is at stake, but I would also expand it further. Any analysis must take a clear-eyed approach to both the advantages and disadvantages offered by technological developments such as smartphones and internet access. It must weigh the benefits to young people of increased connectivity and access to information—even apps that help to manage health conditions such as diabetes at school and away from home, which will transform the lives of children and young people—against the increasing body of research that demonstrates the damaging effects of excessive smartphone and social media use on children and adolescents.

Catherine Fookes Portrait Catherine Fookes (Monmouthshire) (Lab)
- Hansard - - - Excerpts

Does the hon. Gentleman agree with me on the importance of moving towards smartphone-free schools? I welcome the work being done at Monmouth comprehensive in my constituency, where the headteacher is pushing forward a smartphone ban, because grades increase by almost two levels where schools have banned mobile phones.

Ben Spencer Portrait Dr Spencer
- Hansard - -

I will come on to that a little later in my speech. Perhaps Government Members will have the chance to express their view on this matter on Report of the Children’s Wellbeing and Schools Bill in a couple of weeks’ time.

Turning back to the research, a longitudinal study in the US of more than 6,500 children aged between 12 and 15, adjusted for baseline mental health status, found that adolescents who spent more than three hours a day on social media faced double the risk of experiencing poor mental health outcomes, including symptoms of depression and anxiety. These findings have been brought into sharp focus by recent tragic cases of children taking their own lives after being the subject of online bullying or encountering harmful material online. Clearly, that weighs on all of us as legislators.

In government and now in opposition, the Conservatives have pursued measures to make the online world a safer place for children and young people. I am proud that the previous Government passed the Online Safety Act, among other measures. The Act requires platforms to take measures to prevent children from accessing harmful and age-inappropriate content, particularly relating to pornography, suicide and self-harm, serious violence and bullying. The Act further requires platforms to remove illegal content quickly and prevent it from appearing in the first place, and to use and enforce age-checking measures, through the adoption of highly effective age assurance technologies, on platforms where content harmful to children is published. In January, Ofcom published industry guidance on how it expects age assurance to be put into effect, including deadlines for platforms to conduct risk assessments and put certain safety measures in place. We can expect to see further developments in this area as the protections envisaged by the Act are rolled out.

However, parents, including many in my constituency, are rightly concerned about the addictive nature of smartphones themselves and the impact on attention span and social development. According to polling by Parentkind in 2024, 83% of parents felt that smartphones are harmful to children and young people, while research carried out by Policy Exchange across more than 200 schools at the end of 2023 found that schools with strict mobile phone bans achieved, on average, better Progress 8 scores and better GCSE grades, despite the fact that the schools with highly effective bans had a higher proportion of pupils eligible for free school meals than the schools with less restrictive policies.

In February, the shadow Secretary of State for Education, my right hon. Friend the Member for Sevenoaks (Laura Trott), tabled an amendment to the Children’s Wellbeing and Schools Bill to require all schools in England to ban the use of mobile telephones during the school day. That, however, was rejected by the Government. I wonder how Labour Members feel about that. Should the Government decide to act, perhaps further to the chief medical officers’ review, Opposition Members will work constructively with them to seek practical and effective solutions that enable children to continue to benefit from the opportunities offered online, while protecting them from those harmful effects.

Data (Use and Access) Bill [Lords] (First sitting)

Ben Spencer Excerpts
Chris Bryant Portrait Chris Bryant
- Hansard - - - Excerpts

Strictly speaking, it is a misnomer to say that we do the Bill line by line; we do it clause by clause, or grouping by grouping. The first grouping contains clause 1 and new clause 15, which was tabled by the Liberal Democrat spokesperson, the hon. Member for Harpenden and Berkhamsted.

Clauses 1 to 26 establish regulation-making powers to implement smart data schemes. I think this part of the Bill is universally accepted, or it was in a previous version of the Bill—this is at least the third version of the Bill that a House of Commons Committee has considered line by line, clause by clause or grouping by grouping. These clauses were part 3 of the old Bill, but it is none the less important that we go through each of the clauses segment by segment, because this is a newly constituted House of Commons, with different Members and political parties, and therefore we have to consider them fully.

As many hon. Members will know, smart data involves traders securely sharing data with the customer or authorised third parties at the customer’s request. Those third parties may use the data to provide the customer with innovative services, including account management services or price comparisons. This has already been spectacularly successful in open banking.

Clause 1 defines the key terms and scope of part 1, which covers clauses 1 to 26. Subsection (2) defines the kinds of data to which part 1 applies: “customer data”, which is information specific to a customer of a trader, and “business data”, which is generic data relating to the goods, services or digital content provided by that trader. It also defines “data holder” and “trader” to clarify who may be required to provide data. That covers persons providing the goods, services or digital content, whether they are doing so themselves or through others, or processing related data.

Subsections (3) to (5) set out who is a customer of a trader. Customers can include both consumers and businesses such as companies. Subsection (6) recognises that regulations may provide for data access rather than transfer.

I commend clause 1 to the Committee and urge hon. Members to resist the temptations offered by the hon. Member for Harpenden and Berkhamsted, who tabled new clause 15. I thank her for her interest in smart data. We had a very good conversation a week ago. I am glad to be able to confirm that, following some pressure from the Liberal Democrats in the other place, the Government announced that the Department for Business and Trade intends to publish a strategy document later this year on future uses of those powers. Since the hon. Member’s new clause asks for a road map and we are saying that there will be a strategy, the difference between us may just be semantic.

The strategy document will lay out the Government’s plans to consult or conduct calls for evidence in a number of sectors. It is important that we implement those powers only after having properly spoken with relevant parties such as consumer groups and industry bodies in the sector. Clause 22 also requires consultation before commencement in any sector. As such, we think the best approach is to use powers in part 1 of the Bill to implement smart data schemes that fit the identified needs of the relevant sector. The strategy document will set out the Government’s plans for doing so. For that reason, I ask the hon. Lady to withdraw her new clause.

Ben Spencer Portrait Dr Ben Spencer (Runnymede and Weybridge) (Con)
- Hansard - -

It is a pleasure to serve under your chairmanship, Mr Turner, and I thank all hon. Members taking part in the Committee as well as the officials. As the Minister said, this is the third iteration of this Bill and it has been extensively covered in Committee before. We rely on and thank former Members and those in the other place who worked on the Bill to get it to where it is. I am pleased that the Government are taking the Bill forward and that it is one of the early Bills in the Session.

There is much to say about the Bill that is positive, and not just because it is a reformed version of our previous two Bills. Although, ironically, the Bill does not reference the term “smart data”, clause 1 brings forward smart data and smart data schemes. That will help to open up a digital revolution, which will build on the successes of open banking in other sectors. We very much support that.

Victoria Collins Portrait Victoria Collins (Harpenden and Berkhamsted) (LD)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairmanship, Mr Turner. The Liberal Democrats very much support the Bill and the move towards smart data. Every single day, millions of people in the UK unknowingly generate vast amounts of data, whether they are switching energy providers, checking their bank balance or simply browsing the internet. That is why I want to speak to new clause 15.

For the past decade, we have seen the enormous benefits of open banking, which has given customers the power to securely share their financial data with new providers. That has unlocked better deals, personalised financial data and a wave of innovation. I welcome what the Minister said about a strategy, but new clause 15 explicitly seeks to extend the benefits across multiple sectors, from energy to telecoms and beyond, giving consumers and small businesses a real say in how their data is used and the chance to benefit from that.

If Linda, a business owner in Tring, wants to switch to a cheaper energy provider or broadband deal, she faces a mountain of admin and endless calls to suppliers. She has no simple way of exporting her usage data and instantly comparing deals. But what if she did? A multi-sector consumer data right, as proposed by the new clause, would give Linda the ability to export her energy usage securely to a new provider. She could use a digital tool to automatically compare plans, switch to a greener provider and save thousands in operational costs, freeing up her focus for growing a business.

However, it is not just Linda and family businesses. New clause 15 would put real power in the hands of households struggling with the cost of living crisis—an ability to break free from restrictive contracts, find better deals and ultimately reduce bills. This is not just a radical idea: Australia has already implemented the consumer data right across finance, energy and telecoms, leading to an explosion of new services, better competition and savings for consumers. The European Union is moving in that direction, yet in the UK we have not taken that step. However, I accept what the Minister said about our strategy moving forward, which I very much welcome.

New clause 15 does not demand an overnight change. It would require the road map to be published within 12 months, and would ensure that technical standards are in place and that data sharing is secure and efficient. It includes a phased implementation plan to bring in new sectors gradually, as well as consumer protection measures so that it is done safely and fairly, with public trust at its core. This is not just about giving consumers more control over their data. It is about driving economic growth and innovation. If we get this right, we can see new fintech and comparison tools so that consumers can slash bills and switch telecom providers faster and more easily. It is about more competition, more choice and more innovation. I urge colleagues to consider the new clause, but I absolutely welcome what the Minister has said. Let us take a step forward and ensure that consumers and businesses have the rights that they deserve over their own data.

--- Later in debate ---
Clause 9 contains safeguards limiting enforcers’ investigatory powers. Those require a warrant for entry to private dwellings, and restrict enforcers’ use of information, safeguarding the privileges of Parliament and legal privilege, and protecting against self-incrimination, except for offences under part 1 of the Bill and perjury. The clause also prevents, subject to exceptions, written or oral statements given in investigations from being used against a person being prosecuted for an offence other than one under this part of the Bill. That reflects section 143(8) of the Data Protection Act 2018. I commend clauses 8 and 9 to the Committee.
Ben Spencer Portrait Dr Spencer
- Hansard - -

We support technical amendments to the Bill to make sure it works properly, but I am intrigued as to why these amendments are necessary at such a late stage, bearing in mind the multiple layers of scrutiny that the Bill has gone through. Can the Minister explain where he received the feedback about the necessity of the proposed changes?

Chris Bryant Portrait Chris Bryant
- Hansard - - - Excerpts

As the hon. Gentleman says, these are technical changes, and sometimes we just have to go through it again and again to make sure that we have got things right. Amendment 4, for instance, was simply a matter of working out that the grammar did not really work. Sometimes, it is just a question of filleting, I am afraid, and that is what we have been doing.

Amendment 1 agreed to.

Amendments made: 2, in clause 8, page 13, line 16, after second “specified” insert “documents or”.

This amendment provides that regulations may require enforcers to publish or provide documents as well as information, making the regulation-making powers in relation to enforcers consistent with the powers in relation to decision-makers and interface bodies (under clauses 6(9) and 7(4)(k)). See also Amendments 3 and 5.

Amendment 3, in clause 8, page 13, line 18, leave out “information about” and insert—

“documents or information relating to”.

See the explanatory statement for Amendment 2.

Amendment 4, in clause 8, page 13, line 18, leave out—

“, either generally or in relation to a particular case”.

This amendment leaves out unnecessary words. Power for regulations to make provision generally or in relation to particular cases is conferred by clause 21(1)(a).

Amendment 5, in clause 8, page 13, line 20, leave out “information about” and insert—

“documents or information relating to”.—(Chris Bryant.)

See the explanatory statement for Amendment 2.

Clause 8, as amended, ordered to stand part of the Bill.

Clause 9 ordered to stand part of the Bill.

Clause 10

Financial penalties

--- Later in debate ---
Chris Bryant Portrait Chris Bryant
- Hansard - - - Excerpts

Again, it might be more convenient if I speak to the clauses first and come back to the amendments, because then it is more self-explanatory, but I may need to speak at greater length here.

Open banking has revolutionised the UK retail banking sector by enhancing competition and introducing innovative services. Establishing a long-term regulatory framework for open banking will pave the way for its future growth, and this framework will rely on the FCA having the powers necessary for effective regulation and oversight. Clause 14 therefore empowers the Treasury to enable or require the FCA to set rules for interface bodies and participants in smart data schemes, ensuring compliance with essential standards. Clause 15 sets out further detail about the regulation-making powers conferred on the Treasury by clause 14.

These provisions create a clear framework for delegating rule-making powers, ensuring effective regulation, proper funding and mechanisms to address misconduct by scheme participants, with clear objectives for the FCA’s oversight of smart data schemes. Regulations may enable or require the FCA to impose interface requirements relating to an interface body, as set out for the smart data powers more broadly in clause 7, and to require fees to be paid by financial services providers to cover interface body costs.

Clause 15 further provides that such regulations must impose certain requirements upon the FCA, including a requirement, so far as is reasonably possible, to exercise functions conferred by the regulations in line with specified purposes, and a requirement that the FCA must have regard to specified matters when exercising such functions. Additionally, regulations under clause 15 may empower or require the FCA to impose requirements on individuals or organisations to review their conduct, to take corrective action and to make redress for loss or damage suffered by others as a result of their conduct.

Clause 16 covers the Treasury’s ability to make regulations enabling the FCA to impose financial penalties and levies. The regulations may require or enable the FCA to set the amount or method for calculating penalties for breaches of FCA interface rules. The regulations must require the FCA to set out its penalties policy, and may specify matters that such a policy must include. Additionally, the Treasury may itself impose, or provide for the FCA to impose, a levy on data holders or third-party recipients of financial services data under the scheme to cover its regulatory costs, with the funds being used as specified in the regulations. Only those capable of being directly impacted should be subject to the levy.

Penalties and levies are a necessary part of smart data schemes, including in financial services, to allow the FCA to penalise non-compliance and recover the costs of its regulatory activities. The clause ensures that any penalties or levies are subject to proportionate controls.

Clause 17 gives the Treasury the power to amend section 98 of the Financial Services (Banking Reform) Act 2013 through regulations. This will allow the Treasury to update the definitions of the FCA’s responsibilities and objectives in that section, so they can include new functions or objectives given to the FCA by regulations made under part 1 of this Bill. That will ensure that the FCA’s new duties fit into the existing system for co-ordinating payment system regulators, helping maintain a consistent approach across the financial sector. Regulations made under the clause will be subject to the affirmative procedure.

We have tabled Government amendments 7 to 9 to ensure that the Treasury may delegate to the FCA powers to set rules for action initiation, as well as data sharing. We think this is vital to ensure that open banking continues to work properly and is in line with the policy as set out elsewhere.

Ben Spencer Portrait Dr Spencer
- Hansard - -

I apologise, Mr Turner: I misspoke earlier with regard to our position on the Government amendments. Rather than offering positive support, I meant to say that we will not oppose the technical amendments.

What does the FCA think about these amendments? Has the Department consulted the FCA?

Chris Bryant Portrait Chris Bryant
- Hansard - - - Excerpts

I am not sure whether we have specifically—I am looking to my left for inspiration. I am getting vague inspiration, although it is remarkably non-productive. If the hon. Member would like to intervene for a little longer, perhaps I will be able to be more inspired.

--- Later in debate ---
Ben Spencer Portrait Dr Spencer
- Hansard - -

I thank the Minister for giving way. I appreciate that it is a technical question and I hope he is able to give a response. Equally, I appreciate that he may have to write to me in due course. I see that there are papers coming his way.

Chris Bryant Portrait Chris Bryant
- Hansard - - - Excerpts

To quote Richard II, methinks I am a prophet new inspired. Yes, this is all based on a consultation with the FCA. The FCA is content with us proceeding in this direction. I hope that, on that basis, the shadow Minister—I am trying to differentiate between his not opposing and supporting, but I think on the whole in Parliament, if you are not against us, you are for us. I think in this measure he is for us.

--- Later in debate ---
Chris Bryant Portrait Chris Bryant
- Hansard - - - Excerpts

Part 2 of the Bill is about digital verification services. Those are obviously a very important part of the Bill; they lay out how we want to move into a new era and they are essential to many businesses being able to deliver their services effectively. They are also important to the Government being able to deliver some of the things we hope for—in terms of greater productivity in the delivery of services—and, frankly, to turning Government-provided services into services that feel as intuitively available and accessible as those provided by the private sector.

Clause 27 defines digital verification services and sets out the scope of provision in part 2, which runs from clauses 27 to 55, to help secure their reliability. New clause 9, which we will hear about in a few moments, has been tabled by the hon. Member for North Norfolk. It would require organisations to offer non-digital verification services where practicable. The provision would change the voluntary nature of part 2 by imposing new obligations on businesses.

I fully support the idea of digital inclusion, which is why as the digital inclusion Minister I introduced our first action plan last week; we are the first Government to bring one forward in 10 years. However, we believe that the new clause is unnecessary because we are already prioritising digital inclusion. The office for digital identities and attributes will monitor the inclusivity of certified services, and include findings in the annual report that must be published under clause 53, which we will come to later.

In addition, there are already legislative protections in the Equality Act 2010 for protected groups. If in future the Government find evidence suggesting that regulatory intervention is appropriate to ensure that individuals have equal access to services across the economy, then we will consider appropriate intervention. I reassure the House that digital inclusion is a high priority for the Government, which is why we have set up the digital inclusion and skills unit within the Department for Science, Innovation and Technology, and why just last week we published the digital inclusion action plan, setting out the first five immediate steps we are taking towards our ambition of delivering digital inclusion for everyone across the UK, regardless of their circumstances.

We want to be able to deliver as many services digitally as possible, in a way that is fully accessible to people. However, we also accept that many people are not engaged in the digital world, and that there must also be provision for them. For those reasons, I hope the hon. Member for North Norfolk feels comfortable not pressing his new clause to a vote.

Ben Spencer Portrait Dr Spencer
- Hansard - -

Digital verification services are important, and will make a big change when rolled out as part of this legislation. The provision is entirely right, particularly on the proportionality of data disclosure. Reading through some of the various reports and briefings we have received, the example used is of someone going into a nightclub: why should a scanned copy of their driving licence be consumed and retained by whoever the data holder is, when all they need to do is prove their age? These services will open the door to allow the proportionate disclosure of data. There is both a data assurance component and a section on privacy, so we are glad that the Government are taking these measures forward.

I sympathise with the intention of new clause 9, in the name of the hon. Member for North Norfolk, which is to make sure that we do everything we can to support people who are digitally excluded. That ensures that people are not locked out and that there is a degree of reciprocity, so that as we digitalise more, the opportunity remains for people to access non-digital services. I am not sure about the scope of the binding duty in the provision and about how the duties on small providers, as opposed to a duty on public service providers, play out politically. I think those are different things. Nevertheless, I support the sentiment of the new clause.

Steff Aquarone Portrait Steff Aquarone (North Norfolk) (LD)
- Hansard - - - Excerpts

It is a pleasure to serve under your chairship, Mr Turner. Don’t get me wrong: there are huge opportunities to improve the seamlessness of services for all users, regardless of whether they access those services digitally or not. Through new clause 9, I want to establish a right for those who do not wish to or cannot use digital identification within the verification framework that the Bill creates. The amendment was also tabled in Committee in the other place by the noble Lord Clement-Jones, and I am pleased to bring it before this House, too.

--- Later in debate ---
None Portrait The Chair
- Hansard -

Before I call the shadow Minister, I want to clarify that amendment 11 is in this group, but a decision on it will be taken when we get to clause 45.

Ben Spencer Portrait Dr Spencer
- Hansard - -

Amendments 10 and 11 seek to remove certain provisions that were introduced in Committee in the other place. I thank Sex Matters for its work, as well as the many people in this policy area who have sought to focus attention on the importance of data accuracy and validity when data is used.

I hope we all agree that it is important that data, when it is collected, is accurate and correct—indeed, that is a principle of data collection and maintenance—and that there is no point holding or using data if it is incorrect. Biased data is worse than no data at all. Therefore, I do not understand—especially given the extra use of the data that will come as part of digital verification services—why the Minister and the Government are not keen on the provision to stipulate that public bodies that hold sensitive data should be certain of its accuracy, particularly when the data is going to be passed on and used as part of digital verification services. I am confused by the resistance to ensuring that the data is correct, particularly when we anticipate that it will be used as part of a far bigger spectrum. It will be consumed by a digital verification service in which it is not routine to go back and look at the original paper records. The only dataset to be relied on will be some Oracle or Excel spreadsheet, or whatever database is used by public authorities.

This debate has become more acute with regard to the importance of sex data. It is critical that sex data is available to protect public spaces and to be used in scientific research to allocate someone’s sex as part of medicine and healthcare. I speak as a former doctor, and I guess I should declare an interest in that I am married to a doctor. The use of sex data is critical in medical screening programmes, such as cervical screening and prostate screening, to understand and interpret investigations. It is critical that the data is accurate; otherwise, there is a danger that research will not be appropriate or will produce bad results, and there is also a potential degree of medical harm. It is critical that we get sex data correct when it is being used.

I do not agree with the argument that requiring the disclosure of sex data is either disproportionate or somehow a breach of the European convention on human rights. The whole point of digital verification services is proportionate disclosure. In fact, we have heard speeches from both sides of the Committee about proportionate disclosure, and limiting the amount of personal data that is passed on as part of a digital verification service.

My challenge is, quite simply, that if somebody is collecting sex data as part of a verification system, why are they doing so? If they do not need to know what someone’s sex is, it should not be collected. Digital verification services allow people to choose their proportionate disclosure. There will be times when sex data is required for renting a property—that example has been used before—because people may want to rent properties in single-sex accommodation. I may argue that is a proportionate disclosure. If it is a standard rental property in another situation, it is probably a non-proportionate disclosure. Another argument has been made that it is needed to triangulate data to verify ID. Again, that does not seem to work, because the whole point of a digital verification service is to allow someone to have a digital ID framework and use different points to verify.

The perversity of this debate is that these schemes and their proportionate disclosure protect people’s identities. They protect people from disproportionate disclosure. We need to make sure that the data we are using is accurate and correct, and that it says what we want it to say when someone is inquiring about somebody’s sex. If somebody is asking for sex data but they do not need it, people should be able to say no, which the existing provisions allow for.

Chris Bryant Portrait Chris Bryant
- Hansard - - - Excerpts

No, they don’t.

Ben Spencer Portrait Dr Spencer
- Hansard - -

What is the point of politics if we do not have a debate? We strongly disagree with the interpretation that the provisions are somehow incompatible with ECHR rights. They totally support people’s privacy rights under article 8 regarding proportionate disclosures. If somebody needs to have someone’s sex data, they need sex data. They do not need gender data. The provisions allow for it, and if somebody does not need sex data, they should not be collecting it in the first place.

Joe Robertson Portrait Joe Robertson (Isle of Wight East) (Con)
- Hansard - - - Excerpts

It is an honour to serve under your chairmanship, Mr Turner.

Further to the comments made by my hon. Friend the Member for Runnymede and Weybridge, does the Minister at least accept that the Bill poses a risk of entrenching inaccurate data relating to sex through public bodies using DVS systems? Notwithstanding his views on the Lords amendments, could he address that point? What steps will the Government take to ensure the reliability of sex data to ensure protection, such as of women using female-only spaces? What will the Minister do to ensure that inaccurate data entrenched by the Bill will not pose a risk to people in those situations and others? I am thinking, of course, of services available in healthcare, but that is by no means the only example.

--- Later in debate ---
Chris Bryant Portrait Chris Bryant
- Hansard - - - Excerpts

The clause creates a permissive information gateway. This will enable public authorities to share information relating to an individual with registered digital verification services, when requested by the individual. The gateway enables digital identity checks to be made against public authority data, thereby increasing the trustworthiness of identity and eligibility checks across the economy.

Clause 45 also makes it clear that the power does not authorise disclosure of information that would breach the data protection legislation or the Investigatory Powers Act 2016. However, disclosure of information under the clause would not breach any obligations of confidence owed by the public authority or any other restrictions on the disclosure of the information. The clause also enables public authorities to charge a fee for the disclosure of information under the clause.

Ben Spencer Portrait Dr Spencer
- Hansard - -

I am not going to rehash the previous debates. Clearly, the Committee has made its decision, no matter how disappointing that is. I just wanted to pick up the Minister’s previous point about the use of common sense in arbitration decisions when it comes to access to protected same-sex spaces. I fully support using common sense, but how does that play out in a situation where somebody has gone through a digital verification service that has used data that is held by a local authority, but that has been changed at a later date—that is, in effect, gender data? How will that be resolved?

Chris Bryant Portrait Chris Bryant
- Hansard - - - Excerpts

I think that I will have to write to the hon. Gentleman. We have agreed the amendment, so that is slightly rehashing the debate. I am happy to write to him and he will have that before we come back for Thursday’s Committee sitting.

Question put and agreed to.

Clause 45, as amended, accordingly ordered to stand part of the Bill.

Clause 46

Information disclosed by the Revenue and Customs

Question proposed, That the clause stand part of the Bill.

--- Later in debate ---
Chris Bryant Portrait Chris Bryant
- Hansard - - - Excerpts

I am sure Members were wondering when we were going to get to a code of practice, and this is the clause that introduces it. Clause 49 requires the Secretary of State to prepare and publish a code of practice for the disclosure of information under the information gateway created in clause 45. The code of practice will provide guidance and best practice for such disclosure, including what information should be shared, who it should be shared with and how to share it securely.

In preparing and revising the code, the Secretary of State must consult with the Information Commissioner, devolved Governments and other appropriate persons. The code will be laid before Parliament before it is finalised. The first version of the code will be subject to the affirmative procedure and subsequent versions to the negative procedure, allowing proper parliamentary scrutiny.

Ben Spencer Portrait Dr Spencer
- Hansard - -

Will the code of practice include information on the proportionate disclosure of data through the DVS scheme?

Chris Bryant Portrait Chris Bryant
- Hansard - - - Excerpts

Yes.

Question put and agreed to.

Clause 49 accordingly ordered to stand part of the Bill.

Clause 50

Trust mark for use by registered persons

Question proposed, That the clause stand part of the Bill.